Some pool options' output_size datatype is not correct, maybe it should be set as LongOptionalVector #1586
Comments
These layers have this bug: FractionalMaxPool2dImpl / FractionalMaxPool2dOptions only receive the first value. @saudet, we need to solve this bug before the javacpp-pytorch 2.6 release. |
Let us check the raw JavaCPP code in Scala:
console log
|
def main(args: Array[String]): Unit = { |
javacpp-pytorch Fra..Pool
console log error
|
console log error
|
@saudet, I have now supplied the raw javacpp-pytorch code for these pool layers; these layers have bugs, please solve them, thanks. |
Maybe the LongExpandingArrayOptional and DoubleExpandingArrayOptional classes have some bug. |
Sounds like toNative isn't able to set the kernel_size properly for some reason. Please try to set the values manually.
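A minimal sketch of what setting the values manually could look like, assuming the generated bindings map torch::ExpandingArray<2> to a LongPointer in both the constructor and the kernel_size() accessor (check the generated FractionalMaxPool2dOptions class for the exact signatures):

```java
import org.bytedeco.javacpp.LongPointer;
import org.bytedeco.pytorch.FractionalMaxPool2dOptions;

public class SetKernelSizeManually {
    public static void main(String[] args) {
        // Assumption: the constructor takes the ExpandingArray<2> as a LongPointer.
        FractionalMaxPool2dOptions opts =
                new FractionalMaxPool2dOptions(new LongPointer(3L, 3L));

        // Write both elements of the 2-element native array directly.
        LongPointer k = opts.kernel_size();
        k.put(0, 3).put(1, 3);
        System.out.println(k.get(0) + ", " + k.get(1)); // expect: 3, 3
    }
}
```
|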
It really is a bug, I can promise. Please try running the code once; in the console log you can see that the raw value that was set differs from the real value the options return. This is pure JavaCPP code, it does not use toNative and so on: AdaptiveMaxPool3dImpl, AdaptiveMaxPool2dImpl, AdaptiveAvgPool3dImpl, AdaptiveAvgPool2dImpl. Thanks @saudet. The pure JavaCPP code:
console log
|
Hi @saudet, FractionalMaxPool2d and FractionalMaxPool3d cannot work; maybe it is just that the options cannot be set to the correct values. |
Hi @saudet, AdaptiveMaxPool2d's output_size really has a bug, please check: the second element of output_size cannot be set! It becomes a garbage value such as 216232169515805804. If you can set it, please paste the correct code, thanks.
console
|
Please try to set the "org.bytedeco.javacpp.nopointergc" system property to "true".
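For reference, that property must be in place before any JavaCPP class is loaded, either on the command line or as the very first statement of main():

```java
// Command-line alternative: java -Dorg.bytedeco.javacpp.nopointergc=true ...
System.setProperty("org.bytedeco.javacpp.nopointergc", "true");
```
|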
console log
|
You'll need to allocate memory for kernel_size and output_size for this to work |
How do I do this? I don't know how; please show me an example. |
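A minimal sketch of what allocating the memory yourself could look like: create LongPointer arrays that you own and keep strong references to them, so the garbage collector cannot free the native memory while the options are in use (the two-element sizes and values below are illustrative assumptions):

```java
import org.bytedeco.javacpp.LongPointer;

public class AllocatePoolSettings {
    public static void main(String[] args) {
        // Allocate 2 int64_t elements for each setting and keep these
        // references alive for as long as the options object is used.
        LongPointer kernelSize = new LongPointer(2);
        kernelSize.put(0, 3).put(1, 3);

        LongPointer outputSize = new LongPointer(2);
        outputSize.put(0, 16).put(1, 16);

        System.out.println(kernelSize.get(0) + ", " + kernelSize.get(1));
        System.out.println(outputSize.get(0) + ", " + outputSize.get(1));
    }
}
```
|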
@saudet, I have now tested almost all (about 89) layers of javacpp-pytorch; only these six pool layers do not work: AdaptiveAvgPool3d, AdaptiveAvgPool2d, AdaptiveMaxPool2d, AdaptiveMaxPool3d, FractionalMaxPool2d, FractionalMaxPool3d. They really have bugs: the options fail to collect the input, and the parameter and return types are wrong. If you really think there is no bug, please show me runnable code for them; I really don't know what else to do. If these six pools are fixed, javacpp-pytorch will be much more usable on the JVM. I have extended Storch [a Scala wrapper over javacpp-pytorch]; it makes this more convenient for everyone, thanks. |
Sure, there's some working code at issue #1250 (comment) |
You're talking about FractionalMaxPool2d, not AdaptiveMaxPool2d, right? Yes, it does look like LongExpandingArrayOptional is wrong: it's for a 1-element array instead of a 2-or-more-element array. I guess someone will need to fix this... @HGuillemet ?
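To make the mismatch concrete: natively, torch::ExpandingArrayWithOptionalElem<2> holds two contiguous c10::optional<int64_t> values, but the generated accessor hands back a single LongOptional, i.e. element 0 only. A hypothetical, unverified workaround is JavaCPP pointer arithmetic; it assumes the generated LongOptional reports a correct sizeof() and that the two optionals really are laid out contiguously:

```java
import org.bytedeco.pytorch.AdaptiveMaxPool2dOptions;
import org.bytedeco.pytorch.LongOptional;

public class SecondElementWorkaround {
    public static void main(String[] args) {
        // Assumption: the generated constructor takes element 0 as a LongOptional.
        AdaptiveMaxPool2dOptions options =
                new AdaptiveMaxPool2dOptions(new LongOptional(7));

        LongOptional first = options.output_size(); // element 0 of the native array
        // UNVERIFIED: step one LongOptional forward to reach element 1.
        LongOptional second = first.getPointer(LongOptional.class, 1);
        second.put(7);

        System.out.println(first.get() + ", " + second.get()); // expect: 7, 7
    }
}
```
|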
adaptive_avg_pool2d is in the functional package; I am not sure it really invokes AdaptiveMaxPool2dImpl / AdaptiveMaxPool2dOptions? |
Yes, just FractionalMaxPool2d: it only accepts one element, and we need two elements. |
@saudet, that is indeed the runnable adaptiveAvgPool2d code you mentioned. But normally we need to pass an AdaptiveAvgPool2dOptions instance to AdaptiveAvgPool2dImpl. Though you say it works, it is not the normal way: all other layers can be passed their layer options, but AdaptiveAvgPool2dImpl, AdaptiveAvgPool3dImpl, AdaptiveMaxPool2dImpl and AdaptiveMaxPool3dImpl cannot work that way. So it really is a bug and we need to fix them; I think @sbrunk also agrees. |
I really cannot understand why AdaptiveAvgPool2dImpl, AdaptiveAvgPool3dImpl, AdaptiveMaxPool2dImpl and AdaptiveMaxPool3dImpl are so special. Are they hard to map from C++ to Java or something? |
Hi, some pool layers cannot be used: AdaptiveMaxPool2d, AdaptiveMaxPool3d, AdaptiveAvgPool2d, AdaptiveAvgPool3d. These need to be passed an output size, maybe a tuple2 or tuple3, but the generated accessors are:
2d:
public native @Cast("torch::ExpandingArrayWithOptionalElem<2>*") @ByRef @NoException(true) LongOptional output_size();
3d:
public native @Cast("torch::ExpandingArrayWithOptionalElem<3>*") @ByRef @NoException(true) LongOptional output_size();
So we will meet an error. I am also afraid it cannot work; please check the code, thanks.
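A minimal repro sketch of the limitation described above, with assumed signatures (a constructor that takes element 0 as a LongOptional, and an options() accessor on the module):

```java
import org.bytedeco.pytorch.AdaptiveAvgPool2dImpl;
import org.bytedeco.pytorch.LongOptional;

public class AdaptiveAvgPool2dRepro {
    public static void main(String[] args) {
        // Only element 0 of the native 2-element output size is reachable
        // through the LongOptional-based API; element 1 stays uninitialized,
        // which matches the garbage value reported above.
        AdaptiveAvgPool2dImpl pool = new AdaptiveAvgPool2dImpl(new LongOptional(5));

        LongOptional h = pool.options().output_size(); // element 0 only
        System.out.println(h.has_value() ? h.get() : -1); // expect: 5
        // No accessor exists for element 1, hence the title's suggestion to map
        // output_size to LongOptionalVector instead of LongOptional.
    }
}
```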