[WIP] Handle meson based python-wheel #6454
base: master
Conversation
Note that this is not ready for testing yet... still an early WIP; I simply ported my pending changes from my local branch into a new PR.
additional info:
Yup, on my radar, I will add that indeed. Although I wonder if numpy 1.26.x might still work with older DSM considering the new meson setup I'm trying to build up.
That should be the case from
The per-dependency, fully generated meson cross-file should now fix that. I had encountered that same issue with cmake long ago where the
Long story short, meson had never received such an enhancement as things were working just fine. But that's no longer true with python wheels, where the python virtual environment gets meson totally confused. Hence the need for a fully functional meson cross-file defining all library and include paths properly; have a look at the current state. This is now mostly working with meson; I still have a few things to go through, but getting there.
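For context, a rough sketch of the kind of generated cross-file being described (a hedged illustration only; the triplet, paths and flags below are placeholders, not the values this PR actually produces):

```sh
# Hypothetical sketch of generating a per-dependency meson cross-file.
# Real triplets, paths and flags differ per arch; this only shows the shape.
WORK_DIR=${WORK_DIR:-/tmp}
cat > "$WORK_DIR/meson-cross.ini" <<'EOF'
[binaries]
c = '/path/to/toolchain/bin/aarch64-unknown-linux-gnu-gcc'
cpp = '/path/to/toolchain/bin/aarch64-unknown-linux-gnu-g++'
pkgconfig = '/usr/bin/pkg-config'

[built-in options]
c_args = ['-I/path/to/install/prefix/include']
c_link_args = ['-L/path/to/install/prefix/lib']

[host_machine]
system = 'linux'
cpu_family = 'aarch64'
cpu = 'aarch64'
endian = 'little'
EOF
```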
That's a currently known issue with numpy. We need to force setting the long bit accordingly for
EDIT: with regards to cython (which I just hit), there must be an issue with the PATH, although there may be a way to set it in the meson native file, such as:
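(A hedged illustration, assuming the standard machine-file [binaries] syntax; the path is a placeholder:)

```sh
# Hypothetical native-file entry so meson finds the build-side cython
# without relying on PATH; the path below is illustrative only.
WORK_DIR=${WORK_DIR:-/tmp}
cat >> "$WORK_DIR/meson-native.ini" <<'EOF'
[binaries]
cython = '/path/to/crossenv/build/bin/cython'
EOF
```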
Anyhow, as usual thanks for your feedback; work is slowly progressing on this. EDIT2: It turns out that I still need to empty the environment when invoking the meson build, which is not done yet. Although it does work for regular meson builds, it won't for python-based meson wheel builds. Next on my todo.
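A hedged sketch of what emptying the environment could look like when invoking meson (illustrative paths only; not necessarily how this PR ends up doing it):

```sh
# Illustrative only: run meson setup with a minimal, explicit environment so
# host/crossenv variables don't leak into the cross build.
env -i PATH=/usr/bin:/bin HOME="$HOME" \
    meson setup build --cross-file "${WORK_DIR:-/tmp}/meson-cross.ini"
```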
@hgy59 I have a proof of concept that builds successfully for both aarch64 and x64 using the latest numpy 2.2.3, although I'm struggling with armv7 and evansport... I'll check if I can make 1.26.4 work instead for the moment.
@th0ma7, I was looking at the errors which remain:
And I found the following, which may be useful if you haven't already considered it:
Hope they can assist...
I believe I now have something functional, but unmaintainable as-is.

The good
Using normal … I can now successfully cross-compile for armv7, evansport and x64 for DSM-7.1.

The bad
There is a known bug in gcc<=10 with aarch64 that makes the compiler segfault. I tried pretty much every possible alternative of flags and disabling things in the code, but I wasn't able to work around it.

The ugly
As part of my crusade to make the meson-python build work, I ended up at one point reproducing the normal meson+ninja build. Surprisingly, this ended up allowing me to successfully build numpy for aarch64... What's missing then is the (re)packaging in wheel format, which happens to be the exact same process as "The good" as long as I re-use the exact same builddir (which I ended up figuring out tonight). This really is ugly, but it does work. This last commit a2068f5 was not tested on the previously working x64, evansport and armv7. I'll let this rest for tonight. Good news is, we're probably much closer now... just need to tie up the remaining loose ends.
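For reference, a hedged sketch of that meson+ninja-then-repackage flow (meson-python does accept a builddir config setting; the cross-file path and wheel options here are placeholders):

```sh
# Illustrative only: build with plain meson+ninja first, then have
# meson-python repackage the very same builddir into a wheel.
meson setup build --cross-file "${WORK_DIR:-/tmp}/meson-cross.ini"
ninja -C build
pip wheel . --no-build-isolation -Cbuilddir=build -w dist/
```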
@th0ma7 the wheels created from python/*/Makefile are not yet added to … The …
Thanks for catching this; I will include it. I'm also looking at how to install numpy in the crossenv... I have an idea of how I could reuse the newly cross-compiled numpy wheel so it gets installed into the cross portion of the crossenv, so it can then be made available to other wheels that depend on it. Lastly, I'm also looking at adding flexibility to have a different vendor-managed meson (other than the numpy use case, where the source package provides its own modified meson.py) and skipping that meson+ninja part when no vendor-managed meson is provided (i.e. the default use case). All in all, it's taking shape but will require a few more spare cycles before reaching the finish line...
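A hedged sketch of that idea, assuming a stock crossenv layout (crossenv exposes separate cross-pip and build-pip front-ends; all paths are placeholders):

```sh
# Illustrative only: install the freshly cross-compiled numpy wheel into the
# cross side of an existing crossenv so later builds (scipy, ...) can use it.
. /path/to/crossenv/bin/activate
cross-pip install --no-deps --no-index /path/to/wheelhouse/numpy-2.2.3-*.whl
build-pip install meson-python cython   # build-side tools stay host-native
```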
@th0ma7 another small issue popped up:
The original wheels in the index (pypi) are cross compiled (like |
@th0ma7 I have successfully built python311-wheels with added python/numpy and python/numpy_1.26 for aarch64-7.1 and armv7-7.1. It would be interesting to validate whether such wheels created with gcc 8.5 will run under DSM 6. I guess if the *.so files within the wheels do not reference GLIBC > 2.20 functions, it might work. My background: I am trying to build a final homeassistant package with support for DSM 6. This will be homeassistant 2024.3.3, which depends on numpy 1.26.0. This version is available in the index for x86_64 and aarch64 only, and I will have to build it at least for armv7 and evansport (i686). To support armv7 in homeassistant 2025.1.4, it will be
Maybe this is similar to msgpack where it can fit in both?
That's a long shot! Not sure how I can help you though. I could reinstall my armv7 using a 6.2.4 image to try it out if that helps?
I got some pretty cool code locally that allows installing the cross-compiled numpy wheel into the crossenv to allow building scipy and others... But I faced one major, major, major problem: the gcc version. For … @hgy59 … All in all, this would require bumping our minimal version to DSM-7.2. EDIT: I'll sleep on it... and probably upload my new code online to safeguard it, just in case, even though it will fail to build.
Good news, I was able to create a workaround patch for aarch64... a few loose ends, but looking much better now.
@hgy59 and @mreid-tt It may look like it's stagnating, but after spending numerous hours on this I finally made a major leap forward which now allows using the default … This has definitely been taking way longer than anticipated, but I believe things will now start to shape up nicely 🤞
Sorry, I have only x86_64 for homeassistant 2025.1.4. |
Sorry for being cryptic... I meant, on what arch are you testing the numpy wheels? I was presuming x86_64... and that's the case. Something I noticed when cross-compiling numpy for x86_64 on x86_64 is that it doesn't detect the CPU type properly and enables the wrong optimizations. You'll notice errors such as:
I've been looking into working around that and a way to do so is adding this:
Which will then output the following:
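As a rough illustration of the kind of setting involved (numpy's meson build exposes cpu-baseline and cpu-dispatch options; the values below are placeholders rather than the ones actually used here):

```sh
# Illustrative only: pin numpy's SIMD baseline and disable runtime dispatch so
# the build doesn't pick up features of the build host's CPU.
pip wheel . --no-build-isolation \
    -Csetup-args=-Dcpu-baseline=ssse3 \
    -Csetup-args=-Dcpu-dispatch=none
```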
@hgy59 sorry, what do you mean without "native" libraries? Are you referring to
Unless you refer to
Sorry but I need a bit more explanation... Unless you are referring to these from comment above #6454 (comment):
@th0ma7 I found some useful information at https://numpy.org/doc/stable/building/redistributable_binaries.html. IMHO we have to build the native libraries separately and use
@th0ma7 there is other build information for numpy, such as the fact that a numpy-specific version of meson must be used (see the next page, https://numpy.org/doc/stable/building/understanding_meson.html).
@hgy59 I'll definitely have a look at those links. If you don't mind, please read my previous post #6454 (comment) to confirm we're referring to the same things. As far as meson goes, numpy's internal build system uses the meson provided with the sources under the … As for said files, assuming that when talking about "native" you refer to:
Both
I assume they are available from our sysroot at compile time with the regular lib directories passed:
The only thing may be that they are missing on the default Synology image and would require us to package them somehow... and making them part of the numpy wheel could indeed be a way to go. Leaving only
All in all, I'm not sure anything else is missing?!?!
TL;DR
One thing we may have to look at are … Please let me know if my theory doesn't make sense?!?! In the meantime I'll test this on my armv7 and x64 NAS and see.
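One way to verify that theory is to inspect the wheel directly (a generic readelf sketch, not something this PR ships; the wheel name and paths are placeholders):

```sh
# Illustrative check: list the shared-library dependencies and rpath of the
# .so files inside a cross-compiled wheel.
unzip -o numpy-2.2.3-cp312-cp312-linux_aarch64.whl -d /tmp/numpy-wheel
find /tmp/numpy-wheel -name '*.so*' -exec sh -c \
    'echo "== $1"; readelf -d "$1" | grep -E "NEEDED|RPATH|RUNPATH"' _ {} \;
```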
As per @hgy59, it happens that numpy/scipy is not working as expected. After looking further into that, it turns out that libgfortran.so and libquadmath.so, which are part of GCC, are not installed on the DSM. Therefore, similarly to the libatomic.so (re)packaging, these two extra libraries are now also being checked.

Below is the output from the x64-7.1 build:

===> SEARCHING for libatomic.so
===> Found in bin/my_print_defaults for library dependency from toolchain (libatomic.so)
===> Add library from toolchain (libatomic.so.1.2.0)
===> Add symlink from toolchain (libatomic.so libatomic.so.1)
===> SEARCHING for libquadmath.so
===> Found in lib/libopenblas.so.0.3 for library dependency from toolchain (libquadmath.so)
===> Add library from toolchain (libquadmath.so.0.0.0)
===> Add symlink from toolchain (libquadmath.so libquadmath.so.0)
===> SEARCHING for libgfortran.so
===> Found in lib/libopenblas.so.0.3 for library dependency from toolchain (libgfortran.so)
===> Add library from toolchain (libgfortran.so.5.0.0)
===> Add symlink from toolchain (libgfortran.so libgfortran.so.5)
===> Found in scipy-1.15.2-cp312-cp312-linux_x86_64_pc_linux_gnu.whl for library dependency from toolchain (libgfortran.so)
===> Add library from toolchain (libgfortran.so.5.0.0)
===> Add symlink from toolchain (libgfortran.so libgfortran.so.5)

Below is the output from the armv7-7.1 build:

===> SEARCHING for libatomic.so
===> Found in bin/my_print_defaults for library dependency from toolchain (libatomic.so)
===> Add library from toolchain (libatomic.so.1.2.0)
===> Add symlink from toolchain (libatomic.so.1 libatomic.so)
===> SEARCHING for libquadmath.so
===> SEARCHING for libgfortran.so
===> Found in lib/libopenblas.so.0.3 for library dependency from toolchain (libgfortran.so)
===> Add library from toolchain (libgfortran.so.5.0.0)
===> Add symlink from toolchain (libgfortran.so.5 libgfortran.so)
===> Found in scipy-1.15.2-cp312-cp312-linux_arm_unknown_linux_gnueabi.whl for library dependency from toolchain (libgfortran.so)
===> Add library from toolchain (libgfortran.so.5.0.0)
===> Add symlink from toolchain (libgfortran.so.5 libgfortran.so)
Theory to preliminary conclusions:
And... not boom but rather bang..?
Next?
Either
I have a strong preference for 1) ...
Current state of affairs:
Disconnecting for tonight... but I have a strong feeling I'm inches away from a solution... And I think the issue remaining may be how I translated the LDFLAGS to meson (auto-generated under
@hgy59 feel free to pursue; a fresh brain on this would be appreciated :)
@th0ma7 I use a diyspk/nump-wheel package and include the numpy_test.py shown above. Added a service-setup.sh with
running on virtualdsm:
I can't find any binary that depends on libopenblas within the wheel, so I guess it is dynamically loaded and might have a specific search order. Running with an explicit library path:
The AVX optimization seems not to be supported in virtualdsm. The definition of LD_LIBRARY_PATH is not a problem for the HA package (it already has it). To fix this, the rpath must be fixed/adjusted in the .so files of the numpy wheel.
and both have …
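A hedged sketch of the rpath adjustment mentioned above (using patchelf; the site-packages layout and package path are assumptions for illustration, not what this PR actually does):

```sh
# Illustrative only: give the wheel's extension modules an $ORIGIN-relative
# RPATH pointing at the package lib dir, instead of relying on LD_LIBRARY_PATH.
find lib/python3.12/site-packages/numpy -name '*.so' -exec \
    patchelf --set-rpath '$ORIGIN:/var/packages/homeassistant/target/lib' {} \;
```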
@th0ma7 the above test works on DS-115j (armada370 - armv7) with DSM 7.1 when using LD_LIBRARY_PATH.
@hgy59 would you mind commenting out the cpu-dispatch and cpu-baseline definitions I added to numpy's build yesterday and retrying on your x86_64? And removing avx and re-testing? I'll have to read further on this to get a proper understanding of how to use that (away from my build system atm). Also, I'm almost certain that the rpath is not functional atm, not only for meson-python wheels but probably for all meson builds overall. And that issue is new with this PR. Once that is fixed, LD_LIBRARY_PATH should not be needed.
@th0ma7 my test even works on DS-218 (aarch64) with DSM 6.2.4 when patching … BTW the …
working on it locally... Update: |
- most x64 archs are atom-like
- adjust cpu-baseline in python/numpy* (make it apollolake compatible)
This is really nice! Good work! What would be even better is the ability to avoid that altogether... but I doubt we can. Now let's find the issue with rpath...
@th0ma7 my findings are that we have to run meson setup with --prefix for a correct rpath, but I can't find where (or how) meson setup is called for our python-meson builds.
it is also documented for numpy on https://numpy.org/doc/stable/building/understanding_meson.html
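For what it's worth, with meson-python the arguments to meson setup are normally forwarded as config settings at wheel-build time rather than by calling meson setup directly; a hedged sketch (the prefix value is a placeholder):

```sh
# Illustrative only: meson-python forwards setup-args to `meson setup`, so
# --prefix (and other rpath-relevant options) can be injected here.
pip wheel . --no-build-isolation \
    -Csetup-args=--prefix=/var/packages/homeassistant/target
```

Whether a user-supplied --prefix coexists cleanly with meson-python's own prefix handling would still need to be verified.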
Aaahhh! Thanks for the pointer, I believe I know how to fix this later tonight!
I did some more tests but didn't succeed (added …). The .so files in the builddir are binary-identical to those in the whl file. EDIT:
I may have something that works now. The
Still, I don't get the rpath depth 2, nor the equivalent to … As expected, the … Now, I did try going all-in with a systematic … For fun, have a look at
I recall struggling with this when adding cmake & meson long ago... Now I remember why a bit better.
@@ -84,21 +87,24 @@ $(MESON_CROSS_TOOLCHAIN_PKG):
meson_pkg_toolchain: SHELL:=/bin/bash
meson_pkg_toolchain:
@cat $(MESON_CROSS_TOOLCHAIN_WRK)
@echo "pkgconfig = '$$(which pkg-config)'"
@echo "pkgconfig = '$$(which pkg-config)'" | |
@echo "pkgconfig = '$$(which pkgconfig)'" |
BTW the openblas build log is flooded with warnings when building only numpy in diyspk:
This comes from … It occurs while … A pragmatic solution would be to create the include folder in pre_compile, since modules with dependencies require it:

.PHONY: openblas_pre_compile
openblas_pre_compile:
	install -d -m 755 $(STAGING_INSTALL_PREFIX)/include

A better solution would be to create this folder in the Makefile that defines the include path.
It now works without LD_LIBRARY_PATH 🎉.
Yes, finally! I'm starting to wonder if this isn't an issue specific to numpy's vendored meson... While functional, it still needs to be fixed somewhat though.
@hgy59 and @mreid-tt I haven't counted how many times I've built numpy (and it's way too many), but I'm now almost certain this is a bug with meson. I've documented my findings in mesonbuild/meson#14354 and believe this relates to a long-standing bug at mesonbuild/meson#6541. I do have code to work around that, enforcing to have … Slowly pursuing... 🐢 🐌
Description
Handle meson based python-wheel
Fixes:
Checklist
all-supported completed successfully
Type of change