Build wheels for cyipopt #41

It would be useful for people to install via pip without having to compile. We can build wheels for various operating systems to do this.

Comments
I'm not familiar enough with wheels to understand how ipopt would be incorporated. My guess is wheels for ipopt need to be created first. I know that, for example, vtk can be installed via pip. It is a C++ library and has wheels for the different platforms: https://pypi.org/project/vtk/#files. Regardless, I don't really know how to solve this other than waiting for someone to build ipopt wheels.
Here is a relatively new ipopt Python wrapper that provides wheels: https://gitlab.com/g-braeunlich/ipyopt. Could be used as a reference. Note that it only provides linux wheels. They seem to use a manylinux docker image that already has ipopt installed: https://gitlab.com/g-braeunlich/ipyopt/-/blob/main/.gitlab-ci.yml#L1 Worth noting that this is a fork of https://github.com/xuy/pyipopt
This may be helpful: https://cibuildwheel.readthedocs.io/en/stable/
I found this talk about building wheels. It covers, among other things, linking to shared libraries and bundling those libraries. Slides are here: https://ep2021.europython.eu/media/conference/slides/5gVwmkx-a-tale-of-python-c-extensions-and-cross-platform-wheels.pdf One key thing to determine is whether we'd have to bundle an ipopt binary with the wheels for each platform.
This shows that you can build a wheel against a local shared lib and then use auditwheel to fix the hardcoded paths to make it work for end users on other platforms: https://stackoverflow.com/questions/23916186/how-to-include-external-library-with-python-wheel-package Edit: "Note: auditwheel is Linux-only. For MacOS, see the delocate tool."
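For illustration, the repair step mentioned above looks roughly like this on the command line (paths and wheel names below are placeholders, not taken from this project):

# Build the wheel against a locally installed Ipopt from a cyipopt source checkout.
python -m pip wheel . --no-deps -w dist/

# Linux: auditwheel copies the shared libraries the extension links against
# into the wheel and rewrites the library paths/tags.
auditwheel repair dist/cyipopt-*.whl -w wheelhouse/

# macOS: delocate does the equivalent for .dylib dependencies.
delocate-wheel -w wheelhouse/ -v dist/cyipopt-*.whl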
If we somehow bundle Ipopt, then we'd be distributing Ipopt and need to look into the license requirements.
Here is a demo for poetry to build a wheel that depends on a shared lib: https://github.com/riddell-stan/poetry-install-shared-lib-demo The demo is only for linux.
It looks like cvxopt is a good model to follow. It builds wheels for all three major platforms by first installing the build-dependency shared libraries using conda and then using cibuildwheel to build all the wheels. See this github action file: https://github.com/cvxpy/cvxpy/blob/master/.github/workflows/build.yml Reading through their setup makes me think we could follow the same pattern and then bundle an ipopt install with our wheel. Their https://github.com/cvxopt/cvxopt-wheels could even be a model for creating wheels for ipopt itself.
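For reference, cibuildwheel can also be driven locally before wiring it into CI. A minimal, hypothetical invocation might look like the following; the install-ipopt.sh script named here is only a placeholder for whatever step installs or builds Ipopt:

# Run cibuildwheel outside of CI to test the setup (Linux builds run inside
# the manylinux containers that cibuildwheel pulls automatically).
python -m pip install cibuildwheel

# CIBW_BEFORE_ALL runs once per build environment; CIBW_BUILD limits which
# Python versions are built.
CIBW_BEFORE_ALL="bash install-ipopt.sh" \
CIBW_BUILD="cp311-*" \
    python -m cibuildwheel --platform linux --output-dir wheelhouse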
Ipopt is currently released under the Eclipse Public License 2.0. We are using Eclipse 1.0. So we'll need to ship their license in the released binaries. I guess this goes for all of the other dependencies that get bundled in the wheel. We need to investigate how this is handled.
Oddly, Ipopt changed their license seemingly without approval of all contributors here: coin-or/Ipopt@19e1b26, but maybe contributors sign away their copyright to COIN-OR or something. The Eclipse license has some interesting features: https://en.wikipedia.org/wiki/Eclipse_Public_License and the Wikipedia article says that linking against Ipopt doesn't constitute a derivative work. So I think we can distribute Ipopt in our binary as long as we include both our license and Ipopt's. There is also an explanation that the Eclipse license isn't compatible with the GPL. We should check what the wheel ends up including.
Actually the Wikipedia article says: "The Eclipse Foundation advises that version 1.0 is deprecated and that projects should migrate to version 2.0. Relicensing is a straightforward matter and does not require the consent of all contributors, past and present. Rather, the version 1.0 license allows a project (preferably after forming a consensus) to adopt any new version by simply updating the relevant file headers and license notices." So I guess we can upgrade our license too.
This is a related and informative page discussing (among other things) vendoring dependencies in wheels: https://pypackaging-native.github.io/
I managed to produce manylinux wheels. This can be done by downloading this script (build-cyipopt-wheel.sh), placing it in an empty folder and then running (while in the newly created folder):

docker run -v $(pwd):/wheels --rm --platform=linux/aarch64 quay.io/pypa/manylinux_2_28_aarch64 /bin/bash -c "ln -s /opt/python/cp311-cp311/bin/python /bin/python && ln -s /bin/python /bin/python3 && bash /wheels/build-cyipopt-wheel.sh"

for aarch64, or

docker run -v $(pwd):/wheels --rm -it --platform=linux/amd64 quay.io/pypa/manylinux_2_28_x86_64 /bin/bash -c "ln -s /opt/python/cp311-cp311/bin/python /bin/python && ln -s /bin/python /bin/python3 && bash /wheels/build-cyipopt-wheel.sh"

for x86_64.

The generated wheels appear in the folder that the script was placed in, and include any shared libraries needed by cyipopt. This uses pypa/manylinux and auditwheel (mentioned above in this thread). cibuildwheel looks like a great way to do the same in CI.
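For readers who have not seen such a script, here is a rough sketch of the kind of steps it might contain. This is not the actual build-cyipopt-wheel.sh referenced above; the Ipopt install step and the numpy pin are assumptions:

set -euo pipefail
cd /wheels

# 1. Build and install Ipopt inside the container, e.g. from a source
#    release tarball or with coinbrew (details omitted here).

# 2. Build the cyipopt wheel against a pinned numpy (pin is hypothetical).
python -m pip install "numpy==1.24.*" cython setuptools wheel
python -m pip wheel cyipopt --no-deps -w /tmp/dist

# 3. Copy the Ipopt/BLAS shared libraries into the wheel and fix the rpaths
#    (auditwheel ships with the manylinux images).
auditwheel repair /tmp/dist/cyipopt-*.whl -w /wheels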
Thanks. So it seems that, at least for Linux, we need two things: constraining to a numpy version and repairing the wheel with auditwheel, if I understand correctly. Is the constraint to NumPy necessary? Or can it work for a set of ABI-compatible numpy versions?
I also came across this just now. Maybe conda can build wheels: https://docs.conda.io/projects/conda-build/en/latest/resources/define-metadata.html#output-type
It's unclear to me. I constrained it to be more explicit; otherwise the latest version would be used. I know that Do you choose a version of
I'm not very familiar with conda, but I guess it sounds like a great alternative to avoid adding more workflows in CI. Alternatively, I suppose if there were wheels for
Conda-forge builds against a set of ABI-compatible NumPy versions (see https://conda-forge.org/docs/maintainer/knowledge_base.html#building-against-numpy), so I'd like to utilize that so we can support back to 1.19 or so with one wheel.
I think the idea is that the wheels built with conda are then hosted on PyPI for general pip installation. We wouldn't publish anything new in conda forge.
I meant that if there was a (non-conda) CI job to generate and push wheels to PyPI, then perhaps there wouldn't be a need for any conda-related job in CI, nor to publish anything in conda forge?
That can't really work because conda binaries don't vendor dynamically linked libraries. The best route to maximize compatibility on all OSes is to set up a conda environment with all dependencies and build cyipopt against that compatible set. Then you can build a wheel that sucks in those compatible dependencies for all combinations. If we try to go the other way around, we have to manually build all of the dependencies for each environment in cyipopt's CI toolchain to generate all the combinations. But that's exactly what conda forge already does for us, and it took us many years to get consistent and compatible IPOPT/BLAS/NumPy/etc. binaries for all platforms. It would be a bit crazy to reproduce that work here for a single package.
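A rough sketch of that flow, assuming the conda-forge packages provide Ipopt and its headers (package names and pins here are assumptions, not a tested recipe):

# Create a build environment with a consistent Ipopt/BLAS/NumPy set from conda-forge.
conda create -y -n wheelbuild -c conda-forge python=3.11 ipopt numpy cython setuptools auditwheel

# Build the cyipopt wheel against the conda-provided libraries from a source
# checkout, then vendor those shared libraries into the wheel. Depending on
# the build machine's glibc, auditwheel may need an explicit --plat tag.
conda run -n wheelbuild python -m pip wheel . --no-deps -w dist/
conda run -n wheelbuild auditwheel repair dist/cyipopt-*.whl -w wheelhouse/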
Oh well, package management in Python is such a monster 🐍 😄
On ABI compatibility, I think having oldest-supported-numpy instead of numpy in pyproject.toml might be better. I can open a PR if you want.
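As a quick illustration (normally oldest-supported-numpy just goes in the build requirements of pyproject.toml), installing the meta-package shows which numpy pin it resolves to for the current Python version and platform:

python -m pip install oldest-supported-numpy
python -c "import numpy; print(numpy.__version__)"   # the oldest ABI-compatible numpy for this interpreter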
If you want to create a PR that builds a linux wheel, as you've shown, against a specific set of binaries and the oldest supported NumPy, I will take that and we can at least push one or more wheels to PyPI. I'd prefer if it built from source release tarballs (ipopt, etc.) rather than from git checkouts for the specific versions you select. This will at least enable some binary installs using pip. Ideally this is only built for tagged versions in our git repo here and stored as an artifact that I can then upload to PyPI. I guess we can manually build one for the last release. How does that sound? If you don't want to do an action, then we can just add some version of your script to the repo for manual use. That's fine too.
One more thought: for openblas, I typically build against the netlib version of BLAS so that users can dynamically switch their BLAS implementation. Do we have to package openblas in the wheel? Or would it be fine to avoid that? It would be interesting to know if numpy wheels include it. If so, can we rely on whatever NumPy supplies? That should always be present since we have NumPy as a dependency.
$ ls -lh ./numpy.libs
-rwxr-xr-x@ 1 work staff 2.6M 5 Feb 17:18 libgfortran-040039e1.so.5.0.0
-rwxr-xr-x@ 1 work staff 31M 5 Feb 17:18 libopenblas64_p-r0-15028c96.3.21.so
-rwxr-xr-x@ 1 work staff 242K 5 Feb 17:18 libquadmath-96973f99.so.0.0.0
My understanding is that this would be non-standard and unsafe for PyPI wheels. We would have to make sure that whatever BLAS version Ipopt was compiled with is compatible with the one bundled in the numpy wheels.
Relevant discussion here.
I am curious about how you make sure that this is safe. Also, a minimal example of how you would do the dynamic switch would help, if possible.
I think if we are going to build wheels in CI, then we should use the
I can open a PR to add this to the documentation, e.g. on installation/building.
On this I meant opening a PR just to update
If you dynamically link against the netlib version when building, you can switch BLAS implementations by using your package manager to update what BLAS everything should point to. It may also work if you link against any version of BLAS. We do the same thing in conda: you link against the netlib version and then you can conda install/update other BLAS implementations (netlib/atlas/mkl/openblas) to switch. This shows how the switch occurs in Debian: https://wiki.debian.org/DebianScience/LinearAlgebraLibraries And in conda: https://conda-forge.org/docs/maintainer/knowledge_base.html#switching-blas-implementation
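On a Debian-based system, the switch described above looks roughly like this (a sketch only; the alternative names and the path to the compiled extension vary by distribution and install):

# Check which BLAS/LAPACK shared libraries the compiled extension actually
# resolves to (the path to the extension module is a placeholder).
ldd /path/to/site-packages/cyipopt/*.so | grep -i -E "blas|lapack"

# Repoint the system-wide BLAS/LAPACK alternatives at a different
# implementation (reference BLAS, OpenBLAS, ATLAS, ...) without rebuilding.
sudo update-alternatives --config libblas.so.3-x86_64-linux-gnu
sudo update-alternatives --config liblapack.so.3-x86_64-linux-gnu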
It wouldn't be serving a user. It would serve the maintainers of this package. We can run the script manually after a source release to generate at least the linux wheel built against the latest dependency set (and the oldest supported numpy).
Sure, that's fine.
Thanks for investigating the numpy/scipy/openblas situation. If all binaries include their own openblas and things work together, we can go with that.
For openblas we may need to rename it: |
We also need to check the licenses for everything we bundle in the wheels and make sure that those licenses are followed (probably by including the various licenses in the wheel).
I am not experienced with licenses, but I have listed all the bundled libraries in the wheel in the description of #189. Let's continue the discussion there.