Numpy pinning going forward #4816
Comments
Reading this, I see the drawback that we will have an activation script with …
It's perhaps possible to do this without an activation script, that was just the first thing that came to mind...
I don't have a strong argument (or preference) here. But whenever we get to numpy>=1.25 as a default, we'd IMO have to adapt the run-export. It would also be a bit weird to jump from (a future) …
Wouldn't it be better to deal with it for 2.0? That's less than 6 months away, and at that point there is a hard necessity to deal with C API/ABI stuff.
Yeah, that's part of what I wanted to discuss here, not just the backwards compat by default, but also 2.0. It also doesn't need an immediate decision, there's no urgency AFAICT.
I think this will be useful with the 2.0 release; we could pin to …
I would argue to not move away from the current setup. Even if we set …

However, if we build with NumPy 1.25 and have >=1.25, we are guaranteed that the metadata is correct even though it could have been looser. (This is exactly what we do with the macOS SDK and deployment target by setting them to the same version by default. For example, if SDK = 11 and target = 10.9, the symbols introduced in 10.15 are visible, but they need to be treated as weak symbols on 10.9, which requires the developer to handle them correctly in their C/C++ code.)

Also, a looser pin in that case is not necessarily better. Most users will want an updated numpy anyway, and having that in place will make it easier (faster) for the solver to provide a solution with it. Sure, there may be a small portion of users who need an older numpy and won't be able to install it, but I believe the advantages outweigh the disadvantages.
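As a concrete (and purely hypothetical) illustration of that argument, assuming the global pin resolves numpy to 1.25, a downstream recipe's effective metadata would end up roughly like this; the layout and bounds are illustrative, not an actual conda-forge recipe:

```yaml
# Hypothetical meta.yaml fragment, assuming the global numpy pin is 1.25.
requirements:
  host:
    - numpy            # builds against whatever the global pinning provides (1.25 here)
  run:
    - numpy >=1.25     # floor matching the host version, as the current run-export
                       # produces; correct metadata, even if it could have been looser
```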
Isn't that a general problem that we'll have to look out for in any case? I'm not sure if that is something we could easily determine from a compiled artefact (numpy does embed the C-API level AFAIK), but it seems it would be good to check after building which numpy target version got used. That way we could verify that things didn't get lost or overridden by the project or the build system.
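A hedged sketch of the kind of post-build check being suggested here, assuming a hypothetical package name and that the metadata claims support down to numpy 1.22:

```yaml
# Hypothetical meta.yaml test section: force the oldest numpy the metadata claims to
# support into the test environment and verify the compiled module still imports.
test:
  requires:
    - numpy 1.22.*     # oldest runtime numpy we claim to support (illustrative)
  imports:
    - some_package     # hypothetical extension module; the import fails if it was
                       # actually compiled against a newer C API than advertised
```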
No. See my comment highlighted below
I spoke with @rgommers recently, and he mentioned one thing about this that wasn't clear to me before: Packages compiled with numpy 2.0 will continue to be compatible with the 1.x ABI. In other words, if this works out as planned, we could support numpy 2.0 right away without having to do a full CI-bifurcation of all numpy-dependent packages. It would mean using 2.0 as a baseline earlier than we'd do it through NEP29, but given the now built-in backwards compatibility, we could set the pinning to 2.0, and manually set the numpy run-export to do something like …
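The exact spec is elided above; purely as a hypothetical illustration (not the actual numpy recipe, and not a decided value), such a backwards-compatible run-export on a 2.0 build might look like:

```yaml
# Hypothetical run_exports for a numpy 2.0 package; the 1.22 floor is picked only
# because it is the NEP29 baseline mentioned earlier in this thread.
build:
  run_exports:
    - numpy >=1.22,<3    # built against 2.0, but still allowing 1.x at runtime
```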
I wasn't talking about the tightness/looseness of the constraints, but about projects setting …
Now that we've started migrating for CPython 3.13 (which requires numpy 2.1), the migrator has a major-only pin (conda-forge-pinning-feedstock/recipe/migrations/python313.yaml, lines 41-42 at a94f54b), while the numpy 2 migrator pins to 2.0 (conda-forge-pinning-feedstock/recipe/migrations/numpy2.yaml, lines 47-51 at a94f54b).

Do we want a major-only pin, or still decide when we update the baseline numpy version (in a post-numpy-2.0 world)? For example, using a major-only pin means we'll start pulling in 2.1 as soon as it's available, and this creates a tighter run-export (…). I think both approaches are workable; we should just decide on one or the other.
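For illustration only (the real entries are in the permalinked migrator files above), the two flavours of pin differ roughly like this:

```yaml
# Major-only pin: host resolves to the newest available 2.x at build time, so the
# resulting run-export floor moves up with each new minor release.
numpy:
  - "2"
---
# Exact-minor pin: host resolves to 2.0, keeping a fixed baseline until the pinning
# is bumped deliberately.
numpy:
  - "2.0"
```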
Would suggest the Python 3.13 migrator be updated to use 2.1 instead of 2. Reasons being: …

Everything else can stay the same. Think if we want to change this more dramatically, we should probably wait for these migrators to complete and reassess. It is always easier to relax things later (as opposed to tightening). Also, trying to work in more changes with multiple in-flight migrators is hairy. Though open to discussion if others have different opinions.
I support this for the reasons you stated, though at least there's no immediate urgency on this. As there are no numpy 2.0 builds for 3.13, the two are equivalent in this particular case (they wouldn't be for …).
I'd probably choose the major-only flavor, because that's the actual requirement. But it doesn't really matter either way, since builds are going to be using 2.1 anyway now that it is available.
That's not the case; if we pin 2.0, then that's what gets installed in host while building (but 2.1 at runtime of course).
Major-only meant …, I think.
Not sure if people saw already, but numpy 1.25 introduced a pretty big change: extensions compiled against it can target an older C API (e.g. via NPY_1_22_API_VERSION), which keeps the resulting binaries compatible with older numpy at runtime.

Also from those release notes, numpy is now planning the long-mythical 2.0 release as following 1.26 (which is roughly 1.25 + meson + CPython 3.12 support), so we will have to touch this setup in the not too distant future anyway.
We're currently on 1.22 as per NEP29, so AFAICT we could consider using numpy 1.25 with NPY_1_22_API_VERSION as an equivalent setup (this probably needs to go into an activation script for numpy...?). CC @conda-forge/numpy @conda-forge/core
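A rough, hypothetical sketch of what that equivalent setup could look like on the pinning side (the NPY_TARGET_VERSION plumbing itself would live in an activation script or in the recipes, not in this file):

```yaml
# Hypothetical conda_build_config.yaml fragment for the idea above: build against 1.25
# while still targeting the 1.22 C API so the run metadata can keep the NEP29 floor.
numpy:
  - "1.25"
# ...plus some mechanism (activation script or per-recipe compiler define) that sets
# NPY_TARGET_VERSION=NPY_1_22_API_VERSION when downstream extensions are compiled.
```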