Fork when wheels only support a specific range #8492
This is a known problem with scientific packages, unfortunately. Does splitting the scipy requirement on the Python version, as documented at the bottom of https://github.com/astral-sh/uv/blob/main/docs/reference/build_failures.md (this doc will ship with the next release), help?
Unfortunately not:
It tries to build NumPy 2.0.2 from source on Python 3.13; no idea why it doesn't just use >= 2.1.
Maybe because it also uses NumPy 2.0.2 on Python <= 3.12? Besides, I think splitting like this feels a bit off, since the ranges actually overlap (unlike in the example).
You need to also split numpy, probably as a constraint since it's not a direct dependency:

```toml
[project]
name = "example"
version = "0.1.0"
description = "Add your description here"
readme = "README.md"
requires-python = ">=3.9"
dependencies = [
    "scipy >=1.13.1,<1.14.0; python_version < '3.13'",
    "scipy >=1.14.0; python_version >= '3.13'",
]

[tool.uv]
constraint-dependencies = [
    # Supports 3.8 to 3.10
    "numpy <=2.1.0,>=1.22.4; python_version < '3.13'",
    # Supports 3.10 to 3.13
    "numpy >=2.1.0; python_version >= '3.13'",
]
```

Overlapping version ranges are fine in uv's model, but you can also make them exclusive, as in the example above. FWIW, it would be nice if we could make this split automatically for non-abi3 wheels.
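To make the marker split concrete, here is a toy illustration (not uv code, and the function name is made up for this sketch) of which requirement from the pyproject.toml above applies for a given interpreter version:

```python
# Toy model of the python_version markers in the example above:
# interpreters below 3.13 get the older scipy/numpy range, 3.13+ the newer one.

def active_requirements(python_version):
    """Map an interpreter version tuple to the scipy/numpy specifiers
    selected by the markers in the example pyproject.toml."""
    if python_version < (3, 13):
        return {"scipy": ">=1.13.1,<1.14.0", "numpy": "<=2.1.0,>=1.22.4"}
    return {"scipy": ">=1.14.0", "numpy": ">=2.1.0"}

print(active_requirements((3, 12)))  # older range
print(active_requirements((3, 13)))  # newer range
```

Because the two markers are disjoint, each interpreter version falls into exactly one branch, which is what lets the resolver fork cleanly.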
Uhm, can you explain the reasoning behind this resolution? Why does uv not pick scipy 1.14.1 for Python 3.13? I'm explicitly specifying `scipy >= 1.13.1`.
This feels very unintuitive to me, too 🤯
There are three parts of the uv resolver interacting here: requires-python, forking, and wheel compatibility.

uv will only select package versions whose requires-python range covers your project's requires-python. For example, your project declares `requires-python = ">=3.9"`, so every selected version has to support Python 3.9 and up.

uv also has the forking mechanism (https://docs.astral.sh/uv/reference/resolver-internals/#forking): whenever we see that a dependency has different requirements for different markers in the same package, we fork and solve once for the one set of markers and once for the other (assuming the markers are disjoint; if they aren't, we split some more until we have disjoint environments; there are some implementation details to it). Forking can tighten the requires-python range: if we know we're in an environment with, say, `python_version >= '3.13'`, the requires-python lower bound in that fork can be raised accordingly. uv is conservative with forking; we only fork on the condition above.

In a single fork, we have to enforce that there is only a single version per package per resolution (since we can only install a single version). uv also tries to reuse versions between forks when possible, to reduce the divergence between them (fewer different versions, fewer things that can go wrong or need to be tested). The latter is the reason that you often have to add an upper bound on the version to get the right forking results.

Pure Python projects don't have an upper bound on their Python compatibility: usually a package published now will work on future Python versions. Even for most compiled projects with per-Python-version shared libraries, you can use them on a newer version by building the source distribution. For these and other reasons related to how our universal dependency resolution works, we're discarding upper bounds on requires-python (long explanation in #4022). For numpy, scipy, and a number of related projects this assumption isn't true: they really only work on a specific Python range for each version, and you're also not supposed to build them from source (they often fail to compile on newer Python versions).
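A rough way to picture the forking mechanism described above: each disjoint marker environment is solved separately, and within each fork only a single version per package is allowed. The following toy model is my own sketch, not uv's real algorithm, with made-up function names:

```python
# Toy model of per-fork resolution: given candidate versions and the version
# range active in a fork, pick the single highest version that fits.

def solve_fork(candidates, low, high):
    """Highest candidate v with low <= v, and v < high when an upper bound
    is given (high=None means unbounded)."""
    ok = [v for v in candidates if v >= low and (high is None or v < high)]
    return max(ok) if ok else None

scipy_versions = [(1, 13, 1), (1, 14, 0), (1, 14, 1)]

# Fork 1: python_version < '3.13'   ->  scipy >=1.13.1,<1.14.0
# Fork 2: python_version >= '3.13'  ->  scipy >=1.14.0
old_fork = solve_fork(scipy_versions, (1, 13, 1), (1, 14, 0))
new_fork = solve_fork(scipy_versions, (1, 14, 0), None)
print(old_fork, new_fork)
```

Note how the upper bound on the first fork is what keeps the two forks from collapsing onto one version; without it, version reuse between forks pulls both toward the same (possibly source-only) release.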
Here's the part where uv falls short and you need the manual intervention of specifying the split yourself.
Thanks for this detailed explanation! I do like how thoroughly uv has been designed, but given that the scientific ecosystem now pushes it beyond its boundaries, I feel like this makes it very hard to use uv in this context. In the simple example with just a single
I'm not sure if I understand what you're saying, but do you think that uv should do this automatically, so that my original example "just works"?
Also,
I would like to second this ☝️
For numpy at least: pick a numpy version, analyse the wheels at https://pypi.org/project/numpy/#files, and if we see a consistent range of compiled (non-abi3) wheels, infer the supported Python range from it.
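The wheel-analysis idea above could be sketched like this. The filenames below follow PyPI's wheel naming convention but are transcribed by hand for illustration, not fetched live, and the helper is hypothetical:

```python
import re

# Infer the CPython range a release supports from its wheel filenames,
# by collecting the cp3XY interpreter tags.

wheels = [
    "numpy-2.0.2-cp39-cp39-manylinux_2_17_x86_64.whl",
    "numpy-2.0.2-cp310-cp310-manylinux_2_17_x86_64.whl",
    "numpy-2.0.2-cp311-cp311-manylinux_2_17_x86_64.whl",
    "numpy-2.0.2-cp312-cp312-manylinux_2_17_x86_64.whl",
]

def supported_python_range(filenames):
    """Return (lowest, highest) (3, minor) tag seen, or None if no cp tags."""
    tags = set()
    for name in filenames:
        for minor in re.findall(r"-cp3(\d+)-", name):
            tags.add((3, int(minor)))
    return (min(tags), max(tags)) if tags else None

print(supported_python_range(wheels))  # lowest and highest tagged interpreter
```

A resolver could then prefer releases whose inferred range contains the target interpreter, instead of falling back to a source build.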
Another possible solution would be #7190
You should never add an upper bound to Requires-Python. Most packages don't know it until after they are released, and even for the ones that know they won't support the next version (like numpy), the existing handling of this field is wrong, and things like pip and poetry break if you cap it.

Looking at the binary wheels available to guide version ranges would be really helpful, though, I think, especially if #7190 wasn't the default.

You could also check whether classifiers are present, and only consider a classifier disappearing as you look at older packages to be the starting point of support for that Python version (for packages with universal wheels). Many packages don't have classifiers, but for the ones that do, having classifiers is a sign of solid support. (That's a lot weirder, though; I don't think a single tool today checks classifiers during a solve.)

https://discuss.python.org/t/requires-python-upper-limits/12663 was a discussion about what to do with upper caps.
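The classifier heuristic mentioned above could look roughly like this. The classifier list is illustrative (not fetched from PyPI) and the function name is made up:

```python
# Extract the Python minor versions a package explicitly advertises via
# trove classifiers of the form "Programming Language :: Python :: 3.X".

classifiers = [
    "Programming Language :: Python :: 3",
    "Programming Language :: Python :: 3.10",
    "Programming Language :: Python :: 3.11",
    "Programming Language :: Python :: 3.12",
    "Programming Language :: Python :: 3.13",
]

def advertised_pythons(classifiers):
    """Return sorted (3, minor) versions explicitly listed as classifiers."""
    prefix = "Programming Language :: Python :: 3."
    versions = []
    for c in classifiers:
        tail = c[len(prefix):] if c.startswith(prefix) else ""
        if tail.isdigit():
            versions.append((3, int(tail)))
    return sorted(versions)

print(advertised_pythons(classifiers))
```

The bare `":: Python :: 3"` classifier is deliberately skipped: only explicit minor versions are treated as a support signal, matching the point that a classifier's presence (not its absence) is what carries information.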
Yes, I think #7190 would be a great default. In my example at the top, I'm not setting an upper bound at all. |
I don't know if this is intentional, but the following project cannot be installed on Python 3.13 if I want to avoid building source distributions:

```shell
uv init example
cd example
```

Then, after adding the dependency to pyproject.toml:

- `uv sync --python=3.12` works (same for `3.9`, `3.10`, and `3.11`)
- `uv sync --python=3.13 --no-build-package=scipy` fails, even though it could (should?) use `scipy==1.14.1`, which is available as a binary wheel.

This surprised me because I expected that by specifying `scipy >= 1.13.1`, uv would automatically choose a newer version with a binary wheel for Python 3.13. Indeed, there are binary wheels for version 1.13.1 available for Python 3.9 through 3.12, but only versions ≥ 1.14.0 have a binary wheel for Python 3.13 (though they no longer support 3.9). However, it seems that uv is trying to maintain consistent package versions across all supported Python versions.
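The behaviour expected in the report above can be sketched as a lookup over which releases ship a binary wheel for which interpreter. The support table is transcribed by hand from the description (illustrative, not authoritative), and the function is hypothetical:

```python
# Pick the newest release that has a binary wheel for the target interpreter,
# per-interpreter, instead of forcing one version across all Python versions.

WHEEL_SUPPORT = {
    (1, 13, 1): {(3, 9), (3, 10), (3, 11), (3, 12)},
    (1, 14, 0): {(3, 10), (3, 11), (3, 12), (3, 13)},
    (1, 14, 1): {(3, 10), (3, 11), (3, 12), (3, 13)},
}

def newest_with_wheel(python):
    """Highest release with a binary wheel for `python`, or None."""
    candidates = [v for v, pythons in WHEEL_SUPPORT.items() if python in pythons]
    return max(candidates) if candidates else None

print(newest_with_wheel((3, 13)))  # a 3.13 user would get 1.14.1
print(newest_with_wheel((3, 9)))   # a 3.9 user would stay on 1.13.1
```

This per-interpreter choice is exactly what conflicts with uv's preference for consistent versions across forks, which is why the manual marker split earlier in the thread is needed.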