
Fork when wheels only support a specific range #8492

Open
cbrnr opened this issue Oct 23, 2024 · 13 comments
Labels
enhancement New feature or improvement to existing functionality

Comments

@cbrnr

cbrnr commented Oct 23, 2024

I don't know if this is intentional, but the following project cannot be installed on Python 3.13 if I want to avoid building source distributions:

  1. uv init example
  2. cd example
  3. Use this pyproject.toml:
    [project]
    name = "example"
    version = "0.1.0"
    description = "Add your description here"
    readme = "README.md"
    requires-python = ">=3.9"
    dependencies = [
        "scipy >= 1.13.1",
    ]
    
  4. uv sync --python=3.12 works (same for 3.9, 3.10, and 3.11)
  5. uv sync --python=3.13 --no-build-package=scipy fails, even though it could (should?) use scipy==1.14.1, which is available as a binary wheel.

This surprised me because I expected that by specifying scipy >= 1.13.1, uv would automatically choose a newer version with a binary wheel for Python 3.13. Indeed, there are binary wheels for version 1.13.1 available for Python 3.9 through 3.12, but only version ≥ 1.14.0 has a binary wheel for Python 3.13 (though it no longer supports 3.9). However, it seems that uv is trying to maintain consistent package versions across all supported Python versions.
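The wheel availability described above can be read straight off the wheel filenames on PyPI. A minimal sketch (the filename lists are abbreviated to two wheels per release, and `cp_tags` is a hypothetical helper, not part of uv):

```python
# Sketch: derive supported CPython versions from wheel filenames.
# Filenames are abbreviated from the scipy listings on PyPI; a real
# check would fetch the full file list from the PyPI JSON API.
def cp_tags(filenames):
    """Return the sorted python tags (e.g. 'cp39') found in wheel names."""
    return sorted({name.split("-")[2] for name in filenames if name.endswith(".whl")})

scipy_1_13_1 = [
    "scipy-1.13.1-cp39-cp39-manylinux_2_17_x86_64.manylinux2014_x86_64.whl",
    "scipy-1.13.1-cp312-cp312-manylinux_2_17_x86_64.manylinux2014_x86_64.whl",
]
scipy_1_14_1 = [
    "scipy-1.14.1-cp310-cp310-manylinux_2_17_x86_64.manylinux2014_x86_64.whl",
    "scipy-1.14.1-cp313-cp313-manylinux_2_17_x86_64.manylinux2014_x86_64.whl",
]
print(cp_tags(scipy_1_13_1))  # ['cp312', 'cp39'] -> wheels for 3.9 and 3.12
print(cp_tags(scipy_1_14_1))  # ['cp310', 'cp313'] -> 3.13 wheel, but no 3.9 wheel
```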

@konstin
Member

konstin commented Oct 23, 2024

This is a known problem with scientific packages, unfortunately. Does splitting the scipy version on the Python version, as documented at the bottom of https://github.com/astral-sh/uv/blob/main/docs/reference/build_failures.md (this doc will ship with the next release), help?

@konstin konstin added the question Asking for clarification or support label Oct 23, 2024
@cbrnr
Author

cbrnr commented Oct 23, 2024

Unfortunately not:

[project]
name = "example"
version = "0.1.0"
description = "Add your description here"
readme = "README.md"
requires-python = ">=3.9"
dependencies = [
    "scipy >= 1.13.1; python_version <= '3.12'",
    "scipy >= 1.14.0; python_version >= '3.13'",
]

It tries to build NumPy 2.0.2 from source on Python 3.13; I have no idea why it doesn't just use >= 2.1:

❯ uv sync --python=3.13 --no-build-package=scipy --no-build-package=numpy
Using CPython 3.13.0
Removed virtual environment at: .venv
Creating virtual environment at: .venv
Resolved 4 packages in 0.97ms
error: Distribution `numpy==2.0.2 @ registry+https://pypi.org/simple` can't be installed because it is marked as `--no-build` but has no binary distribution

Maybe because it also uses NumPy 2.0.2 on Python <= 3.12?

Besides, I think splitting like this feels a bit off, since the ranges actually overlap (unlike in the example).

@konstin
Member

konstin commented Oct 23, 2024

You need to also split numpy, probably as a constraint since it's not a direct dependency:

[project]
name = "example"
version = "0.1.0"
description = "Add your description here"
readme = "README.md"
requires-python = ">=3.9"
dependencies = [
  "scipy >=1.13.1,<1.14.0; python_version < '3.13'",
  "scipy >=1.14.0; python_version >= '3.13'",
]

[tool.uv]
constraint-dependencies = [
  # Supports 3.8 to 3.10
  "numpy <=2.1.0,>=1.22.4; python_version < '3.13'",
  # Supports 3.10 to 3.13
  "numpy >=2.1.0; python_version >= '3.13'",
]

Overlapping version ranges are fine in uv's model, but you can also make them exclusive as in the example above.

Fwiw it would be nice if we could make this split automatically for non-abi3 wheels.

@cbrnr
Author

cbrnr commented Oct 23, 2024

Uhm, can you explain the reasoning behind this resolution? Why does uv not pick scipy 1.14.1 for Python 3.13? I'm explicitly specifying scipy >= 1.13.1, so 1.14.1 is perfectly valid?

@hoechenberger

This feels very unintuitive to me, too 🤯

@konstin
Member

konstin commented Oct 23, 2024

There are three parts of the uv resolver interacting here: requires-python, forking, and wheel compatibility. uv will only select versions whose requires-python lower bound is no higher than your project's. For example, your project has requires-python = ">=3.9", while scipy 1.14.1 has Requires-Python: >=3.10 (https://inspector.pypi.io/project/scipy/1.14.1/packages/64/68/3bc0cfaf64ff507d82b1e5d5b64521df4c8bf7e22bc0b897827cbee9872c/scipy-1.14.1-cp310-cp310-macosx_10_13_x86_64.whl/scipy-1.14.1.dist-info/METADATA), so we can't select this version and must choose an older one. (If we selected that version, you would get an error when trying to uv sync on Python 3.9.)
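This lower-bound rule can be illustrated with a toy check (function names here are hypothetical, not uv's, and only simple `>=X.Y` specifiers are handled):

```python
# Toy model of uv's requires-python rule: a candidate version is only
# selectable if its Requires-Python lower bound is no higher than the
# project's, i.e. it must work on every Python the project supports.
def min_supported(requires_python):
    """Extract the lower bound from a '>=X.Y' Requires-Python string."""
    major, minor = requires_python.removeprefix(">=").split(".")[:2]
    return int(major), int(minor)

def selectable(project_rp, candidate_rp):
    return min_supported(candidate_rp) <= min_supported(project_rp)

# Project: requires-python = ">=3.9"; scipy 1.14.1: Requires-Python >=3.10
print(selectable(">=3.9", ">=3.10"))  # False -> scipy 1.14.1 is rejected
print(selectable(">=3.9", ">=3.9"))   # True  -> scipy 1.13.1 is acceptable
```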

uv also has a forking mechanism (https://docs.astral.sh/uv/reference/resolver-internals/#forking): whenever we see that a dependency has different requirements for different markers in the same package, we fork and solve once for one set of markers and once for the other (assuming the markers are disjoint; if they aren't, we split further until we have disjoint environments, with some implementation details to it). Forking can tighten the requires-python range: if we know we're in an environment from a python_version >= "3.12" marker, we can pick a more recent version, such as scipy 1.14.1, since the python_version >= "3.9" and python_version < "3.12" range is already handled by the other fork, which can select an older version.

uv is conservative with forking; we only fork on the above condition. Within a single fork, we have to enforce that there is only a single version per package per resolution (since we can only install a single version). uv also tries to reuse versions between forks when possible, to reduce the divergence between them (fewer distinct versions means fewer things that can go wrong or need testing). The latter is the reason you often have to add an upper bound on the version to get the right forking results.

Pure Python projects don't have an upper bound on their Python compatibility: usually, a package published now will work on future Python versions. Even for most compiled projects with per-Python-version shared libraries, you can use them on a newer version by building the source dist. For these reasons, and for reasons related to how our universal dependency resolution works, we discard upper bounds on requires-python (long explanation in #4022).

For numpy, scipy, and a number of related projects, this assumption isn't true: each version really only works for a specific Python range, and you're also not supposed to build them from source (they often fail to compile on newer Python versions). Here's where uv falls short and you need the manual intervention of specifying python_version markers: uv can't detect that we're resolving a package that has compatibility and wheels for only a specific range, and that we need to fork to satisfy the full Python version range, from the lowest version your project requires to the latest version the dependency supports.

@cbrnr
Author

cbrnr commented Oct 23, 2024

Thanks for this detailed explanation! I do like how thoroughly uv has been designed, but given that the scientific ecosystem pushes it beyond its boundaries, I feel it is very hard to use uv in this context. In the simple example with just a single scipy dependency, I have to manually specify constraints for numpy, but a more realistic project will probably have dozens of such constraints that I don't want to handle manually.

Fwiw it would be nice if we could make this split automatically for non-abi3 wheels.

I'm not sure if I understand what you're saying, but do you think that uv should do this automatically so that my original example "just works"?

@cbrnr
Author

cbrnr commented Oct 24, 2024

Also, pip resolves these requirements differently: it installs scipy==1.14.1 on >=3.10 and scipy==1.13.1 on ==3.9 (together with suitable binary wheels for numpy). This is the expected behavior at least for me, but probably also for the Python community.

@hoechenberger

This is the expected behavior at least for me, but probably also for the Python community.

I would like to second this ☝️

@konstin konstin added enhancement New feature or improvement to existing functionality and removed question Asking for clarification or support labels Oct 25, 2024
@konstin konstin changed the title Dependency resolution across Python versions Fork when wheels only support a specific range Oct 25, 2024
@konstin
Member

konstin commented Oct 25, 2024

For numpy at least, Requires-Python doesn't tell us about the upper bounds (https://inspector.pypi.io/project/numpy/2.1.2/packages/1c/a2/40a76d357f168e9f9f06d6cc2c8e22dd5fb2bfbe63fe2c433057258c145a/numpy-2.1.2-cp310-cp310-macosx_10_9_x86_64.whl/numpy-2.1.2.dist-info/METADATA#line.999), so we would need to find a strategy for forking based on the wheels on PyPI, e.g.:

Pick a numpy version, analyse the wheels at https://pypi.org/project/numpy/#files, and if we see a consistent range of compiled cp3x tags while a newer version covers a wider range, fork so we can pick a newer numpy on newer Python versions.
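A hypothetical sketch of that heuristic, with wheel-tag coverage precomputed per release (the function and data shapes are mine, not uv's):

```python
# coverage maps a release to the (lowest, highest) CPython minor version
# with compiled cp3x wheels; we fork just above the oldest release's
# ceiling when a newer release extends wheel coverage past it.
def fork_point(coverage):
    releases = sorted(coverage)  # releases as comparable tuples
    _, old_hi = coverage[releases[0]]
    new_lo, new_hi = coverage[releases[-1]]
    if new_hi > old_hi and new_lo <= old_hi + 1:
        return old_hi + 1  # fork markers: python_version < / >= '3.{fork}'
    return None

# scipy 1.13.1 ships cp39..cp312 wheels; 1.14.1 ships cp310..cp313
print(fork_point({(1, 13, 1): (9, 12), (1, 14, 1): (10, 13)}))  # 13
```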

@konstin
Member

konstin commented Oct 29, 2024

Another possible solution would be #7190

@henryiii
Contributor

henryiii commented Oct 29, 2024

You should never add an upper bound to Requires-Python. Most packages don't know their upper bound until after they are released, and even for ones that know they won't support the next version (like numpy), the existing handling of this field is wrong, and tools like pip and poetry break if you cap it. Looking at the available binary wheels to guide version ranges would be really helpful, though, I think, especially if #7190 wasn't the default. You could also check whether classifiers are present, and, for packages with universal wheels, treat a classifier disappearing as you look at older releases as the starting point of support for that Python version. Many packages don't have classifiers, but for the ones that do, having them is a sign of solid support. (That's a lot weirder, though; I don't think a single tool today checks classifiers during a solve.)
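The classifier heuristic could be sketched as follows (the release data and function name are hypothetical, chosen to resemble numpy's 2.x releases):

```python
# Walking releases from newest to oldest, support for Python 3.X starts
# at the oldest consecutive release still carrying the
# "Programming Language :: Python :: 3.X" classifier.
def first_supporting(releases, classifier):
    """releases: list of (version, classifiers), newest first."""
    oldest = None
    for version, classifiers in releases:
        if classifier in classifiers:
            oldest = version
        else:
            break  # classifier disappeared: support starts at `oldest`
    return oldest

releases = [
    ((2, 1, 2), {"Programming Language :: Python :: 3.13"}),
    ((2, 1, 0), {"Programming Language :: Python :: 3.13"}),
    ((2, 0, 2), set()),  # no 3.13 classifier on this older release
]
print(first_supporting(releases, "Programming Language :: Python :: 3.13"))  # (2, 1, 0)
```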

https://discuss.python.org/t/requires-python-upper-limits/12663 was a discussion about what to do with upper caps.

@cbrnr
Author

cbrnr commented Oct 29, 2024

Yes, I think #7190 would be a great default. In my example at the top, I'm not setting an upper bound at all.
