
uv pip install a wheel with extra index url doesn't resolve, normal pip works in fresh env #2718

Closed
rbavery opened this issue Mar 28, 2024 · 3 comments
Labels: question (Asking for clarification or support)

Comments


rbavery commented Mar 28, 2024

  • A minimal code snippet that reproduces the bug.

I've attached a .zip containing the cpu wheel I'm trying to install with uv.

This works fine in a fresh conda env with Python 3.10:

pip install sedonaai_cpu-0.1.1-py3-none-any.whl --extra-index-url https://download.pytorch.org/whl/cpu

But uv fails to resolve in a fresh venv with Python 3.10:

→ uv pip install ./sedonaai_cpu-0.1.1-py3-none-any.whl --extra-index-url https://download.pytorch.org/whl/cpu --prerelease=allow
  × No solution found when resolving dependencies:
  ╰─▶ Because torch==2.2.0 is unusable because no wheels are available with a matching Python implementation and torch==2.2.1 is unusable because no wheels are available with a matching
      Python implementation, we can conclude that any of:
          torch==2.2.0
          torch==2.2.1
       cannot be used.
      And because torch==2.2.2 is unusable because no wheels are available with a matching Python implementation, we can conclude that any of:
          torch==2.2.0
          torch==2.2.1
          torch==2.2.2
       cannot be used. (1)

      Because only the following versions of torchvision are available:
          torchvision<=0.17.0
          torchvision==0.17.0+cpu
          torchvision==0.17.1
          torchvision==0.17.1+cpu
          torchvision==0.17.2
          torchvision==0.17.2+cpu
          torchvision>=0.18.0
      and torchvision==0.17.0 is unusable because no wheels are available with a matching Python implementation, we can conclude that torchvision>=0.17.0,<0.17.0+cpu cannot be used.
      And because torchvision==0.17.0+cpu depends on torch==2.2.0, we can conclude that torchvision>=0.17.0,<0.17.1 depends on torch==2.2.0.
      And because torchvision==0.17.1 is unusable because no wheels are available with a matching Python implementation and torchvision==0.17.1+cpu depends on torch==2.2.1, we can
      conclude that torchvision>=0.17.0,<0.17.2 depends on one of:
          torch==2.2.0
          torch==2.2.1

      And because torchvision==0.17.2 is unusable because no wheels are available with a matching Python implementation and torchvision==0.17.2+cpu depends on torch==2.2.2, we can
      conclude that torchvision>=0.17.0 depends on one of:
          torch==2.2.0
          torch==2.2.1
          torch==2.2.2

      And because we know from (1) that any of:
          torch==2.2.0
          torch==2.2.1
          torch==2.2.2
       cannot be used, we can conclude that torchvision>=0.17.0 cannot be used.
      And because sedonaai-cpu==0.1.1 depends on torchvision>=0.17.0, we can conclude that sedonaai-cpu==0.1.1 cannot be used.
      And because only sedonaai-cpu==0.1.1 is available and you require sedonaai-cpu, we can conclude that the requirements are unsatisfiable.
  • The command you invoked (e.g., uv pip sync requirements.txt), ideally including the --verbose flag.

The logs with --verbose are attached:
logs.txt

artifact.zip

charliermarsh (Member) commented:

My guess is that this is related to our limitations around local version identifiers: https://github.com/astral-sh/uv/blob/main/PIP_COMPATIBILITY.md#local-version-identifiers.

Can you look at the versions of Torch and Torchvision that pip is installing, and add them as direct dependencies? Like:

uv pip install sedonaai_cpu-0.1.1-py3-none-any.whl torchvision==0.17.0+cpu torch==2.2.0+cpu --extra-index-url https://download.pytorch.org/whl/cpu
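
The same pins can also live in a requirements file. This is only a sketch, assuming the CPU builds named in the resolver output above (torch==2.2.0+cpu, torchvision==0.17.0+cpu) are the ones pip actually selects; the file name requirements-cpu.txt is made up for illustration:

# requirements-cpu.txt (hypothetical)
torch==2.2.0+cpu
torchvision==0.17.0+cpu
./sedonaai_cpu-0.1.1-py3-none-any.whl

uv pip install -r requirements-cpu.txt --extra-index-url https://download.pytorch.org/whl/cpu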

charliermarsh added the question label on Mar 28, 2024
charliermarsh (Member) commented:

Closing for now in the absence of more info, but let me know if you have other questions.
