
Use pyproject.toml to specify build dependencies #193

Merged 1 commit into Dao-AILab:main on May 12, 2023

Conversation

anthonyhu (Contributor)

This PR introduces a pyproject.toml file to list the build dependencies (see https://snarky.ca/what-the-heck-is-pyproject-toml/).

Before, the build dependencies were listed in setup.py, but reading that file required those very dependencies (packaging, torch) to already be installed. With a pyproject.toml file, Python knows which packages are needed to build flash-attention before the build starts.
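For reference, a minimal build-requirements table along these lines might look as follows (the exact package list and backend here are assumptions for illustration, not necessarily what this PR adds):

```toml
# Hypothetical sketch of a PEP 518 [build-system] table;
# the exact packages and backend are assumptions.
[build-system]
requires = ["setuptools", "wheel", "packaging", "torch"]
build-backend = "setuptools.build_meta"
```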

ntoxeg commented May 10, 2023

This might make the build PEP 517 compatible, which would fix #209. I would be grateful if this could be prioritized, as I can't install the package otherwise.

ntoxeg commented May 10, 2023

I can confirm it fixes #209. Please merge ASAP.

@tridao tridao merged commit 36d0a19 into Dao-AILab:main May 12, 2023
tridao (Contributor) commented Jul 3, 2023

I had to remove pyproject.toml for now since I couldn't find a way to add torch as a build dependency that works for everyone.
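For context, a common workaround when torch is needed at build time (a general pip technique, not necessarily what this project recommends) is to disable PEP 517 build isolation so the build reuses the torch already installed in the environment instead of pulling one into an isolated build environment:

```shell
# Install torch first, then build against it by disabling
# pip's isolated build environment (--no-build-isolation is
# a standard pip flag).
pip install torch
pip install flash-attn --no-build-isolation
```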

kklemon commented Sep 28, 2023

@tridao That's really unfortunate. Removing the pyproject.toml breaks PEP 517 compliance and thus Poetry compatibility.

Handling the PyTorch dependency with that build system is indeed not trivial. In my own projects, I use one of the following solutions:

  1. Just don't declare torch as a dependency and let users install their preferred version themselves. This probably only works for small projects where that is reasonable.
  2. Use a custom source definition in the pyproject.toml and point it at a specific torch version. This is my preferred solution at the moment. It does not solve the problem of exact CUDA version matching, but it should still work for most users. If the user overrides the torch version, e.g. with a build for a specific CUDA version, the Poetry dependency resolver will typically respect it as long as the pyproject.toml's version constraint is still met.

For the second solution, an example dependency configuration would look as follows:

[tool.poetry.dependencies]
torch = {version = "^2.0.1", source = "pytorch-gpu-src"}

[[tool.poetry.source]]
name = "pytorch-gpu-src"
url = "https://download.pytorch.org/whl/cu118"
priority = "explicit"
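For anyone reproducing this setup, an equivalent source entry can also be generated from the command line; the commands below are a sketch assuming a recent Poetry (the `--priority` option requires Poetry >= 1.5):

```shell
# Register the PyTorch wheel index as an explicit source,
# then pin torch to it (assumes Poetry >= 1.5).
poetry source add --priority=explicit pytorch-gpu-src https://download.pytorch.org/whl/cu118
poetry add --source pytorch-gpu-src torch
```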

ziyuhuang123 pushed a commit to ziyuhuang123/flash-attention that referenced this pull request Jan 21, 2024
Use pyproject.toml to specify build dependencies