forked from huggingface/tgi-gaudi
Making `make install` work better by default. (huggingface#2004)
# What does this PR do?

Making `make install` a much saner default for starting local dev environments.
commit ed89135 · 1 parent 648dd7b

Showing 9 changed files with 347 additions and 299 deletions.
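The recurring change across the Makefiles below is an idempotency guard: each build recipe now clones and compiles only when its checkout directory is missing, so `make install` can be re-run in an existing dev environment without tripping over a previous clone. A minimal sketch of the pattern, using a hypothetical `some-dep` repository (the target name and URL are placeholders, not part of this commit):

```make
# Hypothetical target illustrating the guard pattern applied in this
# commit: clone and build only when the checkout is absent, so the
# recipe is safe to re-run.
build-some-dep:
	if [ ! -d 'some-dep' ]; then \
		git clone https://example.com/some-dep.git some-dep && \
		cd some-dep && python setup.py build; \
	fi
```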
`server/Makefile-flash-att`:

```diff
@@ -1,16 +1,14 @@
 flash_att_commit := 3a9bfd076f98746c73362328958dbc68d145fbec
 
-flash-attention:
-	# Clone flash attention
-	pip install -U packaging ninja --no-cache-dir
-	git clone https://github.com/HazyResearch/flash-attention.git
-
-build-flash-attention: flash-attention
-	cd flash-attention && git fetch && git checkout $(flash_att_commit)
-	cd flash-attention && python setup.py build
-	cd flash-attention/csrc/rotary && python setup.py build
-	cd flash-attention/csrc/layer_norm && python setup.py build
+build-flash-attention:
+	if [ ! -d 'flash-attention' ]; then \
+		pip install -U packaging ninja --no-cache-dir && \
+		git clone https://github.com/HazyResearch/flash-attention.git && \
+		cd flash-attention && git fetch && git checkout $(flash_att_commit) && \
+		MAX_JOBS=8 python setup.py build && cd csrc/layer_norm && python setup.py build && cd ../rotary && python setup.py build; \
+	fi
 
 install-flash-attention: build-flash-attention
-	pip uninstall flash_attn rotary_emb dropout_layer_norm -y || true
-	cd flash-attention && python setup.py install && cd csrc/layer_norm && python setup.py install && cd ../rotary && python setup.py install
+	if [ ! -d 'flash-attention' ]; then \
+		cd flash-attention && python setup.py install && cd csrc/layer_norm && python setup.py install && cd ../rotary && python setup.py install; \
+	fi
```
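Beyond the guard, the rewritten recipe sets `MAX_JOBS=8`, which flash-attention's ninja-driven extension build honors to cap parallel compile jobs and keep peak memory manageable. The same knob applies to a manual source build, e.g.:

```sh
# Build flash-attention by hand; MAX_JOBS limits how many compile jobs
# ninja runs in parallel during the CUDA extension build.
git clone https://github.com/HazyResearch/flash-attention.git
cd flash-attention
MAX_JOBS=8 python setup.py build
```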
`server/Makefile-flash-att-v2`:

```diff
@@ -1,29 +1,24 @@
-flash_att_v2_commit_cuda := v2.5.8
+flash_att_v2_commit_cuda := v2.5.9.post1
 flash_att_v2_commit_rocm := 2554f490101742ccdc56620a938f847f61754be6
 
-flash-attention-v2-cuda:
-	# Clone flash attention
-	pip install -U packaging ninja --no-cache-dir
-	git clone https://github.com/Dao-AILab/flash-attention.git flash-attention-v2
+build-flash-attention-v2-cuda:
+	pip install -U packaging wheel
+	pip install flash-attn==$(flash_att_v2_commit_cuda)
 
-build-flash-attention-v2-cuda: flash-attention-v2-cuda
-	cd flash-attention-v2 && git fetch && git checkout $(flash_att_v2_commit_cuda)
-	cd flash-attention-v2 && git submodule update --init --recursive
-	cd flash-attention-v2 && python setup.py build
-
-install-flash-attention-v2-cuda: build-flash-attention-v2-cuda
-	cd flash-attention-v2 && git submodule update --init --recursive && python setup.py install
+install-flash-attention-v2-cuda:
+	pip install -U packaging wheel
+	pip install flash-attn==$(flash_att_v2_commit_cuda)
 
-flash-attention-v2-rocm:
-	# Clone flash attention
-	pip install -U packaging ninja --no-cache-dir
-	git clone https://github.com/ROCm/flash-attention.git flash-attention-v2
-
-build-flash-attention-v2-rocm: flash-attention-v2-rocm
-	cd flash-attention-v2 && git fetch && git checkout $(flash_att_v2_commit_rocm)
-	cd flash-attention-v2 && git submodule update --init --recursive
-	cd flash-attention-v2 && GPU_ARCHS="gfx90a;gfx942" PYTORCH_ROCM_ARCH="gfx90a;gfx942" python setup.py build
+build-flash-attention-v2-rocm:
+	if [ ! -d 'flash-attention-v2' ]; then \
+		pip install -U packaging ninja --no-cache-dir && \
+		git clone https://github.com/ROCm/flash-attention.git flash-attention-v2 && \
+		cd flash-attention-v2 && git fetch && git checkout $(flash_att_v2_commit_rocm) && \
+		git submodule update --init --recursive && GPU_ARCHS="gfx90a;gfx942" PYTORCH_ROCM_ARCH="gfx90a;gfx942" python setup.py build; \
+	fi
 
 install-flash-attention-v2-rocm: build-flash-attention-v2-rocm
-	cd flash-attention-v2 && git submodule update --init --recursive && python setup.py install
+	if [ ! -d 'flash-attention-v2' ]; then \
+		cd flash-attention-v2 && \
+		GPU_ARCHS="gfx90a;gfx942" PYTORCH_ROCM_ARCH="gfx90a;gfx942" python setup.py install; \
+	fi
```
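For CUDA, the source build disappears entirely: `flash_att_v2_commit_cuda` now holds a release tag instead of a git commit, and both targets just install the pinned wheel. Outside of make, the equivalent is:

```sh
# Equivalent of the new CUDA targets: grab the pinned prebuilt wheel
# rather than compiling flash-attention v2 from source.
pip install -U packaging wheel
pip install flash-attn==v2.5.9.post1
```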
`server/Makefile-vllm`:

```diff
@@ -1,25 +1,26 @@
-vllm-cuda:
-	# Clone vllm
-	pip install -U ninja packaging --no-cache-dir
-	git clone https://github.com/Narsil/vllm.git vllm
-
-build-vllm-cuda: vllm-cuda
-	cd vllm && git fetch && git checkout b5dfc61db88a81069e45b44f7cc99bd9e62a60fa
-	cd vllm && python setup.py build
-
+build-vllm-cuda:
+	if [ ! -d 'vllm' ]; then \
+		pip install -U ninja packaging --no-cache-dir && \
+		git clone https://github.com/Narsil/vllm.git vllm &&\
+		cd vllm && \
+		git fetch && git checkout b5dfc61db88a81069e45b44f7cc99bd9e62a60fa &&\
+		python setup.py build; \
+	fi
 install-vllm-cuda: build-vllm-cuda
-	pip uninstall vllm -y || true
-	cd vllm && python setup.py install
-
-vllm-rocm:
-	# Clone vllm
-	pip install -U ninja packaging --no-cache-dir
-	git clone https://github.com/fxmarty/rocm-vllm.git vllm
+	if [ ! -d 'vllm' ]; then \
+		cd vllm && pip install -e .; \
+	fi
 
-build-vllm-rocm: vllm-rocm
-	cd vllm && git fetch && git checkout ca6913b3c2ffacdcb7d15e914dc34adbc6c89479
-	cd vllm && PYTORCH_ROCM_ARCH="gfx90a;gfx942" python setup.py install
+build-vllm-rocm:
+	if [ ! -d 'vllm' ]; then \
+		pip install -U ninja packaging --no-cache-dir && \
+		git clone https://github.com/fxmarty/rocm-vllm.git vllm && \
+		cd vllm && git fetch && git checkout ca6913b3c2ffacdcb7d15e914dc34adbc6c89479 && \
+		PYTORCH_ROCM_ARCH="gfx90a;gfx942" python setup.py build; \
+	fi
 
 install-vllm-rocm: build-vllm-rocm
-	pip uninstall vllm -y || true
-	cd vllm && python setup.py install
+	if [ ! -d 'vllm' ]; then \
+		cd vllm && \
+		PYTORCH_ROCM_ARCH="gfx90a;gfx942" pip install -e .; \
+	fi
```
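Both vllm install targets also switch from `python setup.py install` to `pip install -e .`, an editable install that leaves the checkout as the package's import source, so local edits to the cloned fork take effect without reinstalling. Standalone, the step looks like:

```sh
# Editable install: site-packages points back at the vllm checkout,
# so changes to the cloned sources apply without a reinstall.
cd vllm
pip install -e .
```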