
xformers error #20

Open
shirokalu opened this issue Jun 18, 2024 · 6 comments
@shirokalu

Hi, I ran pip install xformers-0.0.23.post1-cp39-cp39-manylinux2014_x86_64.whl, but I get an error: "xFormers wasn't build with CUDA support"

[screenshot of the error]

@pengHTYX
Owner

@shirokalu, hi, can you provide more information about how you are running it?

@kungfooman

wget https://download.pytorch.org/whl/cu118/xformers-0.0.26.post1%2Bcu118-cp39-cp39-manylinux2014_x86_64.whl#sha256=8e862ec2507d2df58b4f1320043c4e5c1496a1c2c9e5c446392b9c9d6bd6ceb7
pip install xformers-0.0.26.post1+cu118-cp39-cp39-manylinux2014_x86_64.whl

(the way I understand it, the cu118 in the URL means the wheel was built with CUDA 11.8 support)
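As a sanity check before installing, the tags in the wheel filename can be parsed to confirm they match the target environment (Python 3.9, CUDA 11.8). A minimal sketch, assuming the standard wheel naming convention; wheel_tags is a hypothetical helper, not part of pip:

```python
def wheel_tags(filename):
    """Parse a wheel filename of the standard form
    {name}-{version}[+{local}]-{python}-{abi}-{platform}.whl
    into its tags. The '+cuXXX' local-version label is the CUDA build tag;
    a wheel without one ships CPU-only kernels."""
    stem = filename[: -len(".whl")]
    name, version, python, abi, platform = stem.split("-")
    local = version.split("+")[1] if "+" in version else None
    return {
        "name": name,
        "version": version,
        "cuda": local,        # e.g. "cu118", or None for a CPU-only wheel
        "python": python,     # e.g. "cp39" -> CPython 3.9
        "abi": abi,
        "platform": platform,
    }

tags = wheel_tags("xformers-0.0.26.post1+cu118-cp39-cp39-manylinux2014_x86_64.whl")
# tags["cuda"] is "cu118" and tags["python"] is "cp39", so this wheel
# matches a Python 3.9 environment with a CUDA 11.8 build of torch.
```

The original wheel shirokalu installed parses the same way but with no "+cu…" label visible in the error, which is consistent with a CPU-only build.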

@pengHTYX
Owner

It seems that xformers+cu118 was not installed successfully.

@kungfooman

It seems that xformers+cu118 was not installed successfully.

That's right, xformers was installed, but without CUDA support.

One can use this command for testing: python -m xformers.info

Output should be something like:

Unable to find python bindings at /usr/local/dcgm/bindings/python3. No data will be captured.
xFormers 0.0.23.post1+cu118
memory_efficient_attention.cutlassF:               available
memory_efficient_attention.cutlassB:               available
memory_efficient_attention.decoderF:               available
memory_efficient_attention.flshattF@…:             available
memory_efficient_attention.flshattB@…:             available
memory_efficient_attention.smallkF:                available
memory_efficient_attention.smallkB:                available
memory_efficient_attention.tritonflashattF:        unavailable
memory_efficient_attention.tritonflashattB:        unavailable
memory_efficient_attention.triton_splitKF:         available
indexing.scaled_index_addF:                        available
indexing.scaled_index_addB:                        available
indexing.index_select:                             available
swiglu.dual_gemm_silu:                             available
swiglu.gemm_fused_operand_sum:                     available
swiglu.fused.p.cpp:                                available
is_triton_available:                               True
pytorch.version:                                   2.1.2+cu118
pytorch.cuda:                                      available
gpu.compute_capability:                            8.6
gpu.name:                                          NVIDIA GeForce RTX 3090
dcgm_profiler:                                     unavailable
build.info:                                        available
build.cuda_version:                                1108
build.python_version:                              3.9.18
build.torch_version:                               2.1.2+cu118
build.env.TORCH_CUDA_ARCH_LIST:                    5.0+PTX 6.0 6.1 7.0 7.5 8.0+PTX 9.0
build.env.XFORMERS_BUILD_TYPE:                     Release
build.env.XFORMERS_ENABLE_DEBUG_ASSERTIONS:        None
build.env.NVCC_FLAGS:                              None
build.env.XFORMERS_PACKAGE_FROM:                   wheel-v0.0.23.post1
build.nvcc_version:                                11.8.89
source.privacy:                                    open source

At least Era3D works for me 😅
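One detail worth checking in the output above is that the "+cuXXX" suffix on pytorch.version matches the one on the xFormers version line, since a mismatch is exactly what produces the "not built with CUDA support" error. A minimal sketch of that comparison, assuming both projects use this local-version convention:

```python
def cuda_tag(version_string):
    """Return the 'cuXXX' build tag of a version like '2.1.2+cu118',
    or None for a CPU-only build with no local-version label."""
    _, _, local = version_string.partition("+")
    return local if local.startswith("cu") else None

# Matched builds, as in the xformers.info output above:
assert cuda_tag("2.1.2+cu118") == cuda_tag("0.0.23.post1+cu118")

# A CPU-only wheel has no CUDA tag at all:
assert cuda_tag("0.0.23.post1") is None
```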

@shirokalu
Author

It seems that xformers+cu118 was not installed successfully.

That's right, xformers was installed, but without CUDA support.

One can use this command for testing: python -m xformers.info

Output should be something like:

[same xFormers info output as quoted above]

At least Era3D works for me 😅

I tried to install xformers-0.0.26.post1+cu118-cp39-cp39-manylinux2014_x86_64, but the installation insisted on pulling in torch==2.3 as a dependency. I attempted to install it with the --no-deps option, but it still didn't work.

@kungfooman

I tried to install xformers-0.0.26.post1+cu118-cp39-cp39-manylinux2014_x86_64, but the installation insisted that I install the dependency torch==2.3. I attempted to install it with the --no-deps option, but it still didn't work.

It also installed torch==2.3 for me, but it worked nicely when running python app.py (not this repo, but the Hugging Face one).
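For what it's worth, the behavior described above is just pip enforcing the torch requirement recorded in the wheel's own metadata, which the already installed torch 2.1.2+cu118 does not satisfy. A rough sketch of that check (the exact pin value and the simplified comparison are assumptions for illustration, not pip's real PEP 440 resolver):

```python
def satisfies_pin(installed, pin):
    """Minimal check of a 'torch==2.3' style pin against an installed
    version. Strips the '+cuXXX' local tag first, since version
    comparisons ignore local-version labels."""
    base = installed.split("+")[0]
    return base == pin or base.startswith(pin + ".")

# torch 2.1.2+cu118 fails the pin, so pip replaces it with torch==2.3:
assert not satisfies_pin("2.1.2+cu118", "2.3")

# Any 2.3.x build, including a cu118 one, would satisfy it:
assert satisfies_pin("2.3.0+cu118", "2.3")
```

This is why --no-deps alone doesn't help: the wheel still imports against the newer torch ABI at runtime even if pip is told not to resolve the dependency.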
