Support ROCm #807
Hi,
May I know what the minimum set of changes would be for the dev team to be able to develop and run tests on AMD GPUs? Is Nvidia also directly helping the dev team acquire up-to-date hardware to develop and run tests on for the Nvidia side of things? 👀
Hi @tedliosu |
Thank you for the clarification. So what you're saying basically boils down to this: supporting an additional GPU vendor is at least twice the amount of work on the devs' end, and there is also a lack of AMD GPU "market share" among the companies the dev team works with? Also, I remember reading in the issues here that pull requests adding xformers support for AMD GPUs are welcome. Since the devs currently have no way of testing any code on AMD GPUs, will any pull request that adds AMD support also need to include AMD-specific tests from the submitter to ensure its correctness?
Ideally we would want to have a discussion with the person first, as this might be quite a lot of work, and we need to evaluate the best way forward in that direction. But in principle we welcome contributions :)
Could you elaborate on who you mean by "the person"? AMD has the best market share on the gaming side, but Nvidia obviously leads in AI, and lately overall because of AI; for AI work, compatibility leans toward Nvidia. Still, ROCm is making strides: on Linux, all current AMD GPUs are supported, though on Windows only the latest AMD GPUs are. The blocker would more than likely just be Triton, I agree, but ROCm has open-source repos for the runtimes and so on, which can be found here: https://github.com/RadeonOpenCompute. It is essentially a wrapper for CUDA.
I have made multiple guides, which I have merged into two, on installing ROCm on Linux for different platforms.
Someone who has both the knowledge and time to add support for ROCm and maintain it going forward. My understanding is that ROCm can compile CUDA code into something that runs on AMD GPUs. However, this is not possible for the CUDA code we write, because we rely on third-party libraries like CUTLASS that use inline PTX, which is not compilable with ROCm. But if you find a way to get our kernels to run with ROCm, we can discuss it.
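The translation idea above can be illustrated with a toy version of what AMD's hipify scripts do: they map CUDA API names to their HIP equivalents at the source level, but an inline-PTX `asm volatile` block has no HIP counterpart, so it passes through untranslated and then fails to compile under ROCm. This is a hand-rolled illustrative sketch, not the real hipify tool; the mapping table is a tiny hand-picked subset.

```python
# Toy sketch of CUDA -> HIP source translation (NOT the real hipify tool;
# hipify-perl/hipify-clang use a far larger mapping and real parsing).

CUDA_TO_HIP = {
    "cudaMalloc": "hipMalloc",
    "cudaMemcpy": "hipMemcpy",
    "cudaFree": "hipFree",
    "cudaDeviceSynchronize": "hipDeviceSynchronize",
}

def toy_hipify(cuda_source: str) -> str:
    """Textually replace known CUDA API names with their HIP equivalents."""
    for cuda_name, hip_name in CUDA_TO_HIP.items():
        cuda_source = cuda_source.replace(cuda_name, hip_name)
    return cuda_source

plain = "cudaMalloc(&ptr, n); cudaDeviceSynchronize();"
print(toy_hipify(plain))  # CUDA runtime calls translate cleanly

# Inline PTX has no HIP equivalent: it survives "translation" unchanged,
# which is why CUTLASS-style kernels cannot simply be hipified.
ptx = 'asm volatile("mov.u32 %0, %%laneid;" : "=r"(lane));'
print(toy_hipify(ptx) == ptx)
```

The takeaway is that textual translation only works for code written against the portable subset of the CUDA runtime API; anything that drops down to PTX assembly needs a hand-written HIP/ISA equivalent.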
Seems AMD has forked this to add support: https://github.com/ROCm/xformers/
We're working with AMD to add minimal support for AMD GPUs to xFormers: |
So it won't be something I can use via pip install? Or how would that work?
Most likely you will need to build from source (which is also possible via pip, but takes more time), and it would only support inference at first |
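A source build via pip would typically look something like the following. This is a hedged sketch, not an official install recipe: the repository URL assumes the ROCm fork linked above, and the exact branch or flags you need may differ.

```shell
# Sketch: build xformers from source with pip (much slower than a wheel).
pip install ninja   # optional, speeds up the C++/CUDA-style extension build

# Repository URL is an assumption based on the ROCm fork linked in this thread.
pip install -v --no-build-isolation "git+https://github.com/ROCm/xformers.git"

# Afterwards, report which kernels/ops were actually built and are usable:
python -m xformers.info
```

`python -m xformers.info` is the usual way to confirm which memory-efficient attention backends are available after a source build.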
Can you update me when the pull request goes through?
Hmm, I believe you should be able to subscribe to PR #978.
Is this going to be an ongoing project for a while (before the first testable version is finished)?
🚀 Feature
Support ROCm on AI generation
Motivation
I would like to be able to use xformers on my Linux ROCm install of Stable Diffusion.