Please make sure to include "CUDA is Required" in the readme. #53
Comments
I understand your situation, and I apologize for any inconvenience caused by the installation process. If you don't have a GPU, you can use the CPU version of the transformers. However, I use macOS, so I cannot test whether it works on other devices besides MPS!
By the way, on which step did you get this message?
After installing everything and running the project. I haven't found any way of using transformers without CUDA; can you elaborate further?
In your config.json, change the device to CPU and set gpu layers to 0.
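For reference, here is a minimal sketch of what that setting corresponds to at the library level, assuming the project loads models through ctransformers (the model path and model type below are placeholders): passing `gpu_layers=0` keeps every layer on the CPU, so CUDA is never needed.

```python
from ctransformers import AutoModelForCausalLM

# gpu_layers=0 means no layers are offloaded to the GPU, so the model runs
# entirely on the CPU and no CUDA-capable device is required.
llm = AutoModelForCausalLM.from_pretrained(
    "path/to/model.bin",   # placeholder: a local GGML model file
    model_type="llama",    # placeholder: use the type matching your model
    gpu_layers=0,
)

print(llm("Hello, world"))
```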
Wait a minute, we're already working on a solution. If you don't mind, can you please DM me on Discord (search for …)?
I made a pull request that will use CPU by default.
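The sketch below illustrates what a CPU-by-default load could look like; it is not the actual pull request, and the `device`, `gpu_layers`, and `model_type` config keys are assumptions carried over from the comment above.

```python
from ctransformers import AutoModelForCausalLM

def load_model(model_path: str, config: dict):
    """Load a GGML model, offloading layers to the GPU only if the config asks for it."""
    use_gpu = config.get("device", "cpu").lower() in ("gpu", "cuda")
    gpu_layers = config.get("gpu_layers", 0) if use_gpu else 0  # default: pure CPU
    return AutoModelForCausalLM.from_pretrained(
        model_path,
        model_type=config.get("model_type", "llama"),
        gpu_layers=gpu_layers,
    )

# Example: no "device" key in the config, so the model loads on the CPU.
llm = load_model("path/to/model.bin", {})  # placeholder model path
```

With a default like this, a machine without a CUDA-capable GPU never hits the `ggml-cuda.cu` error quoted below, because no layers are ever offloaded.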
Is your feature request related to a problem? Please describe.
I put a few hours into installation without knowing that CUDA was required. My system doesn't have a GPU, just a 12th-gen i7, so I thought I could handle it. After a few hours of struggle, I got this message, and it genuinely upset me:
CUDA error 100 at /home/runner/work/ctransformers/ctransformers/models/ggml/ggml-cuda.cu:5067: no CUDA-capable device is detected
Describe the solution you'd like
Please state clearly in the readme, in big letters, that CUDA is required.