
GPU is not utilized on MacBook with M3 Max and 128 GB RAM #946

Open
Gabbelgu opened this issue Oct 8, 2024 · 0 comments
Gabbelgu commented Oct 8, 2024

Describe the bug
I can't get the GPU to be utilized on my MacBook. Other apps, such as local LLM runners, can use up to 70 GB of RAM for the graphics processor.

To Reproduce
Steps to reproduce the behavior:
I've enabled CoreML, set Max. Number of Threads = 18, and enabled GFPGAN and the other processors.
The same problem occurs with Max. Number of Threads = 3 and with Max. Number of Threads = 8.

My configuration is:

MacBook Pro 16" 2023
M3 Max
128 GB RAM
Python 3.11
The processing rate is quite low, around 1-2 s/frame, and it frequently stalls, making no progress for 3-5 s before resuming at 1-2 s/frame.
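For anyone hitting the same symptom: roop unleashed runs its models through onnxruntime, and on Apple Silicon the GPU is only used if the CoreML execution provider is actually available and selected; otherwise inference silently falls back to the CPU, which would match the low frame rate described above. Below is a small diagnostic sketch (the `pick_providers` helper is hypothetical, for illustration only; `onnxruntime.get_available_providers()` is the real API to check what your install supports):

```python
import platform

def pick_providers(available):
    """Hypothetical helper: prefer CoreML on Apple Silicon, fall back to CPU.

    `available` is the list returned by onnxruntime.get_available_providers().
    If CoreMLExecutionProvider is missing from it, the GPU was never an
    option and everything runs on the CPU.
    """
    preferred = ["CoreMLExecutionProvider", "CPUExecutionProvider"]
    chosen = [p for p in preferred if p in available]
    return chosen or ["CPUExecutionProvider"]

if __name__ == "__main__":
    print(f"Platform: {platform.machine()}")
    try:
        import onnxruntime as ort
        available = ort.get_available_providers()
    except ImportError:
        # Fallback so the sketch runs even without onnxruntime installed.
        available = ["CPUExecutionProvider"]
    print("Available providers:", available)
    print("Would use:", pick_providers(available))
```

If `CoreMLExecutionProvider` does not appear in the available providers, the GPU cannot be used regardless of the UI setting; a common cause is having plain `onnxruntime` installed instead of a build with CoreML support (e.g. `onnxruntime-silicon`). This is a diagnostic suggestion, not a confirmed fix for this issue.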

Details
What OS are you using?

  • Linux
  • Linux in WSL
  • Windows
  • [x] Mac

Are you using a GPU?

  • No. CPU FTW
  • NVIDIA
  • AMD
  • Intel
  • [x] Mac

Which version of roop unleashed are you using?
4.3.1

