CUDA out of memory issue #40
From our README:

> Our CUDA GPU has 8GB, and we thought that was enough.
You could try increasing the number of splits (i.e., how many chunks the data is split into before passing it to the GPU) to reduce the GPU memory requirement (sorry, it's a bit hardcoded for now). See Rayuela.jl/demos/demos_train_query_base.jl, lines 61 to 62, at ccf22ab.
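The idea behind increasing the number of splits can be sketched like this — a hypothetical `process_in_splits` helper, not Rayuela.jl's actual code, where `f` stands in for whatever GPU kernel is applied to each chunk:

```julia
# Split the columns of X into `nsplits` chunks and process them one at a
# time, so that only one chunk needs to reside in GPU memory at once.
function process_in_splits(f, X::AbstractMatrix, nsplits::Int)
    n = size(X, 2)
    out = similar(X)
    # Compute column ranges that partition 1:n into nsplits pieces
    ranges = [round(Int, (i - 1) * n / nsplits) + 1 : round(Int, i * n / nsplits)
              for i in 1:nsplits]
    for r in ranges
        chunk = X[:, r]       # in real use, copy this chunk to the GPU
        out[:, r] = f(chunk)  # run the GPU computation, copy the result back
    end
    return out
end
```

With more splits, each chunk is smaller, so the peak device allocation shrinks at the cost of more host-device transfers.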
Cool. Setting it as follows seems to work for 8GB.
I'm glad it's working. Was this the reason behind issue #38?
I restarted Julia and wasn't able to reproduce #38. It turns out that fixing the partition size doesn't solve the issue: calling `GC.gc()` doesn't free the underlying CUDA memory. Any clues?
Yes, this is definitely an open issue. The Julia GC is a bit of a black box to me, so I never really figured out how to fix this (other than using a larger GPU, which happens to have enough memory for GC to kick in just in time...). I know this is less than ideal. It might be worth trying out calling CuArray's […]. But I'm sorry I can't provide a better fix.
Related: JuliaGPU/CuArrays.jl#275
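A hedged workaround sketch, assuming the CuArrays.jl API of that era (`CuArrays.reclaim` is an assumption here, not something from Rayuela.jl): drop all references to the device array, run the Julia GC so the array's finalizer fires, and then ask CuArrays to return its cached device memory to CUDA. This requires a CUDA-capable GPU to run.

```julia
using CuArrays  # assumption: CuArrays.jl is installed and a GPU is available

A = CuArray(rand(Float32, 10_000, 1_000))
# ... use A on the GPU ...
A = nothing           # drop the last reference to the device array
GC.gc()               # let Julia run the CuArray's finalizer
CuArrays.reclaim()    # assumption: returns pooled device memory to CUDA
```

The key point is that `GC.gc()` alone only frees arrays with no remaining references, and the memory pool may still hold the freed blocks, so an explicit reclaim step can be needed before a large new allocation succeeds.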
Sorry to bother you again: what is the minimum memory requirement for the GPU?