
improper GPU utilization #11

Open
ibrahimrazu opened this issue Jul 2, 2019 · 10 comments

@ibrahimrazu

Thanks a lot for sharing your code. While training on the NYU dataset, it seems the model is not utilizing the GPU properly, and training time is significantly higher than expected. Could you please tell me how you configured the GPU? I tried to configure it as follows.
Note that I've checked that the GPUs are working, but their utilization is very low, almost minimal.

[screenshots: GPU configuration and utilization]
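
For reference, a typical TensorFlow 1.x GPU session config looks roughly like this. This is a sketch only; the exact code in the screenshots above is not recoverable, and the repo's actual session setup may differ:

```python
import tensorflow as tf  # assumes tensorflow-gpu 1.x

# Let TensorFlow grow GPU memory as needed instead of reserving all of it up front.
config = tf.ConfigProto()
config.gpu_options.allow_growth = True
# Log which device each op is placed on, to confirm ops actually land on the GPU.
config.log_device_placement = True

with tf.Session(config=config) as sess:
    # ... build the graph and run training here ...
    pass
```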

@dumyy
Owner

dumyy commented Jul 3, 2019

You are using the CPU to train the model, so it must be very slow!
I think there is something wrong with your GPU config.
You should make sure that you can use tensorflow-gpu with the right GPU setup (e.g. matching CUDA and cuDNN versions).
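
A quick way to confirm that tensorflow-gpu actually sees a GPU (a minimal check, assuming TF 1.x):

```python
import tensorflow as tf
from tensorflow.python.client import device_lib

# Prints True if a CUDA-capable GPU is visible to TensorFlow.
print(tf.test.is_gpu_available())

# Lists all devices TensorFlow can use; look for entries like '/device:GPU:0'.
print(device_lib.list_local_devices())
```

If only CPU devices show up, the CUDA/cuDNN install or the tensorflow-gpu package is the likely culprit.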

@dumyy
Owner

dumyy commented Jul 3, 2019

The 316 MB shown is the amount of GPU memory that is still free.

@ibrahimrazu
Author

Thanks a lot for your advice. Yes, it's working fine now. I have installed tensorflow-gpu in my conda environment. However, only one GPU out of all of them is being utilized. I tried several combinations, but failed to distribute the task across multiple GPUs.
[screenshots: GPU utilization across devices]

@dumyy
Owner

dumyy commented Jul 4, 2019

Sorry, I have never used a multi-GPU config to train the model, so I'm afraid I can't help you directly.

@ibrahimrazu
Author

No problem, thanks a lot! I'll post here if I manage to do it.

@eparvizi

@ibrahimrazu did you figure out how to enable multi-GPU processing?
@dumyy my GPU utilization is low, ranging between 15% and 30%. How would you parallelize and separate the CPU/GPU tasks to maximize GPU utilization?
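
One common cause of low utilization is the GPU sitting idle while the CPU prepares the next batch. A sketch of overlapping CPU-side preprocessing with GPU compute via tf.data prefetching, assuming a TF 1.x pipeline (load_sample and the "train/*.png" glob are hypothetical, not from this repo):

```python
import tensorflow as tf

def load_sample(path):
    # Hypothetical CPU-side decode/resize step.
    image = tf.image.decode_png(tf.read_file(path), channels=1)
    return tf.image.resize_images(image, [128, 128])

paths = tf.data.Dataset.list_files("train/*.png")
dataset = (paths
           .map(load_sample, num_parallel_calls=4)  # parallel CPU preprocessing
           .batch(32)
           .prefetch(1))  # prepare the next batch while the GPU trains on the current one

iterator = dataset.make_one_shot_iterator()
next_batch = iterator.get_next()
```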

@ibrahimrazu
Author

@eparvizi not yet, unfortunately. However, my single-GPU utilization is consistently high now. I tried CUDA_VISIBLE_DEVICES=0,1,2,3, but it always uses only the first GPU listed.
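
That behavior is expected: CUDA_VISIBLE_DEVICES only controls which GPUs are visible, while TF 1.x still places ops on /gpu:0 by default. Splitting work across GPUs needs explicit device placement, roughly like the tower pattern below (a sketch, not code from this repo; tower_loss is a hypothetical stand-in for building one model replica):

```python
import tensorflow as tf

def tower_loss(gpu_index, inputs, labels):
    # Hypothetical stand-in for one replica of the real model;
    # variables are shared across towers via reuse.
    logits = tf.layers.dense(inputs, 10, name="model", reuse=(gpu_index > 0))
    return tf.losses.sparse_softmax_cross_entropy(labels, logits)

inputs = tf.placeholder(tf.float32, [None, 64])
labels = tf.placeholder(tf.int32, [None])

# Split the batch across the 4 visible GPUs and place one tower on each.
input_splits = tf.split(inputs, 4)
label_splits = tf.split(labels, 4)
losses = []
for i in range(4):
    with tf.device("/gpu:%d" % i):
        losses.append(tower_loss(i, input_splits[i], label_splits[i]))

# Average the per-GPU losses; gradients are averaged the same way in practice.
total_loss = tf.reduce_mean(tf.stack(losses))
```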

@eparvizi

@ibrahimrazu
Author

Still not. But I'll try. Thanks!

@mslqing

mslqing commented Apr 26, 2021

@ibrahimrazu Hello, how did you get the program to run on the GPU? I could not find the code shown in the image below anywhere in the source program.
[screenshot: code snippet]
