Augmented data up to 15 GB? #24
I generated the augmented data with the MATLAB script and the 291 training images, and the resulting train.h5 is about 15 GB.
In the paper the training procedure "takes roughly 4 hours on GPU Titan Z", but it takes far more than 4 hours on my device, a GeForce GTX 1080 Ti.
Is it because of the device, or because the augmented data is too big?
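For scale, here is a back-of-the-envelope estimate of how many training samples a file that size holds, assuming the usual VDSR setup of 41×41 single-channel float32 patches with one input and one label patch per sample, and no HDF5 compression (both assumptions, not confirmed from the script):

```python
# Rough size estimate for a VDSR-style train.h5 (sketch, assumptions above).
bytes_per_patch = 41 * 41 * 4            # 41x41 float32 luminance patch
bytes_per_sample = 2 * bytes_per_patch   # one input patch + one label patch

for gb in (15, 7):
    samples = gb * 1024**3 / bytes_per_sample
    print(f"{gb} GB ≈ {samples / 1e6:.1f} million samples")
# 15 GB ≈ 1.2 million samples; 7 GB ≈ 0.6 million samples
```

So a 15 GB file is plausible once the 291 images are multiplied out by flips, rotations, and multiple scales.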
Comments
Hi @YNX940214, the size you mentioned for the augmented data is correct, and it does take longer than 4 hours to train with this implementation. The main reason is that I didn't use on-the-fly augmentation during training, since I had to use MATLAB for the bicubic interpolation. Please feel free to modify the code : )
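If you would rather not precompute every augmented copy, one way to modify the code is to do the flip/rotation augmentation on the fly inside a PyTorch Dataset. A minimal sketch, assuming the input and label patches are already loaded as (N, 1, H, W) tensors; note this only covers flips and 90° rotations, not MATLAB's bicubic kernel, which is the part the author kept offline:

```python
import random
import torch
from torch.utils.data import Dataset

class AugmentedPatches(Dataset):
    """Applies random flips / 90-degree rotations at load time instead of
    storing every augmented copy on disk (hypothetical sketch)."""

    def __init__(self, inputs, labels):
        # inputs, labels: tensors of shape (N, 1, H, W), e.g. read from train.h5
        self.inputs, self.labels = inputs, labels

    def __len__(self):
        return len(self.inputs)

    def __getitem__(self, idx):
        x, y = self.inputs[idx], self.labels[idx]
        if random.random() < 0.5:            # horizontal flip
            x, y = torch.flip(x, [-1]), torch.flip(y, [-1])
        if random.random() < 0.5:            # vertical flip
            x, y = torch.flip(x, [-2]), torch.flip(y, [-2])
        k = random.randint(0, 3)             # 0/90/180/270-degree rotation
        x, y = torch.rot90(x, k, [-2, -1]), torch.rot90(y, k, [-2, -1])
        return x, y
```

With something like this, train.h5 only needs to hold the un-flipped, un-rotated patches, shrinking the file roughly by the flip/rotation factor.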
@YNX940214 Hi, may I ask how many hours it takes to train the model for 50 epochs? My GPU is a Titan X, and it seems slow as well. Thanks.
@MingSun-Tse
@YNX940214 Okay, never mind. Thank you still.
@MingSun-Tse It takes about 20 minutes per epoch on a GTX 1080 Ti.
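(At that rate, 50 epochs works out to roughly 50 × 20 min ≈ 16.7 hours, which matches training taking far more than the paper's 4 hours.)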
@twtygqyy Hello, I used your code to generate the augmented data with the MATLAB script and the 291 images, and my train.h5 is about 7 GB. I see that @YNX940214's train.h5 is about 15 GB. Could you tell me the reason? Thanks.
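One quick way to track down a size difference like this is to compare what the two files actually contain. A minimal sketch with h5py, assuming a flat file whose datasets are named "data" and "label" (adjust the names if yours differ):

```python
import h5py

def summarize(path):
    """Print shape, dtype, and uncompressed size of each dataset in an HDF5 file."""
    with h5py.File(path, "r") as f:
        for name, dset in f.items():   # assumes a flat file of datasets, no groups
            nbytes = dset.dtype.itemsize
            for dim in dset.shape:
                nbytes *= dim
            print(f"{path}/{name}: shape={dset.shape}, dtype={dset.dtype}, "
                  f"~{nbytes / 1024**3:.2f} GB uncompressed")

summarize("train.h5")  # run on both files and compare the sample counts
```

A 7 GB versus 15 GB gap would usually mean one run generated about twice as many patches, e.g. from different stride, scale, or flip settings in the MATLAB script.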