How much VRAM and time does training take?
Answered by scarbain (Feb 20, 2023):
According to their paper, they trained each adapter for 10 epochs with a batch size of 8 on at least 120K images.
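As a rough sense of scale (assuming exactly 120K images): 120,000 / 8 = 15,000 optimizer steps per epoch, so about 150,000 steps over the 10 epochs; a larger dataset would push the step count up proportionally.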
I've managed to run the training with a batch size of only 1 on 12GB of VRAM.
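If you want to keep the paper's effective batch size of 8 while only fitting a batch size of 1 in memory, gradient accumulation is one way to do it. Below is a minimal generic PyTorch sketch, not the repository's actual training script; `model`, `dataloader`, and `loss_fn` are placeholder names, and the dataloader is assumed to yield `(inputs, targets)` pairs with batch size 1.

```python
import torch

# Effective batch size target (8) split into micro-batches of 1.
ACCUM_STEPS = 8

def train_one_epoch(model: torch.nn.Module,
                    dataloader,            # assumed to yield (inputs, targets) with batch size 1
                    optimizer: torch.optim.Optimizer,
                    loss_fn,
                    device: str = "cuda"):
    model.train()
    optimizer.zero_grad()
    for step, (inputs, targets) in enumerate(dataloader):
        inputs, targets = inputs.to(device), targets.to(device)
        loss = loss_fn(model(inputs), targets)
        # Scale the loss so the accumulated gradient matches a single batch of 8.
        (loss / ACCUM_STEPS).backward()
        if (step + 1) % ACCUM_STEPS == 0:
            optimizer.step()
            optimizer.zero_grad()
    # Any leftover micro-batches at the end of the epoch are dropped here for brevity.
```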