
new dadaptation eats up a lot of vram #196

Closed
Nazzaroth2 opened this issue Feb 18, 2023 · 1 comment
Comments


Nazzaroth2 commented Feb 18, 2023

I just updated and noticed I can't use my normal batch sizes in finetuning anymore. As mentioned in #195, I am easily able to use batch sizes of 50+ on commit 6129c7d. After that I can't even use 35 before I get an OOM.

So my question is: what is dadaptation, and what is it good for?

EDIT:
Just saw in the repository that dadaptation is another Adam optimizer? Could it be that this new setting overrides the 8bit Adam option?
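For context, a minimal sketch of why the two optimizers differ in VRAM use, assuming D-Adaptation is selected *instead of* 8-bit Adam (how kohya_ss wires the selection is an assumption here; only the two optimizer classes themselves are real):

```python
# Sketch, not kohya_ss code: compare the optimizer-state footprint of
# dadaptation.DAdaptAdam and bitsandbytes.optim.AdamW8bit on a stand-in model.
import torch
import dadaptation            # pip install dadaptation
import bitsandbytes as bnb    # pip install bitsandbytes

model = torch.nn.Linear(4096, 4096).cuda()

# D-Adaptation's Adam variant keeps its moment estimates in full precision,
# so its optimizer state costs roughly as much VRAM as plain AdamW.
opt_dadapt = dadaptation.DAdaptAdam(model.parameters(), lr=1.0)

# 8-bit Adam quantizes those moments to int8, which is why it fits in much
# less VRAM for the same model and batch size.
opt_8bit = bnb.optim.AdamW8bit(model.parameters(), lr=1e-4)
```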

@ThisIsCyreX

I noticed it too. Since D-Adaptation was added, I can't train on 6GB anymore.
It would be nice to be able to disable it on low(er) VRAM.
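One way a trainer could do this is sketched below; `pick_optimizer` and the VRAM threshold are hypothetical and not part of kohya_ss, but `torch.cuda.mem_get_info` and the two optimizer classes are real:

```python
# Hypothetical helper (not in kohya_ss): fall back to 8-bit Adam when free
# VRAM is below a threshold instead of always using D-Adaptation.
import torch
import dadaptation
import bitsandbytes as bnb

def pick_optimizer(params, min_free_gb: float = 8.0):
    free_bytes, _total = torch.cuda.mem_get_info()
    if free_bytes / 1024**3 < min_free_gb:
        # Quantized optimizer state for low-VRAM cards.
        return bnb.optim.AdamW8bit(params, lr=1e-4)
    # Full-precision D-Adaptation Adam when there is headroom.
    return dadaptation.DAdaptAdam(params, lr=1.0)
```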

Cauldrath pushed a commit to Cauldrath/kohya_ss that referenced this issue Apr 5, 2023