-
Typical diffusion fine-tuning loss curves look like they are randomly fluctuating between 0 and 1, so you are probably seeing expected behaviour, and your DreamBooth LoRA has likely learned the concept you're trying to train. You could verify by running inference with the LoRA.
-
If your loss curve looks like this, it is expected behaviour. If it's fluctuating but not decreasing overall, there may be something wrong.
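One rough way to check the "decreasing overall" part is to smooth the raw per-step loss log with an exponential moving average before eyeballing the trend. This is a minimal sketch, not part of `train_dreambooth_lora.py`: the `ema_smooth` helper and the synthetic loss values are illustrative stand-ins for a real training log.

```python
import random


def ema_smooth(losses, alpha=0.05):
    """Return an EMA-smoothed copy of a loss series; alpha is the update weight."""
    smoothed = []
    avg = losses[0]
    for x in losses:
        avg = (1 - alpha) * avg + alpha * x
        smoothed.append(avg)
    return smoothed


random.seed(0)
# Synthetic diffusion-style loss: noisy values bouncing roughly between 0 and 1,
# with a slowly decaying underlying trend hidden inside the noise.
steps = 500
raw = [max(0.0, 0.6 * (0.995 ** t) + random.uniform(-0.2, 0.4)) for t in range(steps)]

smoothed = ema_smooth(raw)
# Compare early vs. late smoothed averages: a clear drop suggests the model is
# learning even though individual raw values still fluctuate wildly.
early = sum(smoothed[:50]) / 50
late = sum(smoothed[-50:]) / 50
print(f"early EMA mean: {early:.3f}, late EMA mean: {late:.3f}")
```

If the smoothed curve is flat or rising over many hundreds of steps, that is when it is worth double-checking the learning rate, dataset, or script arguments.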
-
When I trained DreamBooth using train_dreambooth_lora.py, the loss kept fluctuating between 0.01 and 1.0. The setup I used was almost the same as in the README.md: