Tried it, and some people are getting a different error than I am, but the train_lora_XL.py script is broken. My error is that the optimizer's parameter list is empty.
`ValueError("optimizer got an empty parameter list")`
Anyone attempting to use the SDXL script gets that error; the regular LoRA script works.
I'm running into this while training with SD v1.5 and v2.1 as well, using the latest code from main.
Looking into the error a little bit more, it looks like the list of parameters at https://github.com/p1atdev/LECO/blob/main/train_lora.py#L89 is empty. `network.prepare_optimizer_params()` is returning an empty list because the list of module types in https://github.com/p1atdev/LECO/blob/main/lora.py#L190 filters everything out.
I'm not sure why those modules are all being filtered, but that leaves an empty LoRA, which the optimizer doesn't like very much.
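The failure mode described above can be sketched in a few lines. This is a minimal illustration, not the actual LECO code: the module names and the `TARGET_CLASSES` filter are assumptions standing in for the module-type check in lora.py. It shows how an over-strict class-name filter leaves nothing for the optimizer to train, which is exactly what triggers torch's "optimizer got an empty parameter list" error.

```python
# Hypothetical sketch of the module-type filtering (names are illustrative,
# not the real LECO identifiers).

# Suppose the network contains only these module types:
modules = [
    ("Linear", ["weight", "bias"]),  # (class name, its parameters)
    ("Conv2d", ["weight"]),
]

# If the accepted-type list matches none of the module classes,
# every module is filtered out:
TARGET_CLASSES = {"LoRAModule"}  # assumed filter; too strict here

params = [p for cls, ps in modules if cls in TARGET_CLASSES for p in ps]

# An empty `params` list is what makes torch.optim raise
# ValueError("optimizer got an empty parameter list").
if not params:
    print("prepare_optimizer_params would return an empty list")
```

A guard like the final check, placed before the optimizer is constructed, would at least turn the opaque ValueError into a message pointing at the filter.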