SDXL training doesn't work. #34

Open
DarkAlchy opened this issue Nov 20, 2023 · 2 comments
Comments

@DarkAlchy

I tried it, and while some people are getting a different error than I am, the train_lora_XL.py script is broken. In my case, the optimizer's parameter list is empty.

DarkAlchy reopened this Nov 26, 2023
@DarkAlchy
(Author)

ValueError("optimizer got an empty parameter list")

Anyone attempting to use the SDXL script gets that error, while the regular LoRA script works.
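
For context, PyTorch's base Optimizer raises exactly this error whenever it is constructed with no parameters, so any empty list reaching the optimizer reproduces it. A minimal sketch using stock PyTorch (not LECO code):

```python
import torch

try:
    # Any torch optimizer constructed with an empty parameter list fails this way.
    torch.optim.AdamW([], lr=1e-4)
except ValueError as e:
    print(e)  # optimizer got an empty parameter list
```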

@ssube

ssube commented Dec 14, 2023

I'm running into this while training with SD v1.5 and v2.1 as well, using the latest code from main.

Looking into the error a little bit more, it looks like the list of parameters at https://github.com/p1atdev/LECO/blob/main/train_lora.py#L89 is empty. network.prepare_optimizer_params() is returning an empty list because the list of module types in https://github.com/p1atdev/LECO/blob/main/lora.py#L190 filters everything out.

I'm not sure why those modules are all being filtered, but that leaves an empty LoRA, which the optimizer doesn't like very much.
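
In other words, if the module-type filter matches nothing in the loaded UNet (for example because a diffusers update renamed the block classes), the LoRA network ends up with no trainable parameters at all. Below is a rough sketch of that filtering pattern, not the actual lora.py code; `TARGET_MODULE_CLASSES` and `collect_lora_targets` are hypothetical names:

```python
import torch.nn as nn

# Hypothetical stand-in for the module-type list around lora.py#L190.
TARGET_MODULE_CLASSES = ["Attention", "Transformer2DModel"]

def collect_lora_targets(unet: nn.Module) -> list[nn.Module]:
    """Collect submodules whose class name is in the target list."""
    targets = []
    for _, module in unet.named_modules():
        if module.__class__.__name__ in TARGET_MODULE_CLASSES:
            targets.append(module)
    return targets

# If no class names match, this returns [], and the downstream
# prepare_optimizer_params() has nothing to hand to the optimizer,
# which produces the ValueError above.
```

A quick way to check would be to print the length of the collected module list (or of the list returned by network.prepare_optimizer_params()) before the optimizer is constructed.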
