[bug]: v5.0.1 appears to have broken Flux LoRAs that worked in v5.0.0 #6996
Comments
Note: I've not tested this with the full-size Flux models, only the quantized versions.
We did regression testing before the latest update, and the LoRAs we tested on v5.0.0 worked on v5.0.1, so we'll need specific examples to troubleshoot. Can you please link to a specific LoRA that doesn't work any more?
Sorry, I can't share the ones I was specifically finding this issue with, as they are based on personal photos. They were trained using the AI Toolkit Gradio UI for training Flux LoRAs (https://github.com/ostris/ai-toolkit?tab=readme-ov-file#gradio-ui), using this RunPod template: https://www.runpod.io/console/explore/unbq8sxjpe I'm not sure if this helps you narrow down the issue. Regardless, I'll try to find a publicly available LoRA that is affected, or train a new one that I can share. I can confirm that switching back to v5.0.0 does resolve this issue for me, so there is definitely a problem for certain LoRA types.
The following uses a LoRA designed to emulate the style of a particular sci-fi TV show, which I trained using AI Toolkit. It's not a great style LoRA, but it demonstrates the issue. The following images use the settings:
In each of the following sets of images, the LoRA is enabled on the left and disabled on the right. v5.0.0 v5.0.1 Note that both images look identical in v5.0.1, indicating that the LoRA is having no effect.
Is there an existing issue for this problem?
Operating system
Linux
GPU vendor
Nvidia (CUDA)
GPU model
NVIDIA GeForce RTX 3060
GPU VRAM
12GB
Version number
5.0.1
Browser
Brave Version 1.70.117 Chromium: 129.0.6668.59 (Official Build) (64-bit)
Python dependencies
No response
What happened
Although v5.0.1 appears to have fixed support for Kohya-created Flux LoRAs, other Flux LoRA types (such as those created by AI Toolkit) no longer have any effect on the generated images.
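One plausible explanation for a regression affecting only some LoRA sources is key naming: Kohya-style Flux LoRA checkpoints and diffusers-style checkpoints (the kind AI Toolkit emits) use different state-dict key conventions, so a loader change can break one format while fixing the other. The sketch below is a hypothetical diagnostic, not InvokeAI's actual detection logic; the key prefixes are illustrative assumptions.

```python
# Hypothetical sketch for classifying a Flux LoRA checkpoint by its key names.
# Assumption: Kohya-style files prefix keys with "lora_unet_" or
# "lora_transformer_", while diffusers-style files (e.g. from AI Toolkit)
# use dotted module paths containing ".lora_A." / ".lora_B.".

def guess_lora_format(keys):
    """Return a rough guess at a Flux LoRA checkpoint's format from its keys."""
    if any(k.startswith(("lora_unet_", "lora_transformer_")) for k in keys):
        return "kohya"
    if any(".lora_A." in k or ".lora_B." in k for k in keys):
        return "diffusers"  # the style AI Toolkit saves
    return "unknown"

# With the safetensors library, keys can be listed without loading tensors:
# from safetensors import safe_open
# with safe_open("my_flux_lora.safetensors", framework="pt") as f:
#     print(guess_lora_format(f.keys()))
```

Running this against a working Kohya LoRA and a broken AI Toolkit LoRA would confirm whether the two files really differ in key convention, which would help narrow the regression to format-specific loading code.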
What you expected to happen
I expect previously working Flux LoRA types to affect the images in the same way they did in v5.0.0.
How to reproduce the problem
Start InvokeAI and attempt to use a Flux LoRA that was not created by Kohya in linear image generation (using Flux dev or schnell, quantized versions). Notice that the output image does not appear to be affected by the LoRA.
Additional context
No response
Discord username
No response