Wow, I am not sure what is wrong with GitHub: some of my tickets, including this one, are not emailing me to tell me I had a response. The LoRA is not online, as it is one I had just trained using Kohya.
D:\resize_lora>python resize_lora.py F:/stable-diffusion-webui/models/Stable-diffusion/sd_xl_base_1.0.safetensors F:/stable-diffusion-webui/models/Lora/123_XL_V1.safetensors -o .\ -v -r fro_ckpt=1,thr=-2.0
INFO:root:Processing LoRA model: F:/stable-diffusion-webui/models/Lora/Claymation_XL_V1.safetensors
INFO:root:No LoRA layer for 'model.diffusion_model.input_blocks.0.0.weight' (320, 4, 3, 3), expected LoRA key: 'lora_unet_input_blocks_0_0'
INFO:root:No LoRA layer for 'model.diffusion_model.input_blocks.1.0.emb_layers.1.weight' (320, 1280), expected LoRA key: 'lora_unet_input_blocks_1_0_emb_layers_1'
INFO:root:No LoRA layer for 'model.diffusion_model.input_blocks.1.0.in_layers.2.weight' (320, 320, 3, 3), expected LoRA key: 'lora_unet_input_blocks_1_0_in_layers_2'
INFO:root:No LoRA layer for 'model.diffusion_model.input_blocks.1.0.out_layers.3.weight' (320, 320, 3, 3), expected LoRA key: 'lora_unet_input_blocks_1_0_out_layers_3'
INFO:root:No LoRA layer for 'model.diffusion_model.input_blocks.2.0.emb_layers.1.weight' (320, 1280), expected LoRA key: 'lora_unet_input_blocks_2_0_emb_layers_1'
INFO:root:No LoRA layer for 'model.diffusion_model.input_blocks.2.0.in_layers.2.weight' (320, 320, 3, 3), expected LoRA key: 'lora_unet_input_blocks_2_0_in_layers_2'
INFO:root:No LoRA layer for 'model.diffusion_model.input_blocks.2.0.out_layers.3.weight' (320, 320, 3, 3), expected LoRA key: 'lora_unet_input_blocks_2_0_out_layers_3'
INFO:root:No LoRA layer for 'model.diffusion_model.input_blocks.3.0.op.weight' (320, 320, 3, 3), expected LoRA key: 'lora_unet_input_blocks_3_0_op'
INFO:root:No LoRA layer for 'model.diffusion_model.label_emb.0.0.weight' (1280, 2816), expected LoRA key: 'lora_unet_label_emb_0_0'
INFO:root:No LoRA layer for 'model.diffusion_model.label_emb.0.2.weight' (1280, 1280), expected LoRA key: 'lora_unet_label_emb_0_2'
INFO:root:No LoRA layer for 'model.diffusion_model.out.2.weight' (4, 320, 3, 3), expected LoRA key: 'lora_unet_out_2'
INFO:root:No LoRA layer for 'model.diffusion_model.output_blocks.6.0.emb_layers.1.weight' (320, 1280), expected LoRA key: 'lora_unet_output_blocks_6_0_emb_layers_1'
INFO:root:No LoRA layer for 'model.diffusion_model.output_blocks.6.0.in_layers.2.weight' (320, 960, 3, 3), expected LoRA key: 'lora_unet_output_blocks_6_0_in_layers_2'
INFO:root:No LoRA layer for 'model.diffusion_model.output_blocks.6.0.out_layers.3.weight' (320, 320, 3, 3), expected LoRA key: 'lora_unet_output_blocks_6_0_out_layers_3'
INFO:root:No LoRA layer for 'model.diffusion_model.output_blocks.6.0.skip_connection.weight' (320, 960, 1, 1), expected LoRA key: 'lora_unet_output_blocks_6_0_skip_connection'
INFO:root:No LoRA layer for 'model.diffusion_model.output_blocks.7.0.emb_layers.1.weight' (320, 1280), expected LoRA key: 'lora_unet_output_blocks_7_0_emb_layers_1'
INFO:root:No LoRA layer for 'model.diffusion_model.output_blocks.7.0.in_layers.2.weight' (320, 640, 3, 3), expected LoRA key: 'lora_unet_output_blocks_7_0_in_layers_2'
INFO:root:No LoRA layer for 'model.diffusion_model.output_blocks.7.0.out_layers.3.weight' (320, 320, 3, 3), expected LoRA key: 'lora_unet_output_blocks_7_0_out_layers_3'
INFO:root:No LoRA layer for 'model.diffusion_model.output_blocks.7.0.skip_connection.weight' (320, 640, 1, 1), expected LoRA key: 'lora_unet_output_blocks_7_0_skip_connection'
INFO:root:No LoRA layer for 'model.diffusion_model.output_blocks.8.0.emb_layers.1.weight' (320, 1280), expected LoRA key: 'lora_unet_output_blocks_8_0_emb_layers_1'
INFO:root:No LoRA layer for 'model.diffusion_model.output_blocks.8.0.in_layers.2.weight' (320, 640, 3, 3), expected LoRA key: 'lora_unet_output_blocks_8_0_in_layers_2'
INFO:root:No LoRA layer for 'model.diffusion_model.output_blocks.8.0.out_layers.3.weight' (320, 320, 3, 3), expected LoRA key: 'lora_unet_output_blocks_8_0_out_layers_3'
INFO:root:No LoRA layer for 'model.diffusion_model.output_blocks.8.0.skip_connection.weight' (320, 640, 1, 1), expected LoRA key: 'lora_unet_output_blocks_8_0_skip_connection'
INFO:root:No LoRA layer for 'model.diffusion_model.time_embed.0.weight' (1280, 320), expected LoRA key: 'lora_unet_time_embed_0'
INFO:root:No LoRA layer for 'model.diffusion_model.time_embed.2.weight' (1280, 1280), expected LoRA key: 'lora_unet_time_embed_2'
Traceback (most recent call last):
  File "D:\resize_lora\resize_lora.py", line 314, in <module>
    main()
  File "D:\resize_lora\resize_lora.py", line 301, in main
    paired = PairedLoraModel(lora_model_path, checkpoint)
  File "D:\resize_lora\loralib\__init__.py", line 120, in __init__
    raise ValueError(f"Target layer not found for LoRA {lora_layer_keys}")
ValueError: Target layer not found for LoRA lora_unet_input_blocks_1_0_emb_layers_1.diff
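One detail in the traceback may be relevant: the failing key ends in `.diff` rather than the `lora_down`/`lora_up` suffixes a conventional LoRA uses, which is the suffix LyCORIS-style "full diff" modules store. If resize_lora only pairs low-rank `lora_down`/`lora_up` tensors with checkpoint layers, a `.diff` tensor would have no matching target and trip exactly this ValueError. A minimal sketch of that distinction (the helper name and category labels are mine, not part of resize_lora):

```python
def classify_lora_key(key: str) -> str:
    """Guess what kind of weight a LoRA tensor key's suffix encodes.

    Assumption (not from resize_lora itself): low-rank adapters store
    'lora_down'/'lora_up' pairs, while full weight differences are
    stored under a '.diff' suffix.
    """
    # Split off everything after the first dot, e.g.
    # 'lora_unet_out_2.lora_down.weight' -> suffix 'lora_down.weight'
    _base, _, suffix = key.partition(".")
    if suffix.startswith("lora_down") or suffix.startswith("lora_up"):
        return "low-rank pair"   # what a plain-LoRA resizer expects
    if suffix.startswith("diff"):
        return "full diff"       # no low-rank pair to match against
    return "other"

# The key from the traceback classifies as a full diff:
print(classify_lora_key("lora_unet_input_blocks_1_0_emb_layers_1.diff"))
```

If the file does contain `.diff` keys, that would point at the training config (e.g. a LyCORIS full/diff network module in Kohya) rather than at resize_lora misreading the checkpoint.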