Releases · bmaltais/kohya_ss
v21.3.9
- 2023/04/01 (v21.3.9)
- Update how setup is done on Windows by introducing a `setup.bat` script. This will make it easier to install/re-install on Windows if needed. Many thanks to @missionfloyd for his PR: #496
- Fix issue with WD14 caption script by applying a custom fix to kohya_ss code.
v21.3.8
v21.3.7
v21.3.6
- 2023/03/28 (v21.3.6)
- Fix issues when `--persistent_data_loader_workers` is specified:
  - The batch members of the bucket are not shuffled.
  - `--caption_dropout_every_n_epochs` does not work.
  - These issues occurred because the epoch transition was not recognized correctly. Thanks to u-haru for reporting the issue.
- Fix an issue where images are loaded twice in the Windows environment.
- Add the Min-SNR Weighting strategy. Details are in #308. Thank you to AI-Casanova for this great work!
  - Add the `--min_snr_gamma` option to the training scripts; 5 is the value recommended by the paper (see the sketch below).
  - The Min SNR gamma field can be found under the Advanced Training tab in all trainers.
- Fixed an error when image files end with upper-case image extensions. Thanks to @kvzn. #454
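A hedged sketch of enabling Min-SNR weighting from the command line; the trainer script and the other required options are omitted, and per the paper the per-timestep loss is scaled by min(SNR(t), gamma)/SNR(t):

```bash
# Illustrative: enable Min-SNR weighting with the paper's recommended gamma.
# Other required training options are omitted for brevity.
accelerate launch train_network.py --min_snr_gamma 5
```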
v21.3.5
v21.3.4
- 2023/03/25 (v21.3.4)
- Added untested support for macOS based on this gist: https://gist.github.com/jstayco/9f5733f05b9dc29de95c4056a023d645
Let me know how this works. From the look of it, it appears to be well thought out. I modified a few things to make it fit better with the rest of the code in the repo.
- Fix for issue #433 by implementing a default of 0.
- Removed non-applicable `save_model_as` choices for LoRA and TI.
v21.3.3
- 2023/03/24 (v21.3.3)
- Add support for custom user GUI files. They will be created at installation time, or when upgrading if missing. You will see two files in the root of the folder: one named `.\gui-user.bat` and the other `.\gui-user.ps1`. Edit the file matching your preferred terminal: simply add the parameters you want to pass to the GUI in there and execute it to start the GUI with them (a sketch follows below). Enjoy!
  To get a full list of parameters, run `.\gui.bat -h` or `.\gui.ps1 -h`.
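A minimal sketch of a customized `gui-user.bat`; the flags shown are illustrative assumptions, so check `.\gui.bat -h` for the parameters your install actually supports:

```bat
:: gui-user.bat -- example only; substitute the parameters you want to pass.
.\gui.bat --inbrowser --server_port 7860
```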
- 2023/03/23 (v21.3.2)
- Fix issue reported: #439
v21.3.1
v21.3.0
- 2023/03/22 (v21.3.0)
- Add a function to load training config with `.toml` to each training script. Thanks to Linaqruf for this great contribution! (see the sketch after this list)
  - Specify the `.toml` file with `--config_file`. The `.toml` file has `key=value` entries; keys are the same as the command line options. See #241 for details.
  - All sub-sections are combined into a single dictionary (the section names are ignored).
  - Omitted arguments take the default values of the command line arguments.
  - Command line args override the arguments in the `.toml`.
  - With the `--output_config` option, you can output the current command line options to the `.toml` specified with `--config_file`. Please use this as a template.
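A minimal sketch of the workflow; the file name and values are illustrative, though the keys mirror real options of the training scripts:

```bash
# Illustrative config; key names mirror the script's command-line options.
cat > train_config.toml <<'EOF'
pretrained_model_name_or_path = "./model.safetensors"
train_data_dir = "./train_images"
output_dir = "./output"
max_train_epochs = 10
learning_rate = 1e-4
EOF

# Train with the config; explicit command-line args still override it:
python train_network.py --config_file train_config.toml --learning_rate 5e-5

# Or write the current command-line options out to the .toml as a template:
python train_network.py --config_file train_config.toml --output_config
```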
- Add `--lr_scheduler_type` and `--lr_scheduler_args` arguments for custom LR schedulers to each training script. Thanks to Isotr0py! #271
  - These work the same way as the custom optimizer arguments (see the example below).
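A hedged sketch, assuming the scheduler is selected by class name (torch's `CosineAnnealingLR` here) with its constructor arguments passed as `key=value` pairs:

```bash
# Illustrative: pick a custom scheduler by class name and pass its
# constructor arguments as key=value pairs, mirroring --optimizer_args.
python train_network.py --config_file train_config.toml \
  --lr_scheduler_type "CosineAnnealingLR" \
  --lr_scheduler_args "T_max=100"
```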
- Add sample image generation with prompt weighting and no length limit. Thanks to mio2333! #288
  - `( )`, `(xxxx:1.2)` and `[ ]` can be used.
- Fix an exception when training a model in diffusers format with `train_network.py`. Thanks to orenwang! #290
- Add a warning if you are about to overwrite an existing model: #404
- Add `--vae_batch_size` for faster latents caching to each training script. This batches the VAE calls (see the example below).
  - Please start with 2 or 4 depending on the size of your VRAM.
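A minimal illustration, assuming latents caching is enabled via `--cache_latents` (other options omitted):

```bash
# Illustrative: cache latents with batched VAE encoding.
# Start with --vae_batch_size 2 or 4 and raise it if VRAM allows.
python train_network.py --config_file train_config.toml \
  --cache_latents --vae_batch_size 4
```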
- Fix the number of training steps computed when `--gradient_accumulation_steps` and `--max_train_epochs` are used together. Thanks to tsukimiya!
- Extract parser setup to external scripts. Thanks to robertsmieja!
- Fix an issue that occurred when training without `.npz` files and with `--full_path`.
- Support upper-case file extensions for images in non-Windows environments.
- Fix `resize_lora.py` to work with LoRA with dynamic rank (including `conv_dim != network_dim`). Thanks to toshiaki!
- Fix issue: #406
- Add device support to LoRA extract.