Type | Workflows |
Stats | 229 |
Reviews | (37) |
Published | Sep 11, 2024 |
Base Model | |
Hash | AutoV2 C2C4C551DD |
A simple and quick LoRA trainer setup. I got 1 it/s after 5 minutes of training at 512px, batch size 1, so it's pretty fast.
You can use flux1_devFP8Kijai11GB.safetensors as well as the regular flux1-dev.safetensors model.
flux1-dev is slightly faster but takes a bit more time to load. VRAM usage is the same either way, around 16.3 GB; setting HighVram to false doesn't change anything on this front.
Setting "Split Mode" to "true" caps usage at a maximum of 41% of 24 GB, so around 10 GB of VRAM.
Use it only if you have less than 16 GB. If you have 16 GB, try "false" first and see whether you get an OOM. With "true", training lasted 110 minutes instead of 20 in this particular example.
600 steps at 512 resolution took me 15 minutes to train.
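As a quick sanity check on the numbers above, here is a back-of-envelope sketch (illustrative only; real speed and VRAM depend on your GPU, the model variant, and resolution, and the `steps` and `its_per_sec` values are just the ones quoted in these notes):

```python
# Rough estimates from the figures quoted in the notes above.
steps = 600          # training steps in the example run
its_per_sec = 1.0    # observed speed at 512px, batch size 1

pure_training_min = steps / its_per_sec / 60
print(f"~{pure_training_min:.0f} min of pure training")  # ~10 min; ~15 min wall clock once model loading is included

# "Split Mode" reportedly caps usage at ~41% of a 24 GB card:
split_mode_vram_gb = 0.41 * 24
print(f"~{split_mode_vram_gb:.1f} GB VRAM with Split Mode on")  # ~9.8 GB, i.e. the "around 10GB" figure
```

The gap between the ~10 minutes of pure stepping and the ~15 minutes observed wall-clock time is consistent with the model-loading overhead mentioned above.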