Is a 2080TI 12GB enough to train SDXL LoRAs?
Title. I've tried with Kohya in the past, but DAMN, that eats my entire PC. Are there optimizations or a different trainer webUI I can use? Thanks in advance.
2 Answers
Yes, you can. While I have 24GB of VRAM, I've had kohya_ss settings that used only about 11GB. You need to adjust the settings: batch size matters most (keep it at 1), and different optimizers will use more or less memory. I don't have an exact guide, and I know good, consistent information on training LoRAs is incredibly hard to find. There are a lot of bad guides and bad info out there; in my experience, people generally have no idea what they're doing, but their guides sound very authoritative.
So, I won't tell you that I'm an expert because I'm not. However, I can say that it is possible to train LoRAs with your GPU. I'll see if I can dig up some settings here for you to try.
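In the meantime, here's a rough starting point. Assumptions up front: this is the command-line form of kohya's sd-scripts (which is what the kohya_ss GUI drives under the hood), the flag names are from memory, and the paths and values are illustrative placeholders rather than a tested 11GB recipe, so verify everything against `python sdxl_train_network.py --help` in your install. The big memory levers are batch size 1, gradient checkpointing, an 8-bit optimizer, cached latents, and memory-efficient attention:

```
# Hedged sketch of a low-VRAM SDXL LoRA run with kohya sd-scripts.
# Paths, step count, and learning rate are placeholders; double-check
# the flags against your installed version before relying on them.
accelerate launch sdxl_train_network.py \
  --pretrained_model_name_or_path="/path/to/sd_xl_base_1.0.safetensors" \
  --train_data_dir="/path/to/dataset" \
  --output_dir="/path/to/output" \
  --resolution="1024,1024" \
  --network_module=networks.lora \
  --network_dim=16 \
  --network_alpha=8 \
  --train_batch_size=1 \
  --gradient_checkpointing \
  --mixed_precision="fp16" \
  --optimizer_type="AdamW8bit" \
  --cache_latents \
  --xformers \
  --learning_rate=1e-4 \
  --max_train_steps=2000 \
  --save_model_as=safetensors
```

If that still runs out of memory, dropping --resolution to "768,768" or lowering --network_dim are the next knobs I'd try, at some cost to quality.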
For further illustration, this guide covers training on a GPU with only 8GB of VRAM: https://rentry.org/lora-training-science
I tried on a 3080 Ti (12GB) with the various optimizer options and still had plenty of problems, out-of-memory errors included. I'm also waiting for a good guide on how to train without 24GB.