Type | 
Stats | 953 · 30,902 · 3k
Reviews | 183
Published | Feb 4, 2024
Base Model | 
Training | Steps: 4,000 · Epochs: 6
Usage Tips | Clip Skip: 2 · Strength: 0.8
Trigger Words | tekuult
Hash | AutoV2 8D02EA3042
Shoutout to Tekuho, who inspired this LoRA! Please check out his awesome work and shower him with support!
https://www.patreon.com/user?u=6714576
Introducing Teku 4 PonyXL: a style LoRA for PonyXL (v6) that attempts to emulate the style of Tekuho, similar to my 1.5 model, with varying degrees of success.
Simply do the normal "score_9" Pony shenanigans and add the LoRA to your prompt (weight 0.8-1); optionally add "tekuult" as well. For stability, add "source_furry" and "source_pony" to the negative prompt. "source_anime" is not needed and will in all likelihood make gens worse. I've also found that certain negative TI embeds can skew the style by quite a bit.
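The recipe above can be sketched as a small snippet that assembles the prompt strings, using the A1111/WebUI `<lora:name:weight>` syntax. The filename `teku4_ponyxl` and the subject tags are placeholders; substitute whatever you named the file and whatever you actually want to generate:

```python
# Hypothetical filename for this LoRA; use the name you saved it under.
lora_file = "teku4_ponyxl"
lora_weight = 0.8  # recommended range: 0.8-1

positive = ", ".join([
    "score_9, score_8_up, score_7_up",   # the usual Pony quality tags
    f"<lora:{lora_file}:{lora_weight}>", # attach the LoRA at the chosen weight
    "tekuult",                           # optional trigger word
    "1girl, looking at viewer",          # placeholder subject tags
])

negative = ", ".join([
    "source_furry",  # per the notes above, these help stability
    "source_pony",
])

print(positive)
print(negative)
```

Note that "source_anime" is deliberately absent from the negatives, per the caveat above.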
Personally, I think the style is a little off, but I'm not sure whether that's the fault of the LoRA itself or of PonyXL specifically. The jury's still out on this one. When it works, it works, but results can also differ greatly depending on the seed; since Pony does this natively, I suspect that's the cause. There's probably a combination of artist tags that would stabilize it, but I'm still experimenting with Pony. If someone finds a nice mix of artist tags that stabilizes the style, please put it in the comments, and I'll add it here as well!
In my experimentation, it also seems to apply itself successfully to subjects outside the training data, such as furries and the like, which is pretty neat. Obviously, you'll have to add/remove the relevant "source" tags in the positives/negatives accordingly.
The dataset is exactly the same as for the 1.5 v3 LoRA, and I've done what I can to viably train this LoRA on XL at 1024 resolution. I'm not entirely sure of the proper procedure for training XL models, nor am I totally sure what works with Pony, but I've done what I can.
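For reference, the training settings listed in the metadata (4,000 steps, 6 epochs, 1024 resolution) would map onto a kohya-ss `sd-scripts` invocation roughly like this. This is a sketch only: all paths are placeholders, and these are not necessarily the exact flags or values used to train this LoRA:

```shell
# Hedged sketch of an SDXL LoRA training run with kohya-ss sd-scripts.
# Paths and dataset layout are placeholders, not the actual training setup.
accelerate launch sdxl_train_network.py \
  --pretrained_model_name_or_path=/path/to/ponyDiffusionV6XL.safetensors \
  --train_data_dir=/path/to/dataset \
  --output_name=teku4_ponyxl \
  --network_module=networks.lora \
  --resolution=1024,1024 \
  --max_train_steps=4000
```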
I'm not sure if I'll train another version of this, as I'd like to explore different subjects for a bit, but we'll see.