As requested by @denrakeiw
Here is an XYZ plot for
🚗 We Are Electric ⚡
LoRAs:
LoRA-FA <lora:FF-We-Are-Electric-XL-Cr2-v0564-FA:1>
Module: networks.lora_fa
Dim: 64
Alpha: 64.0
LyCORIS - preset FULL <lora:WeAreElectric-FFai-LycoLora:1>
Module: lycoris.kohya
Resolution: 1024x1024
Preset: full
Algo: lora
Dropout: 0.0
LyCORIS LoKr - preset FULL, with SDXL v-prediction <lora:WeAreElectric-FFai-LoKr-vpred:1>
Preset: full
Algo: lokr
Factor: 8
Dim: 100,000,000
Alpha: 0.69
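For reference, settings like these map onto kohya-ss sd-scripts arguments roughly as follows. This is a hedged sketch, not the exact command used for this model: the paths are placeholders, and the LyCORIS `preset`/`algo`/`factor` keys are passed through `--network_args`.

```shell
# Hypothetical invocation sketch for the LoKr v-prediction variant above.
# Model path and output name are placeholders/assumptions.
accelerate launch sdxl_train_network.py \
  --pretrained_model_name_or_path /path/to/sdxl_base.safetensors \
  --network_module lycoris.kohya \
  --network_args "preset=full" "algo=lokr" "factor=8" \
  --network_dim 100000000 --network_alpha 0.69 \
  --v_parameterization \
  --output_name WeAreElectric-FFai-LoKr-vpred
```

With LoKr, the huge `--network_dim` effectively requests full-dimension blocks, and the decomposition size is controlled by `factor` instead.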
LoRA - High Alpha Experiment (1.7 GB oversized LoRA)
Module: networks.lora
Dim: 256
Alpha: 512.0
CivitAI LoRA <lora:We_are_Electric:1>
Resolution: 1024x1024
Architecture: stable-diffusion-xl-v1-base/lora
Learning Rate: 0.0005
UNet LR: 0.0005
TE LR: 5e-05
Network Dim/Rank: 32.0
Alpha: 1.0
Optimizer: bitsandbytes.optim.adamw.AdamW8bit(weight_decay=0.1)
Epoch: 15
Batches per epoch: 455
Gradient accumulation steps: 1
Train images: 1820
Regularization images: 0
Multires noise iterations: 6.0
Multires noise discount: 0.3
Min SNR gamma: 5.0
UNet weight average magnitude: 4.496516309802129
UNet weight average strength: 0.016011949140159916
Text Encoder (1) weight average magnitude: 1.8189308640704058
Text Encoder (1) weight average strength: 0.009030077111409775
Text Encoder (2) weight average magnitude: 1.835224805119421
Text Encoder (2) weight average strength: 0.007039581994775502
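The weight statistics above are the kind reported by kohya's LoRA tooling. One plausible way to compute such numbers over a set of weight tensors is sketched below; the definitions (magnitude as mean absolute value, strength as root-mean-square) are assumptions for illustration, not the confirmed formulas behind the figures listed here.

```python
import math

# Hypothetical definitions, assumed for illustration:
# "magnitude" = mean absolute value over all elements,
# "strength"  = root-mean-square over all elements.
# Tensors are modeled as flat lists of floats.
def avg_magnitude(tensors):
    total = sum(abs(x) for t in tensors for x in t)
    count = sum(len(t) for t in tensors)
    return total / count

def avg_strength(tensors):
    sq = sum(x * x for t in tensors for x in t)
    count = sum(len(t) for t in tensors)
    return math.sqrt(sq / count)

weights = [[0.5, -1.5], [2.0, 0.0]]
print(avg_magnitude(weights))  # 1.0
print(avg_strength(weights))   # ~1.2748
```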
FFusion LoRA <lora:FFusion-We-Are-Electric.64LORA:1>:
Description: Merged and scaled using dynamic weights from LoRAs 1-5 above.
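Conceptually, a dynamic-weight merge sums the matching tensors from each source LoRA with a per-LoRA scale factor. A minimal pure-Python sketch of that idea follows (tensors modeled as flat lists; a real merge, e.g. via kohya's merge scripts, works on safetensors state dicts and must also reconcile ranks and alphas):

```python
def merge_loras(state_dicts, weights):
    """Weighted element-wise sum of matching tensors from several LoRAs.

    Illustrative only: real LoRA merging additionally handles key
    matching, differing ranks, and alpha rescaling.
    """
    merged = {}
    for key in state_dicts[0]:
        merged[key] = [
            sum(w * sd[key][i] for sd, w in zip(state_dicts, weights))
            for i in range(len(state_dicts[0][key]))
        ]
    return merged

a = {"lora_up": [1.0, 2.0]}
b = {"lora_up": [3.0, 4.0]}
print(merge_loras([a, b], [0.5, 0.5]))  # {'lora_up': [2.0, 3.0]}
```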