
LoRA Block Weight extension in AUTOMATIC1111: weights, usage, XYZ plots for SDXL LoRA

To use the "LoRA Block Weight" extension from https://github.com/hako-mikan/sd-webui-lora-block-weight in the latest versions of AUTOMATIC1111 (as of 2023/09/22), you need to add an lbw= keyword, and you also need to use only 12 weights instead of the 17 that are defined by default. Once the weights are updated, you can use the XYZ plot to easily compare many different weight settings.

If you are using an SDXL LoRA, the default weight presets don't work correctly: they list 17 weights, but SDXL only uses 12, and the extension doesn't appear to truncate the extra 5. Even the ALL keyword, which is all 1's, produces output different from pasting in just 12 ones instead of the 17, while using exactly 12 ones comes out identical to disabling the extension, as expected.

According to the extension readme, the SDXL LoRA blocks are BASE IN04 IN05 IN07 IN08 MID OUT0 OUT1 OUT2 OUT3 OUT4 OUT5.

Using the following shorthand for groups of blocks:

S: Shallow blocks, furthest from middle: IN04 IN05 OUT3 OUT4 OUT5

D: Deep blocks, closest to middle: IN07 IN08 OUT0 OUT1 OUT2
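To make the grouping concrete, here is a small Python sketch (my own illustration, not code from the extension) that checks that the shallow and deep groups, together with BASE and MID, cover all 12 SDXL blocks:

```python
# Illustrative sketch: the 12 SDXL blocks from the extension readme,
# grouped into the shallow/deep shorthand used in this guide.
SDXL_BLOCKS = ["BASE", "IN04", "IN05", "IN07", "IN08", "MID",
               "OUT0", "OUT1", "OUT2", "OUT3", "OUT4", "OUT5"]

SHALLOW = {"IN04", "IN05", "OUT3", "OUT4", "OUT5"}  # S: furthest from middle
DEEP = {"IN07", "IN08", "OUT0", "OUT1", "OUT2"}     # D: closest to middle

# BASE (the text encoder) and MID belong to neither group; together
# with S and D they account for all 12 blocks, with no overlap.
assert SHALLOW | DEEP | {"BASE", "MID"} == set(SDXL_BLOCKS)
assert not SHALLOW & DEEP
```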

I remade the keyword presets for 12 weights:

NOT:0,0,0,0,0,0,0,0,0,0,0,0
ALL:1,1,1,1,1,1,1,1,1,1,1,1
INS:1,1,1,0,0,0,0,0,0,0,0,0
IND:1,0,0,1,1,0,0,0,0,0,0,0
INALL:1,1,1,1,1,0,0,0,0,0,0,0
MIDD:1,0,0,1,1,1,1,1,1,0,0,0
OUTD:1,0,0,0,0,0,1,1,1,0,0,0
OUTS:1,0,0,0,0,0,0,0,0,1,1,1
OUTALL:1,0,0,0,0,0,1,1,1,1,1,1
BASE:1,0,0,0,0,0,0,0,0,0,0,0
IN04:1,1,0,0,0,0,0,0,0,0,0,0
IN05:1,0,1,0,0,0,0,0,0,0,0,0
IN07:1,0,0,1,0,0,0,0,0,0,0,0
IN08:1,0,0,0,1,0,0,0,0,0,0,0
MID:1,0,0,0,0,1,0,0,0,0,0,0
OUT00:1,0,0,0,0,0,1,0,0,0,0,0
OUT01:1,0,0,0,0,0,0,1,0,0,0,0
OUT02:1,0,0,0,0,0,0,0,1,0,0,0
OUT03:1,0,0,0,0,0,0,0,0,1,0,0
OUT04:1,0,0,0,0,0,0,0,0,0,1,0
OUT05:1,0,0,0,0,0,0,0,0,0,0,1
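Each preset is a name followed by exactly 12 comma-separated weights. Here is a hypothetical Python sketch (my own sanity check, not the extension's parsing code) showing how the preset text maps to weight vectors, using a few of the presets above:

```python
# Illustrative sketch: parse preset lines (name:w1,w2,...) into a dict
# of 12-element weight vectors and sanity-check a couple of them.
PRESETS_TEXT = """\
NOT:0,0,0,0,0,0,0,0,0,0,0,0
ALL:1,1,1,1,1,1,1,1,1,1,1,1
OUTS:1,0,0,0,0,0,0,0,0,1,1,1"""

def parse_presets(text):
    presets = {}
    for line in text.splitlines():
        name, _, weights = line.partition(":")
        presets[name] = [float(w) for w in weights.split(",")]
    return presets

presets = parse_presets(PRESETS_TEXT)

# Every SDXL preset must have exactly 12 weights.
assert all(len(v) == 12 for v in presets.values())
# OUTS keeps BASE plus the three shallowest OUT blocks.
assert presets["OUTS"] == [1, 0, 0, 0, 0, 0, 0, 0, 0, 1, 1, 1]
```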

Just paste that into the "Weights setting" box, click Save Presets, then click Load Presets. The keywords will now work, both in prompts and in the XYZ plot.

First, note that AUTOMATIC1111 supports setting the text encoder weight separately from the unet weight; the LoRA Block Weight extension requires both of them to be defined:

<lora:"lora name":1>
# Same as:
<lora:"lora name":1:1> # First is text encoder weight, second is unet weight
# Same as:
<lora:"lora name":te=1:unet=1> # You can also use te= and unet= to define them in any order
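All three forms above should resolve to the same pair of weights. A minimal parser sketch, my own illustration of the syntax rather than AUTOMATIC1111's actual code:

```python
# Illustrative sketch of the <lora:...> weight syntax described above
# (not AUTOMATIC1111's real parser; assumes the name contains no colons).
def parse_lora_weights(tag):
    parts = tag.strip("<>").split(":")[2:]  # drop "lora" and the name
    te = unet = 1.0
    positional = []
    for part in parts:
        if part.startswith("te="):
            te = float(part[3:])
        elif part.startswith("unet="):
            unet = float(part[5:])
        else:
            positional.append(float(part))
    if len(positional) == 1:    # a single weight applies to both
        te = unet = positional[0]
    elif len(positional) == 2:  # first is te, second is unet
        te, unet = positional
    return te, unet

# The three equivalent forms all resolve to te=1, unet=1:
assert parse_lora_weights('<lora:"lora name":1>') == (1.0, 1.0)
assert parse_lora_weights('<lora:"lora name":1:1>') == (1.0, 1.0)
assert parse_lora_weights('<lora:"lora name":te=1:unet=1>') == (1.0, 1.0)
```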

Here is the usage for the LoRA Block Weights extension:

<lora:"lora name":1:1:lbw=ALL>
# Same as:
<lora:"lora name":1:1:lbw=1,1,1,1,1,1,1,1,1,1,1,1>

<lora:"lora name":0.7>
# Same as:
<lora:"lora name":0.7:0.7:lbw=ALL>

<lora:"lora name":1:1:lbw=OUTS>
# Same as:
<lora:"lora name":1:1:lbw=1,0,0,0,0,0,0,0,0,1,1,1>

# Use XYZ function built into the extension:
<lora:"lora name":1:1:lbw=XYZ>

Directions for creating the XYZ plot can be found in this other guide: https://civitai.com/models/22581?modelVersionId=26964. The gist: put <lora:"lora name":1:1:lbw=XYZ> into the prompt, set the weights extension to Active, expand the XYZ Plot section, and click the "XYZ Plot" radio button (instead of leaving it on "Disabled"). Change "X Types" to "Original Weights", then paste comma-separated keywords into "X Values", such as INS,IND,INALL,MIDD,OUTD,OUTS,OUTALL. Set the Y and Z types to "None", or to something else (such as seed or weight values) if you want a full XYZ plot against other dimensions instead of just comparing the different weights.

Further notes:

The author has stated that the BASE block is the text encoder weight: https://github.com/hako-mikan/sd-webui-lora-block-weight/issues/110#issuecomment-1708282404

In my testing, setting the BASE block to 0 produces a result very close to, but not identical to, setting the text encoder weight to 0 in the AUTOMATIC1111 <lora> handler:

<lora:"lora name":1:1:lbw=0,1,1,1,1,1,1,1,1,1,1,1>
# Very similar to:
<lora:"lora name":0:1>

Disabling all the unet blocks from the extension does produce output identical to the AUTOMATIC1111 <lora> handler for me. The unet value also acts as a multiplier on the extension's unet block weights, although the two forms produce slightly different output in my testing:

<lora:"lora name":1:0>
# Same as:
<lora:"lora name":1:1:lbw=1,0,0,0,0,0,0,0,0,0,0,0>
# Same as:
<lora:"lora name":1:0:lbw=1,1,1,1,1,1,1,1,1,1,1,1>

<lora:"lora name":1:0.5:lbw=1,1,1,1,1,1,1,1,1,1,1,1>
# Very similar to:
<lora:"lora name":1:1:lbw=1,0.5,0.5,0.5,0.5,0.5,0.5,0.5,0.5,0.5,0.5,0.5>

Disclaimer: I have very little idea what I'm doing, but I googled for long enough to figure out how to get this extension working. If I got something wrong, please let me know in the comments.
