
Flux Block Weight Remerger - Tool

Updated: Oct 26, 2024
Type: Other
Published: Oct 23, 2024
Base Model: Flux.1 D
Hash (AutoV2): 1AAB1CA1DB
By diogod
The FLUX.1 [dev] Model is licensed by Black Forest Labs, Inc. under the FLUX.1 [dev] Non-Commercial License. Copyright Black Forest Labs, Inc.
IN NO EVENT SHALL BLACK FOREST LABS, INC. BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION WITH USE OF THIS MODEL.

I couldn't find a tool to save a LoRA with adjusted block weights, so I created my own.

A Python tool to filter, adjust, and optionally remove Flux block weights from a LoRA.

Note: I'm not a dev, so this implementation might be wrong. It's all basically ChatGPT being guided.

https://github.com/diodiogod/Flux-Block-Weight-Remerger
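
To give an idea of what the remerging actually does, here is a minimal sketch of the concept only, not the repo's actual code. It assumes kohya-style Flux LoRA keys containing double_blocks_{i} / single_blocks_{i} and lora_up / lora_B; the real tool may match keys differently.

```python
# Minimal sketch of the idea, NOT the repo's actual implementation.
# Assumes kohya-style Flux LoRA keys such as
#   lora_unet_double_blocks_3_img_attn_qkv.lora_up.weight
#   lora_unet_single_blocks_12_linear1.lora_down.weight
import re
from safetensors.torch import load_file, save_file

def remerge(in_path, out_path, double_w, single_w):
    """Scale each block's LoRA tensors by a per-block weight and save a new file."""
    tensors = load_file(in_path)
    out = {}
    for key, t in tensors.items():
        m = re.search(r"(double|single)_blocks[._](\d+)", key)
        if m is not None:
            idx = int(m.group(2))
            w = double_w[idx] if m.group(1) == "double" else single_w[idx]
            # Scaling only the lora_B / lora_up half scales the product A@B exactly once.
            if "lora_B" in key or "lora_up" in key:
                t = t * w
        out[key] = t
    save_file(out, out_path)

# Example: mute the first four double blocks, keep everything else at full strength.
remerge("my_lora.safetensors", "my_lora_adjusted.safetensors",
        double_w=[0.0] * 4 + [1.0] * 15, single_w=[1.0] * 38)
```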

Features

  • Adjust Weights: Adjust weights for blocks and layers according to the provided values (19 or 57 comma-separated format; see the weight-string sketch after this list). See: nihedon/sd-webui-lora-block-weight#2 (comment)

  • Zero-out Weights: Optionally remove layers whose weights are set to zero (see the filter sketch after this list).

    I don't know if removing them causes problems; the default keeps all layers.

  • Filter Layers: Select and adjust specific layers based on keywords: 'lora_B', 'lora_A', 'proj_mlp', 'proj_out', 'attn', 'norm'. The default is all 'lora_B' layers (same sketch below).

    I have no idea if this is correct, but in my testing it gave the closest result to changing the block weights in Forge or ComfyUI.

  • Presets: You can save as many presets as you want in the preset_options.txt file, in 19 or 57 format (see the weight-string sketch below).

  • Log: Keeps track of all adjusted LoRAs in a log.csv file (see the log sketch below).

  • Windows_start.bat: Automatically installs, updates, and runs the tool.
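
The 19/57 strings mentioned under Adjust Weights and Presets are just comma-separated floats. This hypothetical helper (not part of the repo) shows the expected shape; see the linked issue for what each position maps to.

```python
# Hypothetical helper showing what a valid weight string looks like
# (not part of the repo; the exact position-to-block mapping is described in the linked issue).
def parse_weights(spec: str) -> list[float]:
    values = [float(v.strip()) for v in spec.split(",")]
    if len(values) not in (19, 57):
        raise ValueError(f"expected 19 or 57 comma-separated values, got {len(values)}")
    return values

# A 19-value string: keep the middle blocks, fade out the first and last few.
weights = parse_weights("0,0,0,0.5,1,1,1,1,1,1,1,1,1,1,1,0.5,0,0,0")
```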
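
And this is roughly what I understand the Zero-out and Filter Layers options to do, written as a sketch. The function name, arguments, and defaults here are made up for illustration and are not the repo's code.

```python
# Assumed behaviour of zero-out removal + keyword filtering, for illustration only.
def filter_and_scale(tensors, weight_for_key, keywords=("lora_B",), remove_zeroed=False):
    out = {}
    for key, t in tensors.items():
        w = weight_for_key(key)               # e.g. looked up from the 19/57 list
        if remove_zeroed and w == 0:
            continue                          # drop the zeroed layer entirely
        if any(k in key for k in keywords):   # only touch the selected layer types
            t = t * w
        out[key] = t
    return out
```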
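
The log is just a CSV. The real log.csv columns may differ, but a row would look something like this:

```python
# Hypothetical log entry -- the real log.csv columns may differ.
import csv
from datetime import datetime

with open("log.csv", "a", newline="") as f:
    csv.writer(f).writerow([datetime.now().isoformat(),
                            "my_lora.safetensors",
                            "my_lora_adjusted.safetensors",
                            "0,0,0,0.5,1,1,1,1,1,1,1,1,1,1,1,0.5,0,0,0"])
```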