I couldn't find a tool to save a LoRA with adjusted block weights, so I made my own.
A Python tool to filter, adjust, and optionally remove Flux block weights from a LoRA.
Note: I'm not a dev, so this implementation might be wrong. It was basically all built by guiding ChatGPT.
https://github.com/diodiogod/Flux-Block-Weight-Remerger
Features
Adjust Weights: Scale blocks and layers according to the provided values (19- or 57-value comma-separated format). See: nihedon/sd-webui-lora-block-weight#2 (comment)
Zero-out Weights: Optionally remove layers whose weights are set to zero.
I don't know whether removing them causes problems; by default all layers are kept.
Filter Layers: Select and adjust specific layers by keyword: 'lora_B', 'lora_A', 'proj_mlp', 'proj_out', 'attn', 'norm'. The default targets all 'lora_B' layers.
I have no idea whether this is correct, but in my testing it gave the closest result to changing the block weights in Forge or ComfyUI.
Presets: Save as many presets as you want in the preset_options.txt file (19 or 57 format).
Log: All adjusted LoRAs are tracked in a log.csv file.
Windows_start.bat automatically installs, updates, and runs the tool.
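The core idea behind the adjust/zero-out/filter features can be sketched roughly like this. This is a minimal illustration, not the repo's actual code: the function names, the substring-based block matching, and the toy tensor names are all assumptions made for the example.

```python
import numpy as np

def parse_preset(line):
    """Parse a comma-separated preset line (19 or 57 values) into float factors."""
    return [float(x) for x in line.split(",")]

def adjust_block_weights(state_dict, block_factors, target_suffix="lora_B",
                         remove_zeroed=False):
    """Scale matching LoRA tensors per block; optionally drop zeroed layers.

    block_factors maps a block-name fragment (e.g. 'double_blocks.3.') to a
    multiplier. Matching is a naive substring search, purely for illustration.
    """
    out = {}
    for name, tensor in state_dict.items():
        if target_suffix not in name:
            out[name] = tensor  # non-targeted layers pass through unchanged
            continue
        factor = next((f for frag, f in block_factors.items() if frag in name), 1.0)
        if factor == 0.0 and remove_zeroed:
            continue  # zeroed layer is removed from the output entirely
        out[name] = tensor * factor
    return out

# Toy state dict standing in for a loaded safetensors file
sd = {
    "double_blocks.0.attn.lora_B.weight": np.ones((2, 2)),
    "double_blocks.1.attn.lora_B.weight": np.ones((2, 2)),
    "double_blocks.1.attn.lora_A.weight": np.ones((2, 2)),
}
adjusted = adjust_block_weights(
    sd, {"double_blocks.0.": 0.5, "double_blocks.1.": 0.0}, remove_zeroed=True
)
```

With remove_zeroed=True, the block-1 'lora_B' tensor is dropped, the block-0 'lora_B' tensor is halved, and the 'lora_A' tensor is left untouched (matching the default of only adjusting 'lora_B' layers).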