
FLUX Block Weight Tester by maDcaDDie

Updated: Oct 27, 2024
Type: Workflows
Published: Oct 25, 2024
Base Model: Flux.1 D
Hash (AutoV2): 6FDCAB2AAF
Project Odyssey Contest Participant
The FLUX.1 [dev] Model is licensed by Black Forest Labs Inc. under the FLUX.1 [dev] Non-Commercial License. Copyright Black Forest Labs Inc.
IN NO EVENT SHALL BLACK FOREST LABS, INC. BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION WITH USE OF THIS MODEL.

The realm of image generation has been transformed by diffusion models like FLUX, which excel at creating high-quality, detailed images. A significant enhancement to these models is the integration of Low-Rank Adaptations (LoRA), a technique that allows for efficient fine-tuning without the need to retrain massive networks from scratch.

LoRA is a method designed to adapt large neural networks efficiently by introducing trainable rank decomposition matrices into existing layers. Instead of updating all the parameters of a pre-trained model, LoRA focuses on injecting small, low-rank updates that require significantly fewer resources. This approach is particularly beneficial for large models where full fine-tuning would be computationally expensive.
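As a minimal illustration of that idea (generic PyTorch, not FLUX-specific; all shapes and values here are arbitrary placeholders), the low-rank update replaces a full weight update with the product of two small matrices:

```python
import torch

# Sketch of a LoRA update on a single linear layer. Instead of fine-tuning
# the full weight W, LoRA learns two small matrices A (r x in) and
# B (out x r) and applies W' = W + (alpha / r) * B @ A, with r << min(out, in).
out_features, in_features, r, alpha = 3072, 3072, 16, 16

W = torch.randn(out_features, in_features)  # frozen pretrained weight
A = torch.randn(r, in_features) * 0.01      # trainable down-projection
B = torch.zeros(out_features, r)            # trainable up-projection (init to 0)

delta = (alpha / r) * (B @ A)               # low-rank update
W_adapted = W + delta

# Only A and B are trained: r * (in + out) parameters
# instead of out * in for full fine-tuning.
print(f"full: {W.numel():,} params, LoRA: {A.numel() + B.numel():,} params")
```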

In the context of the FLUX diffusion model, a LoRA is applied at the block level within the model's architecture. Specifically, the FLUX architecture consists of 19 double blocks and 38 single blocks, plus an additional input layer at the beginning, and a FLUX LoRA carries weights for these blocks. During LoRA training, not all blocks are trained equally: depending on the concept being trained, some blocks change more than others. Isolating those blocks and training only them results in a significant file-size reduction, which allows a higher network dim at a reasonable file size and thus a higher level of detail and quality.
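Since the whole point is to find out which blocks matter, here is a rough sketch of how per-block contributions could be estimated directly from a LoRA file by summing the norms of each block's low-rank matrices. This is not part of the workflow itself; key naming also varies between trainers, so the sketch assumes kohya-style keys containing `double_blocks_N` / `single_blocks_N`, and the file name is a placeholder:

```python
import re
from collections import defaultdict
from safetensors.torch import load_file

# Accumulate the Frobenius norm of every LoRA tensor per FLUX block.
# Blocks with larger accumulated norms changed most during training.
state = load_file("my_flux_lora.safetensors")  # placeholder path
block_norm = defaultdict(float)

for key, tensor in state.items():
    m = re.search(r"(double_blocks?|single_blocks?)[._](\d+)", key)
    if m:
        block_norm[f"{m.group(1)}_{m.group(2)}"] += tensor.float().norm().item()

# Blocks at the top of this list are the best candidates to keep
# when pruning a LoRA down to only its most important blocks.
for block, norm in sorted(block_norm.items(), key=lambda kv: -kv[1]):
    print(f"{block}: {norm:.2f}")
```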

This guide explains how to use my workflow to test the impact of individual LoRA blocks for FLUX.

27.10.2024 V1.1 - fixed an issue where the MSE for double block 02 did not show up
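The changelog entry above refers to the MSE the workflow reports per block. As a reference for what such a comparison measures (the workflow's exact computation may differ), a per-pixel MSE between a baseline render and a render with one block changed can be computed like this; the file names are placeholders:

```python
import numpy as np
from PIL import Image

# Assumed metric: mean squared per-pixel difference between two renders
# of the same size. A value of 0.0 means the block had no visible effect.
baseline = np.asarray(Image.open("baseline.png").convert("RGB"), dtype=np.float32)
variant = np.asarray(Image.open("block_test.png").convert("RGB"), dtype=np.float32)

mse = float(np.mean((baseline - variant) ** 2))
print(f"MSE: {mse:.4f}")
```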

Special Thanks

Special thanks go to BlackVortex for building custom ComfyUI nodes that significantly improve this workflow.

Special thanks go to denrakeiw, whose workflow this one is based on.

Special thanks also go to the community on the KiWelten Discord channel.

Join us at: https://discord.gg/pAz4Bt3rqb

Lastly, please share your findings and post your results to the workflow gallery so other people can reference them.