Published | Oct 22, 2024 |
Base Model | Flux.1 Schnell |
Hash | AutoV2 82345130F0 |
LibreFLUX is a de-distillation of Flux Schnell, giving it quality on par with Dev, if not surpassing it, while keeping the open Apache 2.0 license. It is currently undertrained, but it excels at prompt comprehension (helped by attention masking) and at rendering text. This upload is a Q5_K_M GGUF quantization intended to run on 12 GB of VRAM (possibly less; I have not checked). I will add more quants later.
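If you want to try the quant outside ComfyUI, the sketch below shows one way to load a Flux-architecture GGUF transformer with diffusers' GGUF support. This is an assumption on my part, not an official recipe: the file name is a placeholder, it assumes a recent diffusers release with GGUFQuantizationConfig plus the gguf package, and LibreFLUX's attention masking and restored CFG normally come from the custom pipeline in the original repo, so the stock FluxPipeline is only an approximation.

```python
# Hedged sketch: load this Q5_K_M GGUF transformer and run it with the stock
# FluxPipeline. Requires: pip install diffusers gguf accelerate (recent versions).
import torch
from diffusers import FluxPipeline, FluxTransformer2DModel, GGUFQuantizationConfig

# Placeholder path to the downloaded quant file.
transformer = FluxTransformer2DModel.from_single_file(
    "libreflux-q5_k_m.gguf",
    quantization_config=GGUFQuantizationConfig(compute_dtype=torch.bfloat16),
    torch_dtype=torch.bfloat16,
)

# Borrow the text encoders, tokenizer, and VAE from the Schnell repo;
# only the transformer is swapped for the LibreFLUX quant.
pipe = FluxPipeline.from_pretrained(
    "black-forest-labs/FLUX.1-schnell",
    transformer=transformer,
    torch_dtype=torch.bfloat16,
)
pipe.enable_model_cpu_offload()  # helps fit the pipeline into ~12 GB of VRAM

# LibreFLUX is de-distilled, so it wants more steps than Schnell's 4.
# True CFG / negative prompts are handled by the custom pipeline in the
# original repo and are not demonstrated here.
image = pipe(
    "a wooden sign that says 'LibreFLUX' in a sunlit park",
    num_inference_steps=30,
).images[0]
image.save("libreflux_test.png")
```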
To quote the original: "LibreFLUX is an Apache 2.0 version of FLUX.1-schnell that provides a full T5 context length, uses attention masking, has classifier free guidance restored, and has had most of the FLUX aesthetic fine-tuning/DPO fully removed. That means it's a lot uglier than base flux, but it has the potential to be more easily finetuned to any new distribution. It keeps in mind the core tenets of open source software, that it should be difficult to use, slower and clunkier than a proprietary solution, and have an aesthetic trapped somewhere inside the early 2000s."
That being said, the quality is amazing; jimmycarter is underselling themselves heavily.
Credit for the original LibreFLUX model goes to https://huggingface.co/jimmycarter/LibreFLUX