
Stop Labeling Your LoRA-Merged Models as Trained


A trained checkpoint is a finetuned model.

A fully finetuned model is a model trained with a finetuning script, using a toolkit such as Kohya, OneTrainer, AI-Toolkit, or SimpleTuner, on a finetuning dataset.

A merged model is a base model, a finetuned model, or a merged model that is any of the following (see the sketch after this list):

  • merged with one or more checkpoints

  • merged with one or more LoRAs

  • merged with one or more models and one or more LoRAs
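Merging needs no data, no gradients, and no training loop. Here is a minimal sketch, assuming PyTorch and two stand-in checkpoints with identical keys and shapes; all names and values are illustrative, not any merging tool's actual API:

```python
import torch

def merge_checkpoints(state_a: dict, state_b: dict, alpha: float = 0.5) -> dict:
    """Weighted average of two state dicts: alpha * A + (1 - alpha) * B."""
    return {key: alpha * state_a[key] + (1.0 - alpha) * state_b[key]
            for key in state_a}

# Two tiny stand-in "checkpoints"; real ones just contain more tensors.
ckpt_a = {"layer.weight": torch.randn(4, 4)}
ckpt_b = {"layer.weight": torch.randn(4, 4)}
merged = merge_checkpoints(ckpt_a, ckpt_b, alpha=0.6)
```

However elaborate the merge recipe, it reduces to arithmetic like this over weights that were already trained by someone.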

Claiming that a base model is suddenly a trained model because you merged it with LoRAs is disingenuous and inaccurate.

It doesn't matter whether you trained those LoRAs yourself.

When you merge models, the result is a merged model. If you did not perform a full finetune of FLUX, you did not make a trained model.
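For the same reason, folding a LoRA into a base model is not training either. A minimal sketch of the standard LoRA fold-in, W' = W + (alpha / rank) * (B @ A), assuming PyTorch and illustrative shapes:

```python
import torch

def merge_lora(W: torch.Tensor, A: torch.Tensor, B: torch.Tensor,
               alpha: float, rank: int) -> torch.Tensor:
    """Fold a LoRA update into a base weight: W + (alpha / rank) * (B @ A)."""
    return W + (alpha / rank) * (B @ A)

rank = 16
W = torch.randn(768, 768)   # a base-model weight matrix (illustrative size)
A = torch.randn(rank, 768)  # LoRA down-projection
B = torch.randn(768, rank)  # LoRA up-projection
W_merged = merge_lora(W, A, B, alpha=16.0, rank=rank)
```

All of the learning happened when the LoRA's A and B matrices were trained; the merge itself is a single addition.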

The only exception is if you trained a full model yourself and then merged it with other existing models; in that case, you actually did train a checkpoint as part of your creation process.
