Can we prune safetensors models to save disk space?
I've been addicted to downloading new models from civitai all day long, and I'm having lots of fun. However, my SSD is starting to complain as I keep filling it up with huge new models. I know we can prune CKPT files to reduce model size, but now most models come in safetensors format (and I want to keep them that way for safety and speed reasons).
Do you know if there is a way to prune safetensors files, like ckpt files, and keep them under reasonable size?
Thank you very much!
Hi! I haven't tested this, but the model converter extension for Auto1111 might help. It allows conversion from ckpt to safetensors and back, and it also supports pruning.
It's a little clunky, but if you can't prune a safetensors file directly from there, you may be able to convert to ckpt, prune, then convert back!
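For the curious, pruning a safetensors file doesn't require anything exotic, because the format itself is simple: an 8-byte little-endian header length, a JSON header mapping tensor names to dtype/shape/offsets, then the raw tensor buffer. Below is a minimal pure-Python sketch (no torch or safetensors library needed) that builds a toy file and "prunes" it by dropping tensors under an EMA-style prefix. The `model_ema.` prefix and the toy tensor names are assumptions for illustration; real checkpoints have thousands of entries and the real extension handles dtypes beyond F32.

```python
import json, struct

def save_safetensors(tensors):
    # tensors: name -> raw little-endian F32 bytes (sketch only handles F32)
    header, buf, off = {}, b"", 0
    for name, data in tensors.items():
        header[name] = {"dtype": "F32",
                        "shape": [len(data) // 4],
                        "data_offsets": [off, off + len(data)]}
        buf += data
        off += len(data)
    hjson = json.dumps(header).encode()
    # layout: u64 header length, JSON header, tensor buffer
    return struct.pack("<Q", len(hjson)) + hjson + buf

def load_safetensors(blob):
    (hlen,) = struct.unpack_from("<Q", blob)
    header = json.loads(blob[8:8 + hlen])
    data = blob[8 + hlen:]
    return {name: data[m["data_offsets"][0]:m["data_offsets"][1]]
            for name, m in header.items() if name != "__metadata__"}

def prune(blob, drop_prefix="model_ema."):
    # drop the duplicate EMA copy of the weights and re-serialize
    tensors = load_safetensors(blob)
    kept = {n: d for n, d in tensors.items() if not n.startswith(drop_prefix)}
    return save_safetensors(kept)

w = struct.pack("<4f", 1.0, 2.0, 3.0, 4.0)
full = save_safetensors({"model.weight": w, "model_ema.weight": w})
small = prune(full)
print(len(full), len(small))  # pruned file is smaller
```

The real savings in SD checkpoints come from exactly this kind of duplication: training checkpoints often carry both the live weights and an EMA copy, and pruning keeps only one set.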
Very good suggestions, thank you very much for taking the time to reply. Cheers! 🙏
Edit: For people interested, I had good results with the model converter extension for Automatic1111's UI. Basically, you can start from either a ckpt or a safetensors source file, it doesn't matter. Then use the following options:
safetensors checkpoint format
You will then obtain 4 GB models. You can even reduce the file size to 2 GB if you pick the "fp16" option instead (but then the images seem a bit simpler, so for now I'll keep the fp32 files to retain the model's fine-tuning). Picking the "ema-only" option gives drastically different images from the original model with the same seed and prompt, so I think it alters the model too much.
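The 4 GB / 2 GB figures line up with simple bytes-per-parameter arithmetic. A quick sketch, assuming roughly 1.07 billion parameters for an SD 1.x checkpoint (a rough commonly cited figure, not from this thread):

```python
params = 1.07e9  # assumed rough parameter count for an SD 1.x checkpoint
for name, bytes_per in [("fp32", 4), ("fp16", 2)]:
    print(f"{name}: ~{params * bytes_per / 1e9:.1f} GB")
# fp32: ~4.3 GB
# fp16: ~2.1 GB
```

So the pruned fp32 file lands near 4 GB, and halving the precision halves the bytes again.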
If you use the model merging feature (a default feature, not an extension), you can choose the same checkpoint for both Model A and Model B, set the Weighted sum multiplier to 0, and select Safetensors and FP16 (if you want the half-sized models).
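The merge-with-itself trick works because a weighted sum with multiplier 0 just returns Model A unchanged; the size reduction comes entirely from re-serializing the result at fp16. A toy sketch (the weight values are made up; `struct`'s `"e"` format is IEEE half precision):

```python
import struct

a = [0.1234, -1.5, 3.0]   # toy "Model A" weights
b = [9.9, 9.9, 9.9]       # toy "Model B", ignored at multiplier 0
m = 0.0
merged = [(1 - m) * x + m * y for x, y in zip(a, b)]
assert merged == a  # multiplier 0 leaves Model A untouched

fp32 = struct.pack(f"<{len(merged)}f", *merged)  # 4 bytes per weight
fp16 = struct.pack(f"<{len(merged)}e", *merged)  # 2 bytes per weight
print(len(fp32), len(fp16))  # 12 6
```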
If quality loss concerns you, keep a copy of your .ckpt version.
Of course, keeping both a 2.5 GB file and a 5 GB file isn't going to save any space,
but you can work with/share the smaller .safetensors and archive the full-sized checkpoints for later.