So, it was getting late here in France; here's the quick end to this story.
HandPatchV5 ended up being a "Merge" of HandPatchV3 at 1.07 strength after some tests.
But we are talking about an 870 MB LoRA. That may be OK to merge into a checkpoint to fix things, but not for a release as a "Fix". Let's try again to reduce the size to 32 dimensions, now that the effect is locked in.
HandPatchRC is a 32-dimension LoRA (217 MB, that's better) and it did just enough fixing without needing any specific activation tags. That's because, when merging LoRAs, the CLIP layers are also merged, and this can mess up the activation part. But do I really need them, since they're basically not doing much?
Let's do some Python hacking in there:
>>> from safetensors.torch import load_file, save_file
>>> orig = load_file("handPatchRC.safetensors")
>>> new = {}
>>> for k in orig:
...     if not k.startswith("lora_te"):
...         new[k] = orig[k]
...
>>> save_file(new, "handPatchUnet.safetensors")
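The key filtering itself needs nothing from safetensors; here is a minimal, self-contained sketch of the same idea on a dummy state dict (the key names below are made-up examples that just follow the "lora_unet" / "lora_te1" / "lora_te2" prefix convention):

```python
# Dummy LoRA state dict: values would normally be tensors, strings are
# enough to demonstrate the filtering. Key names are illustrative.
lora = {
    "lora_unet_down_blocks_0_attentions_0.lora_down.weight": "unet tensor",
    "lora_unet_up_blocks_1_attentions_2.lora_up.weight": "unet tensor",
    "lora_te1_text_model_encoder_layers_0_mlp_fc1.lora_down.weight": "clip tensor",
    "lora_te2_text_model_encoder_layers_5_self_attn_q_proj.lora_up.weight": "clip tensor",
}

# Keep only the UNet layers: every CLIP key starts with "lora_te",
# which covers both "lora_te1" and "lora_te2".
unet_only = {k: v for k, v in lora.items() if not k.startswith("lora_te")}

print(sorted(unet_only))
```

Same loop as above, just written as a dict comprehension; swapping the dummy dict for `load_file(...)` and printing for `save_file(...)` gives back the original snippet.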
A LoRA contains, as I explained, several "patch" layers for the UNET and CLIP. All UNET layer keys start with "lora_unet", and the CLIP ones start with "lora_te1" and "lora_te2".
This way, I removed all the CLIP layers to get HandPatchUNET, a 32-dimension UNET-only LoRA (162 MB :D)
But does it deliver?
Feels like it does :D Let's test another pose:
OOOH, it does seem like it can clean up hands without too much trouble :D
So, I'll do a few more tests and release it (and maybe do a V2.1 of AlterBan to avoid keeping around a checkpoint with bad hands included 😅)
Thanks for reading!