Updated: Feb 27, 2025
Here is my quantized FP8 version of the Wan2.1_14B 720p i2v model, so that it can run on 50- and 40-series GPUs, or for even faster inference on larger cards.
Enjoy!