FlashAttention VS Test

Updated: Aug 25, 2025

Tags: tool, attention, flash attention

Type: Other


Published: Aug 25, 2025

Base Model: Other

Hash (AutoV2): 663530A131
Creator: Felldude (SDXL Training Contest Participant)

  • Tests and graphs attention implementations based on your system (see the sketch after this list).

  • TODO: add Sage Attention.

  • ComfyUI users: I recommend launching with:

cmd /k python main.py --fp32-text-enc --fp32-vae --bf16-unet --use-flash-attention
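As a rough illustration of what "testing and graphing attentions" can involve, here is a minimal sketch that times PyTorch's scaled_dot_product_attention under its selectable backends. The tensor shapes, iteration count, and bench helper are illustrative assumptions, not this tool's actual code:

import time
import torch
from torch.nn.attention import SDPBackend, sdpa_kernel

def bench(backend, q, k, v, iters=50):
    # Illustrative helper: warm up once, then average `iters` timed calls.
    with sdpa_kernel(backend):
        torch.nn.functional.scaled_dot_product_attention(q, k, v)
        torch.cuda.synchronize()
        start = time.perf_counter()
        for _ in range(iters):
            torch.nn.functional.scaled_dot_product_attention(q, k, v)
        torch.cuda.synchronize()
    return (time.perf_counter() - start) / iters

# Assumed shape (batch, heads, seq_len, head_dim); adjust for your system.
q = k = v = torch.randn(1, 8, 4096, 64, device="cuda", dtype=torch.float16)
for backend in (SDPBackend.FLASH_ATTENTION, SDPBackend.EFFICIENT_ATTENTION, SDPBackend.MATH):
    print(backend.name, f"{bench(backend, q, k, v) * 1e3:.2f} ms/iter")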

Note: The compiled FlashAttention build included with this download is for CUDA 12.9.

(Tested and working with ComfyUI.)
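To confirm the bundled build loads against your CUDA runtime before launching ComfyUI, a quick check along these lines can help (assuming the included build installs as the usual flash_attn package):

import torch
import flash_attn

# The reported CUDA runtime should be compatible with the 12.9 build above.
print("torch CUDA runtime:", torch.version.cuda)
print("flash-attn version:", flash_attn.__version__)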