Flash Attention 2 for ComfyUI

Updated: Aug 25, 2025

Tags: tool, comfyui, attention

Type: Other

Published: Aug 20, 2025

Base Model: Other

Hash (AutoV2): 7052D66D3C
Creator: PabloFG

https://github.com/Dao-AILab/flash-attention

Flash Attention 2 (v2.8.3), precompiled for the latest ComfyUI 3.50 on Windows.
It is an attention method that some models require.
I couldn't find a Windows wheel for Python 3.13, PyTorch 2.8.0 and CUDA 12.9, which is what ComfyUI is currently using, so I compiled one myself.
It's a long, annoying process that requires a beefy PC, so I've saved you some time.
When ComfyUI changes and an updated build is needed, I'll post it here again if enough people find this useful.
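
If you want to confirm the wheel matches your environment before relying on it, a quick sanity check along these lines may help. This is only a sketch: it assumes you run it with ComfyUI's embedded Python (e.g. python_embeded\python.exe) after pip-installing the downloaded .whl into it, and the expected version numbers in the comments are the ones this build targets.

# check_flash_attn.py (hypothetical filename) - verify versions and run a tiny smoke test
import sys
import torch

print("Python :", sys.version.split()[0])   # expect 3.13.x
print("PyTorch:", torch.__version__)        # expect 2.8.0 (CUDA build)
print("CUDA   :", torch.version.cuda)       # expect 12.9

import flash_attn
from flash_attn import flash_attn_func

print("flash-attn:", flash_attn.__version__)  # expect 2.8.3

# Minimal GPU smoke test: fp16 tensors with shape (batch, seqlen, heads, headdim)
q = torch.randn(1, 128, 8, 64, dtype=torch.float16, device="cuda")
k = torch.randn(1, 128, 8, 64, dtype=torch.float16, device="cuda")
v = torch.randn(1, 128, 8, 64, dtype=torch.float16, device="cuda")
out = flash_attn_func(q, k, v, causal=True)
print("flash_attn_func output shape:", tuple(out.shape))

If the versions match and the last line prints a shape without an ImportError or CUDA kernel error, the wheel is working in your install.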