Wan2.1/wan/modules
Emanuele Bugliarello ca23a2fc59
Fix flash attention
The latest FA3 release changed the return shape of the varlen function to be consistent with FA2. This PR fixes the FA3 attention call accordingly, as done in https://github.com/Wan-Video/Wan2.2/pull/64.
2025-08-27 11:43:35 +02:00
__init__.py [feature] Add VACE (#389) 2025-05-14 20:44:25 +08:00
attention.py Fix flash attention 2025-08-27 11:43:35 +02:00
clip.py init upload 2025-02-25 22:07:47 +08:00
model.py Format the code (#402) 2025-05-16 12:35:38 +08:00
t5.py init upload 2025-02-25 22:07:47 +08:00
tokenizers.py init upload 2025-02-25 22:07:47 +08:00
vace_model.py Format the code (#402) 2025-05-16 12:35:38 +08:00
vae.py init upload 2025-02-25 22:07:47 +08:00
xlm_roberta.py init upload 2025-02-25 22:07:47 +08:00