Emanuele Bugliarello
ca23a2fc59
Fix flash attention
...
The latest FA3 release changed the return shape of the varlen function to be consistent with FA2. This PR fixes the FA3 attention call, as done in https://github.com/Wan-Video/Wan2.2/pull/64
2025-08-27 11:43:35 +02:00
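For context, a minimal sketch of the compatibility handling this fix implies, assuming older FA3 builds of flash_attn_varlen_func returned an (out, softmax_lse) tuple while newer releases return only the output tensor, matching FA2; the wrapper name fa3_varlen and the keyword passthrough below are hypothetical illustrations, not the repository's actual code.

    # Hypothetical wrapper, not the repo's code: normalize the FA3 varlen return value.
    import torch
    from flash_attn_interface import flash_attn_varlen_func  # FlashAttention 3

    def fa3_varlen(q, k, v, cu_seqlens_q, cu_seqlens_k,
                   max_seqlen_q, max_seqlen_k, **kwargs) -> torch.Tensor:
        out = flash_attn_varlen_func(q, k, v, cu_seqlens_q, cu_seqlens_k,
                                     max_seqlen_q, max_seqlen_k, **kwargs)
        # Assumed behavior: older FA3 builds returned (out, softmax_lse);
        # newer ones return the tensor directly, consistent with FA2.
        return out[0] if isinstance(out, tuple) else out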
Ang Wang
76e9427657
Format the code (#402)
...
* isort the code
* format the code
* Add yapf config file
* Remove torch CUDA memory profiler
2025-05-16 12:35:38 +08:00
Ang Wang
18d53feb7a
[feature] Add VACE (#389)
...
* Add VACE
* Support training with multiple GPUs
* Update default args for the VACE task
* VACE block update
* Add VACE example jpg
* Fix dist VACE fwd hook error
* Update VACE example
* Update VACE args
* Update pipeline name for VACE
* VACE Gradio and README
* Update VACE snake png
---------
Co-authored-by: hanzhn <han.feng.jason@gmail.com>
2025-05-14 20:44:25 +08:00
yupeng1111
df44622e72
[feature] Wan2.1-FLF2V-14B (#338)
...
Co-authored-by: 澎鹏 <shiyupeng.syp@taobao.com>
2025-04-17 21:56:46 +08:00
WanX-Video-1
65386b2e03
init upload
2025-02-25 22:07:47 +08:00