fixed docs and requirements

This commit is contained in:
deepbeepmeep 2025-08-14 10:08:47 +02:00
parent c495c41ad7
commit f4537e0cf2
3 changed files with 13 additions and 66 deletions

View File

@@ -233,7 +233,7 @@ git clone https://github.com/deepbeepmeep/Wan2GP.git
cd Wan2GP
conda create -n wan2gp python=3.10.9
conda activate wan2gp
-pip install torch==2.7.0 torchvision==0.22.0 torchaudio==2.7.0 --extra-index-url https://download.pytorch.org/whl/cu124
+pip install torch==2.7.0 torchvision torchaudio --index-url https://download.pytorch.org/whl/test/cu128
pip install -r requirements.txt
```
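After switching to the cu128 test wheels, a quick check that all three packages actually landed together can save debugging time later. A minimal stdlib-only sketch (not part of this commit; it avoids importing torch itself):

```python
# Post-install sanity check (sketch): report the installed builds of the
# three PyTorch packages without importing them.
from importlib import metadata

def installed_version(pkg):
    """Return the installed version string for pkg, or None if absent."""
    try:
        return metadata.version(pkg)
    except metadata.PackageNotFoundError:
        return None

for pkg in ("torch", "torchvision", "torchaudio"):
    print(pkg, installed_version(pkg) or "not installed")
```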

View File

@@ -8,7 +8,7 @@ This guide covers installation for different GPU generations and operating syste
- Conda or Python venv
- Compatible GPU (RTX 10XX or newer recommended)
-## Installation for RTX 10XX to RTX 40XX (Stable)
+## Installation for RTX 10XX to RTX 50XX (Stable)
This installation uses PyTorch 2.7.0 which is well-tested and stable.
@@ -28,7 +28,7 @@ conda activate wan2gp
```shell
# Install PyTorch 2.7.0 with CUDA 12.4
-pip install torch==2.7.0 torchvision==0.22.0 torchaudio==2.7.0 --extra-index-url https://download.pytorch.org/whl/cu124
+pip install torch==2.7.0 torchvision torchaudio --index-url https://download.pytorch.org/whl/test/cu128
```
### Step 3: Install Dependencies
@@ -40,7 +40,7 @@ pip install -r requirements.txt
### Step 4: Optional Performance Optimizations
-#### Sage Attention (30% faster)
+#### Sage Attention (30% faster); do not install on RTX 50XX as it is not compatible
```shell
# Windows only: Install Triton
@@ -71,62 +71,7 @@ pip install -e .
pip install flash-attn==2.7.2.post1
```
-## Installation for RTX 50XX (Beta)
-RTX 50XX GPUs require PyTorch 2.7.0 (beta). This version may be less stable.
-⚠️ **Important:** Use Python 3.10 for compatibility with pip wheels.
-### Step 1: Setup Environment
-```shell
-# Clone and setup (same as above)
-python -m pip install "setuptools<=75.8.2" --force-reinstall
-git clone https://github.com/deepbeepmeep/Wan2GP.git
-cd Wan2GP
-conda create -n wan2gp python=3.10.9
-conda activate wan2gp
-```
-### Step 2: Install PyTorch Beta
-```shell
-# Install PyTorch 2.7.0 with CUDA 12.8
-pip install torch==2.7.0 torchvision torchaudio --index-url https://download.pytorch.org/whl/test/cu128
-```
-### Step 3: Install Dependencies
-```shell
-pip install -r requirements.txt
-```
-### Step 4: Optional Optimizations for RTX 50XX
-#### Sage Attention
-```shell
-# Windows
-pip install triton-windows
-pip install sageattention==1.0.6
-# Linux
-pip install sageattention==1.0.6
-```
-#### Sage 2 Attention
-```shell
-# Windows
-pip install triton-windows
-pip install https://github.com/woct0rdho/SageAttention/releases/download/v2.1.1-windows/sageattention-2.1.1+cu128torch2.7.0-cp310-cp310-win_amd64.whl
-# Linux (manual compilation)
-git clone https://github.com/thu-ml/SageAttention
-cd SageAttention
-pip install -e .
-```
## Attention Modes
WanGP supports several attention implementations:
@@ -136,6 +81,12 @@ WanGP supports several attention implementations:
- **Sage2**: 40% speed boost
- **Flash**: Good performance, may be complex to install on Windows
+### Attention GPU Compatibility
+- RTX 10XX, 20XX: SDPA
+- RTX 30XX, 40XX: SDPA, Flash Attention, Xformers, Sage, Sage2
+- RTX 50XX: SDPA, Flash Attention, Xformers, Sage2
## Performance Profiles
Choose a profile based on your hardware:
@@ -163,10 +114,5 @@ If Sage attention doesn't work:
- Use Profile 4 for lower VRAM usage
- Consider using 1.3B models instead of 14B models
-### GPU Compatibility
-- RTX 10XX, 20XX: Supported with SDPA attention
-- RTX 30XX, 40XX: Full feature support
-- RTX 50XX: Beta support with PyTorch 2.7.0
For more troubleshooting, see [TROUBLESHOOTING.md](TROUBLESHOOTING.md)
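The GPU/attention compatibility notes in this file can be captured as a small lookup table. A sketch (names are illustrative identifiers, not WanGP's actual API):

```python
# Illustrative mapping from GPU generation to the attention modes the guide
# lists as compatible. Mode names here are hypothetical, not WanGP internals.
SUPPORTED_ATTENTION = {
    "RTX 10XX": {"sdpa"},
    "RTX 20XX": {"sdpa"},
    "RTX 30XX": {"sdpa", "flash", "xformers", "sage", "sage2"},
    "RTX 40XX": {"sdpa", "flash", "xformers", "sage", "sage2"},
    "RTX 50XX": {"sdpa", "flash", "xformers", "sage2"},  # Sage (v1) not compatible
}

def attention_supported(gpu, mode):
    """True if the guide lists `mode` as usable on `gpu`; unknown GPUs get SDPA only."""
    return mode in SUPPORTED_ATTENTION.get(gpu, {"sdpa"})

print(attention_supported("RTX 50XX", "sage"))   # False
print(attention_supported("RTX 40XX", "sage2"))  # True
```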

View File

@@ -11,7 +11,7 @@ sentencepiece
open_clip_torch>=2.29.0
# Video & media
-moviepy @ git+https://github.com/Zulko/moviepy.git
+moviepy==1.0.3
av
ffmpeg-python
pygame>=2.1.0
@@ -45,6 +45,7 @@ torchdiffeq>=0.2.5
tensordict>=0.6.1
mmgp==3.5.10
peft==0.15.0
+matplotlib
# Utilities
ftfy