This repository provides true BF16 downcasts and scaled FP8 quantizations of the official Wan2.2-Distill-Models (260412 release) by LightX2V.

πŸ—‚οΈ Which version should I download?

1. The BF16 Base Models (Maximum Quality)

These were programmatically downcast directly from the original 57 GB FP32 weights. They retain the same dynamic range and visual fidelity as the originals at half the size (~28.6 GB). Because BF16 keeps FP32's full 8-bit exponent, it avoids the numerical overflow errors (black screens) common with standard FP16.

  • wan2.2_i2v_A14b_high_noise_lightx2v_4step_720p_260412_bf16.safetensors
  • wan2.2_i2v_A14b_low_noise_lightx2v_4step_720p_260412_bf16.safetensors
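For reference, the downcast amounts to rounding each FP32 weight to the nearest bfloat16 value; a minimal NumPy sketch of that rounding (the actual conversion tooling used for these files is not specified here):

```python
import numpy as np

def to_bf16(x: np.ndarray) -> np.ndarray:
    """Round float32 values to bfloat16 precision (result stored in float32).

    BF16 keeps float32's 8 exponent bits, so the dynamic range is unchanged;
    only the mantissa shrinks from 23 bits to 7.
    """
    bits = np.asarray(x, dtype=np.float32).view(np.uint32)
    # round-to-nearest-even on the 16 mantissa bits being dropped
    rounded = bits + np.uint32(0x7FFF) + ((bits >> np.uint32(16)) & np.uint32(1))
    return (rounded & np.uint32(0xFFFF0000)).view(np.float32)

w = np.array([1e38, 3.14159265, -2.5e-30], dtype=np.float32)
print(to_bf16(w))  # extreme magnitudes survive; only precision drops
```

Note how both the huge and the tiny value stay finite and nonzero, which is exactly the overflow protection FP16 lacks.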

2. The FP8 Base Models (VRAM Friendly)

These are scaled FP8 versions. They shrink the model footprint to ~15 GB while retaining excellent visual quality, making it possible to run this 14B-parameter video model locally without exhausting your VRAM.

  • wan2.2_i2v_A14b_high_noise_lightx2v_4step_720p_260412_fp8.safetensors
  • wan2.2_i2v_A14b_low_noise_lightx2v_4step_720p_260412_fp8.safetensors
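"Scaled" FP8 generally means each tensor is rescaled before the cast so its largest value fills the FP8 range; a rough NumPy illustration of the idea (whether these files use per-tensor or per-channel scales, and E4M3 vs E5M2, is an assumption):

```python
import numpy as np

FP8_E4M3_MAX = 448.0  # largest finite value representable in float8 E4M3

def quantize_fp8_scaled(w: np.ndarray):
    """Per-tensor scaled FP8 sketch: pick a scale so the largest weight
    magnitude lands at the top of the FP8 range, then round each value to a
    3-bit mantissa (E4M3-style). Exponent clamping and the packed uint8
    storage of a real FP8 kernel are omitted for clarity.
    """
    scale = np.abs(w).max() / FP8_E4M3_MAX
    scaled = w / scale
    exp = np.floor(np.log2(np.abs(scaled) + 1e-30))  # per-value exponent
    step = 2.0 ** (exp - 3)                          # spacing with 3 mantissa bits
    q = np.round(scaled / step) * step
    return q.astype(np.float32), np.float32(scale)

def dequantize_fp8_scaled(q: np.ndarray, scale: np.float32) -> np.ndarray:
    # restore the original magnitude by applying the stored scale
    return q * scale

w = np.random.default_rng(0).standard_normal(1000).astype(np.float32)
q, scale = quantize_fp8_scaled(w)
recon = dequantize_fp8_scaled(q, scale)
```

The stored scale is what lets a 3-bit mantissa cover weights of any magnitude; the per-value relative error stays bounded (around 2^-4 here) regardless of the tensor's absolute size.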

3. The Standalone LoRAs (Unmodified)

Best for: users who already have the undistilled Wan 2.2 base models downloaded and just want to patch them for 4-step generation.

  • wan2.2_i2v_A14b_high_noise_lora_lightx2v_4step_720p_260412.safetensors
  • wan2.2_i2v_A14b_low_noise_lora_lightx2v_4step_720p_260412.safetensors
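Patching a base model with one of these LoRAs amounts to the standard low-rank merge W' = W + (alpha / r) * B @ A; a toy NumPy sketch under the common LoRA convention (key names and alpha handling inside the actual files are assumptions):

```python
import numpy as np

def merge_lora(w, lora_down, lora_up, alpha, rank):
    """Bake a LoRA into a base weight matrix: W' = W + (alpha / rank) * B @ A.

    Follows the common convention lora_down = A (rank x in) and
    lora_up = B (out x rank); the exact key names and alpha handling in the
    LightX2V files are assumptions, not verified against them.
    """
    return w + (alpha / rank) * (lora_up @ lora_down)

# toy shapes: base weight (out, in) = (8, 16), rank 4
rng = np.random.default_rng(0)
w = rng.standard_normal((8, 16))
down = rng.standard_normal((4, 16))  # A
up = rng.standard_normal((8, 4))     # B
merged = merge_lora(w, down, up, alpha=4.0, rank=4)
print(merged.shape)  # (8, 16)
```

Merging changes no tensor shapes, which is why the baked-in Base Models above load exactly like the originals.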

πŸš€ Usage Guide (ComfyUI)

For the easiest plug-and-play experience in ComfyUI:

  1. Download the High Noise and Low Noise models of your chosen precision (FP8 for local, BF16 for cloud).
  2. You do not need the standalone LoRAs if you download the Base Models; the 4-step distillation is already permanently baked into both the BF16 and FP8 files provided here.

πŸ† Credits & Acknowledgements

  • Original Wan 2.2 Framework: Wan 2.2 Team
  • Distillation & Original LoRAs: LightX2V