
<h2 align="center"> FACM: Flow-Anchored Consistency Models </h2>
<h3 align="center"> 🔥 FACM outperforms 2×250-step Lightning-DiT on ImageNet 256 with only 2 steps </h3>
<h3 align="center"> <span style="font-size: 1.2em; font-weight: bold;"><strong>FID=1.70 (1-step) &nbsp;&nbsp;&nbsp;&nbsp; FID=1.32 (2-step)</strong></span> </h3>
<p align="center">
  📄 This is the official implementation of the paper: <br>
  <a href="https://arxiv.org/abs/2507.03738">Flow-Anchored Consistency Models</a>
</p>
<p align="center"> Yansong Peng, Kai Zhu, Yu Liu, Pingyu Wu, Hebei Li, Xiaoyan Sun, Feng Wu </p>
<p align="center"> <img src="./cache/teaser.png" width="800"/> </p>
<p align="center"> <strong>If you like FACM, please give us a ⭐! Your support motivates us to keep improving!</strong> </p>

Progress

  • [x] Release FACM on ImageNet 256
  • [ ] Release Wan 2.2 T2I + FACM 🚀

ImageNet 256 Performance on 8 × A100 GPUs

| Model | Steps | FID | IS | Epochs-Pretrain | Epochs-Distill | Download |
|:-----:|:-----:|:---:|:--:|:---------------:|:--------------:|:--------:|
| FACM | 2-step | 1.32 | 292 | 800 | 100 | 100ep-stg2.pt |
| FACM | 1-step | 1.76 | 290 | 800 | 250 | 250ep-stg2.pt |
| FACM | 1-step | 1.70 | 295 | 800 | 400 | 400ep-stg2.pt |

Quick Start

Prerequisites:

Download the required model weights and statistics files from HuggingFace or ModelScope to ./cache, including:

  • fid-50k-256.npz
  • latents_stats.pt
  • vavae-imagenet256-f16d32-dinov2.pt
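Before running any of the scripts below, it can help to confirm the cache directory is complete. The file names come from the list above; the helper itself (`missing_cache_files`) is only an illustrative sketch, not part of the repository.

```python
from pathlib import Path

# Required files listed in the prerequisites; ./cache is the location
# the README instructs you to download them to.
REQUIRED_FILES = [
    "fid-50k-256.npz",
    "latents_stats.pt",
    "vavae-imagenet256-f16d32-dinov2.pt",
]

def missing_cache_files(cache_dir: str = "./cache") -> list[str]:
    """Return the required files that are not yet present in cache_dir."""
    cache = Path(cache_dir)
    return [name for name in REQUIRED_FILES if not (cache / name).is_file()]

if __name__ == "__main__":
    missing = missing_cache_files()
    if missing:
        print("Still need to download:", ", ".join(missing))
    else:
        print("All prerequisite files found.")
```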

Data Preparation

```bash
export DATA_PATH="/path/to/imagenet"
export OUTPUT_PATH="/path/to/latents"
bash scripts/extract.sh
```

*Note: You can also download pre-extracted ImageNet latents following Lightning-DiT.*
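The exact contents of `latents_stats.pt` are not documented here; assuming it holds channel-wise latent statistics used for normalization (the convention inherited from Lightning-DiT), a minimal NumPy sketch of how such statistics would be computed and applied:

```python
import numpy as np

def channel_stats(latents: np.ndarray) -> tuple[np.ndarray, np.ndarray]:
    """Per-channel mean and std over a batch of latents shaped (N, C, H, W)."""
    mean = latents.mean(axis=(0, 2, 3))
    std = latents.std(axis=(0, 2, 3))
    return mean, std

def normalize(latents: np.ndarray, mean: np.ndarray, std: np.ndarray) -> np.ndarray:
    """Shift and scale each channel to zero mean and unit variance."""
    return (latents - mean[None, :, None, None]) / std[None, :, None, None]
```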

Inference

```bash
pip install -e git+https://github.com/LTH14/torch-fidelity.git@master#egg=torch-fidelity
```

Download a pretrained FACM model checkpoint (100ep-stg2.pt for 2-step or 400ep-stg2.pt for 1-step sampling) to ./cache

```bash
bash scripts/test.sh --ckpt-path cache/100ep-stg2.pt --sampling-steps 2
bash scripts/test.sh --ckpt-path cache/400ep-stg2.pt --sampling-steps 1
```
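FACM's actual sampler lives behind `scripts/test.sh`; as background for the `--sampling-steps` flag, here is a generic multistep consistency sampling sketch (predict a clean sample, re-noise to a lower level, predict again), with a toy stand-in for the network — not the repository's real sampler:

```python
import numpy as np

def multistep_sample(model, shape, steps=2, sigma_max=1.0, seed=0):
    """Generic multistep consistency sampling: predict a clean sample,
    re-noise it to an intermediate level, and predict again.
    `model(x, sigma)` maps a noisy input to a clean estimate
    (a stand-in for the real FACM network)."""
    rng = np.random.default_rng(seed)
    x = rng.standard_normal(shape) * sigma_max
    # Noise levels decreasing from sigma_max toward 0.
    sigmas = np.linspace(sigma_max, 0.0, steps + 1)
    sample = model(x, sigmas[0])
    for sigma in sigmas[1:-1]:
        x = sample + rng.standard_normal(shape) * sigma  # re-noise
        sample = model(x, sigma)                          # re-predict
    return sample

# Toy "model" that shrinks its input toward zero, just to run the loop.
toy_model = lambda x, sigma: x / (1.0 + sigma)
out = multistep_sample(toy_model, (2, 4, 8, 8), steps=2)
```

With `steps=1` the loop body never runs and the network is evaluated once; with `steps=2` it is evaluated twice, matching the 1-step/2-step counts in the table above.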

Training

Download the pretrained flow-matching (FM) model checkpoint 800ep-stg1.pt to ./cache

```bash
export DATA_PATH="/path/to/latents"
bash scripts/train.sh
```

Pretraining (Optional)

Replace configs/lightningdit_xl_vavae_f16d32.yaml and transport/transport.py in the Lightning-DiT repository with our ldit/lightningdit_xl_vavae_f16d32.yaml and ldit/transport.py, then follow Lightning-DiT's training instructions.
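The file replacement above can be scripted. This sketch uses only the file pairs named in the text; the Lightning-DiT checkout path is an assumption you would supply yourself:

```python
import shutil
from pathlib import Path

# File pairs from the README: (file in this repo, file it replaces
# in a Lightning-DiT checkout).
PATCH_FILES = [
    ("ldit/lightningdit_xl_vavae_f16d32.yaml",
     "configs/lightningdit_xl_vavae_f16d32.yaml"),
    ("ldit/transport.py", "transport/transport.py"),
]

def apply_facm_patch(facm_root: str, ldit_root: str) -> None:
    """Copy FACM's modified files over Lightning-DiT's originals."""
    for src_rel, dst_rel in PATCH_FILES:
        src = Path(facm_root) / src_rel
        dst = Path(ldit_root) / dst_rel
        dst.parent.mkdir(parents=True, exist_ok=True)
        shutil.copyfile(src, dst)
```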

Reproductions

<details open> <summary> reproductions </summary>

We include reproductions of MeanFlow and sCM. Switch methods by changing the loss function in train.py line 81:

```python
facm_loss = FACMLoss()      # FACM (default)
facm_loss = MeanFlowLoss()  # MeanFlow
facm_loss = sCMLoss()       # sCM
```
</details>
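The one-line swap works because the loss classes are interchangeable behind a single call signature, so the rest of the training loop never changes. The sketch below illustrates that pattern with placeholder objectives — the signatures and bodies are stand-ins, not the repository's real `FACMLoss`/`MeanFlowLoss` implementations:

```python
class FACMLoss:
    """Stand-in for the FACM objective (flow-anchored consistency)."""
    def __call__(self, pred: float, target: float) -> float:
        return (pred - target) ** 2  # placeholder squared error

class MeanFlowLoss:
    """Stand-in for the MeanFlow reproduction's objective."""
    def __call__(self, pred: float, target: float) -> float:
        return abs(pred - target)    # placeholder absolute error

def train_step(loss_fn, pred: float, target: float) -> float:
    # The training loop relies only on the shared __call__ interface,
    # which is why switching methods is a one-line change.
    return loss_fn(pred, target)
```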

Citation

If you use FACM or its methods in your work, please cite the following BibTeX entry:

<details open> <summary> bibtex </summary>
```bibtex
@misc{peng2025facm,
      title={Flow-Anchored Consistency Models},
      author={Yansong Peng and Kai Zhu and Yu Liu and Pingyu Wu and Hebei Li and Xiaoyan Sun and Feng Wu},
      year={2025},
      eprint={2507.03738},
      archivePrefix={arXiv},
      primaryClass={cs.CV}
}
```
</details>

Acknowledgements

The model architecture part is based on the Lightning-DiT repository.

✨ Feel free to contribute and reach out if you have any questions! ✨
