
SADA: Stability-guided Adaptive Diffusion Acceleration

Publication, Conference
Jiang, T; Wang, Y; Ye, H; Shao, Z; Sun, J; Zhang, J; Chen, Z; Chen, Y; Li, H
Published in: Proceedings of Machine Learning Research
January 1, 2025

Diffusion models have achieved remarkable success in generative tasks but suffer from high computational costs due to their iterative sampling process and quadratic attention costs. Existing training-free acceleration strategies that reduce per-step computation, while effective at reducing sampling time, exhibit low faithfulness relative to the original baseline. We hypothesize that this fidelity gap arises because (a) different prompts correspond to different denoising trajectories, and (b) such methods do not account for the underlying ODE formulation and its numerical solution. In this paper, we propose Stability-guided Adaptive Diffusion Acceleration (SADA), a novel paradigm that unifies step-wise and token-wise sparsity decisions via a single stability criterion to accelerate sampling of ODE-based generative models (diffusion and flow matching). For (a), SADA adaptively allocates sparsity based on the sampling trajectory. For (b), SADA introduces principled approximation schemes that leverage precise gradient information from the numerical ODE solver. Comprehensive evaluations on SD-2, SDXL, and Flux with both EDM and DPM++ solvers show consistent ≥ 1.8× speedups with minimal fidelity degradation (LPIPS ≤ 0.10 and FID ≤ 4.5) relative to unmodified baselines, significantly outperforming prior methods. Moreover, SADA adapts seamlessly to other pipelines and modalities: it accelerates ControlNet without any modification and speeds up MusicLDM by 1.8× with ∼ 0.01 spectrogram LPIPS. Our code is available at: https://github.com/Ting-Justin-Jiang/sadaicml.
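The abstract's core idea, skipping model evaluations when the sampling trajectory is stable, can be illustrated with a toy sketch. Everything below (the explicit Euler solver, the relative-change stability criterion, the threshold `tau`, and the one-step cache reuse) is an illustrative assumption for exposition; the paper's actual stability criterion and approximation schemes are not specified in this abstract.

```python
import numpy as np

def sampler_with_step_skip(model, x, timesteps, tau=0.05):
    """Toy Euler ODE sampler that reuses the cached model output for one
    step whenever a hypothetical relative-change criterion deems the
    trajectory stable, saving a network evaluation."""
    prev_eps = None
    change = float("inf")  # measured change between the last two evaluations
    evals = 0
    for i in range(len(timesteps) - 1):
        dt = timesteps[i + 1] - timesteps[i]
        if change < tau and prev_eps is not None:
            eps = prev_eps          # stable: reuse cached output, skip the model
            change = float("inf")   # force a fresh evaluation on the next step
        else:
            eps = model(x, timesteps[i])
            evals += 1
            if prev_eps is not None:
                change = np.linalg.norm(eps - prev_eps) / (
                    np.linalg.norm(prev_eps) + 1e-8
                )
            prev_eps = eps
        x = x + dt * eps            # explicit Euler update
    return x, evals

# Example: a constant vector field is maximally "stable", so roughly every
# other evaluation is skipped while the trajectory is integrated correctly.
x_final, n_evals = sampler_with_step_skip(
    lambda x, t: np.ones_like(x), np.zeros(4), np.linspace(0.0, 1.0, 11)
)
```

In this toy setup only per-step (step-wise) sparsity is modeled; the abstract's token-wise sparsity and its use of solver gradient information would require access to the method's full formulation.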


Published In

Proceedings of Machine Learning Research

EISSN

2640-3498

Publication Date

January 1, 2025

Volume

267

Start / End Page

27649 / 27669

Citation

Jiang, T., Wang, Y., Ye, H., Shao, Z., Sun, J., Zhang, J., … Li, H. (2025). SADA: Stability-guided Adaptive Diffusion Acceleration. In Proceedings of Machine Learning Research (Vol. 267, pp. 27649–27669).
