LazyDiT: Lazy Learning for the Acceleration of Diffusion Transformers

Authors

  • Xuan Shen, Northeastern University
  • Zhao Song, Adobe Research
  • Yufa Zhou, University of Pennsylvania
  • Bo Chen, Middle Tennessee State University
  • Yanyu Li, Northeastern University
  • Yifan Gong, Northeastern University
  • Kai Zhang, Adobe Research
  • Hao Tan, Adobe Research
  • Jason Kuen, Adobe Research
  • Henghui Ding, Fudan University
  • Zhihao Shu, University of Georgia
  • Wei Niu, University of Georgia
  • Pu Zhao, Northeastern University
  • Yanzhi Wang, Northeastern University
  • Jiuxiang Gu, Adobe Research

DOI:

https://doi.org/10.1609/aaai.v39i19.34248

Abstract

Diffusion Transformers have emerged as the preeminent models for a wide array of generative tasks, demonstrating superior performance and efficacy across various applications. These promising results come at the cost of slow inference, as each denoising step requires running the whole transformer model with a large number of parameters. In this paper, we show that performing the full computation of the model at each diffusion step is unnecessary, as some computations can be skipped by lazily reusing the results of previous steps. Furthermore, we show that the lower bound of similarity between outputs at consecutive steps is notably high, and this similarity can be linearly approximated from the inputs. To support these findings, we propose **LazyDiT**, a lazy learning framework that efficiently leverages cached results from earlier steps to skip redundant computations. Specifically, we incorporate lazy learning layers into the model, trained to maximize laziness, enabling dynamic skipping of redundant computations. Experimental results show that LazyDiT outperforms the DDIM sampler across multiple diffusion transformer models at various resolutions. Furthermore, we implement our method on mobile devices, achieving better performance than DDIM with similar latency.
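The caching idea described in the abstract can be illustrated with a minimal sketch. This is not the authors' implementation: the `LazyBlock` class, its `w_lazy` gate weights, and the skip threshold are all hypothetical stand-ins used only to show the general pattern of reusing a cached output when the current input is sufficiently similar to the previous step's input.

```python
import numpy as np

class LazyBlock:
    """Conceptual sketch of lazy skipping in a diffusion transformer block.

    A hypothetical "lazy layer" combines input similarity across consecutive
    denoising steps with a learned gate; when the resulting score exceeds a
    threshold, the cached output from the previous step is reused instead of
    recomputing the block. All weights here are random placeholders.
    """

    def __init__(self, dim, threshold=0.9, seed=0):
        rng = np.random.default_rng(seed)
        # Stand-in for a transformer block's computation.
        self.w_block = rng.standard_normal((dim, dim)) / np.sqrt(dim)
        # Hypothetical lazy-layer weights (would be trained in the real method).
        self.w_lazy = rng.standard_normal(dim) / np.sqrt(dim)
        self.threshold = threshold
        self.cache = None    # cached output from the previous step
        self.prev_x = None   # cached input from the previous step

    def skip_score(self, x):
        # Linear-style approximation of step-to-step similarity from the
        # inputs, gated by a learned sigmoid (illustrative form only).
        if self.prev_x is None:
            return 0.0
        sim = float(x @ self.prev_x) / (
            np.linalg.norm(x) * np.linalg.norm(self.prev_x) + 1e-8
        )
        gate = 1.0 / (1.0 + np.exp(-(x @ self.w_lazy)))
        return sim * gate

    def forward(self, x):
        if self.cache is not None and self.skip_score(x) > self.threshold:
            return self.cache                # lazy path: reuse cached result
        out = np.tanh(self.w_block @ x)      # full path: recompute the block
        self.cache, self.prev_x = out, x
        return out
```

With a low threshold, a repeated input takes the lazy path and returns the cached array; with a threshold above 1, the block always recomputes. The real method trains the gate so that skipping happens only where the output change across steps is negligible.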

Published

2025-04-11

How to Cite

Shen, X., Song, Z., Zhou, Y., Chen, B., Li, Y., Gong, Y., Zhang, K., Tan, H., Kuen, J., Ding, H., Shu, Z., Niu, W., Zhao, P., Wang, Y., & Gu, J. (2025). LazyDiT: Lazy Learning for the Acceleration of Diffusion Transformers. Proceedings of the AAAI Conference on Artificial Intelligence, 39(19), 20409-20417. https://doi.org/10.1609/aaai.v39i19.34248

Section

AAAI Technical Track on Machine Learning V