On the Memorization of Consistency Distillation for Diffusion Models

📰 ArXiv cs.AI

arXiv:2604.23552v1 Announce Type: cross

Abstract: Diffusion models are central to modern generative modeling, and understanding how they balance memorization and generalization is critical for reliable deployment. Recent work has shown that memorization in diffusion models is shaped by training dynamics, with generalization and memorization emerging at different stages of training. However, deployed diffusion models are often further distilled, introducing an additional training phase whose impa…

Published 28 Apr 2026