Optimal Stability of KL Divergence under Gaussian Perturbations

📰 ArXiv cs.AI

arXiv:2604.11026v1

Abstract: We study the problem of characterizing the stability of Kullback-Leibler (KL) divergence under Gaussian perturbations beyond Gaussian families. Existing relaxed triangle inequalities for KL divergence critically rely on the assumption that all involved distributions are Gaussian, which limits their applicability in modern applications such as out-of-distribution (OOD) detection with flow-based generative models. In this paper, we remove this restriction […]
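The paper's own results are not included in this snippet, but the Gaussian-only setting it generalizes rests on the closed-form KL divergence between two univariate Gaussians. A minimal sketch of that formula, purely for illustration (the function name `kl_gaussian` is not from the paper):

```python
import math

def kl_gaussian(mu1, sigma1, mu2, sigma2):
    """Closed-form KL(N(mu1, sigma1^2) || N(mu2, sigma2^2)) for univariate Gaussians."""
    return (math.log(sigma2 / sigma1)
            + (sigma1**2 + (mu1 - mu2)**2) / (2 * sigma2**2)
            - 0.5)

# KL vanishes exactly when the two Gaussians coincide ...
print(kl_gaussian(0.0, 1.0, 0.0, 1.0))  # 0.0

# ... and is asymmetric in general, which is why triangle-type
# bounds for KL must be "relaxed" rather than exact inequalities.
print(kl_gaussian(0.0, 1.0, 1.0, 2.0))
print(kl_gaussian(1.0, 2.0, 0.0, 1.0))
```

The asymmetry shown in the last two calls is what makes a plain triangle inequality fail for KL, motivating the relaxed Gaussian-only bounds the abstract refers to.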

Published 14 Apr 2026