Convergence of flow-based generative models via proximal gradient descent in Wasserstein space

Publication, Journal Article
Cheng, X; Lu, J; Tan, Y; Xie, Y
Published in: IEEE Transactions on Information Theory
January 1, 2024

Flow-based generative models enjoy certain advantages in computing the data generation and the likelihood, and have recently shown competitive empirical performance. Compared to the accumulating theoretical studies on related score-based diffusion models, analysis of flow-based models, which are deterministic in both forward (data-to-noise) and reverse (noise-to-data) directions, remains sparse. In this paper, we provide a theoretical guarantee of generating data distribution by a progressive flow model, the so-called JKO flow model, which implements the Jordan-Kinderlehrer-Otto (JKO) scheme in a normalizing flow network. Leveraging the exponential convergence of the proximal gradient descent (GD) in Wasserstein space, we prove the Kullback-Leibler (KL) guarantee of data generation by a JKO flow model to be O(ε²) when using N ≲ log(1/ε) many JKO steps (N Residual Blocks in the flow), where ε is the error in the per-step first-order condition. The assumption on data density is merely a finite second moment, and the theory extends to data distributions without density and to the case where there are inversion errors in the reverse process, where we obtain KL-W₂ mixed error guarantees. The non-asymptotic convergence rate of the JKO-type W₂-proximal GD is proved for a general class of convex objective functionals that includes the KL divergence as a special case, which can be of independent interest. The analysis framework can extend to other first-order Wasserstein optimization schemes applied to flow-based generative models.
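
For readers unfamiliar with the scheme, the JKO step referenced in the abstract is the W₂-proximal map of the objective functional. The following is a minimal sketch in standard JKO notation, where the step size γ and target density π are generic symbols for illustration, not taken from the paper's statements:

\rho_{k+1} = \operatorname*{arg\,min}_{\rho \in \mathcal{P}_2(\mathbb{R}^d)} \left\{ \mathrm{KL}(\rho \,\|\, \pi) + \frac{1}{2\gamma} W_2^2(\rho, \rho_k) \right\}

Each Residual Block of the flow network realizes one such proximal step, and the abstract's guarantee then reads: if every step satisfies its first-order optimality condition up to error ε, then N ≲ log(1/ε) blocks suffice for a KL generation error of O(ε²).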

Published In

IEEE Transactions on Information Theory

DOI

10.1109/TIT.2024.3422412

EISSN

1557-9654

ISSN

0018-9448

Publication Date

January 1, 2024

Related Subject Headings

  • Networking & Telecommunications
  • 4613 Theory of computation
  • 4006 Communications engineering
  • 1005 Communications Technologies
  • 0906 Electrical and Electronic Engineering
  • 0801 Artificial Intelligence and Image Processing
 

Citation

APA: Cheng, X., Lu, J., Tan, Y., & Xie, Y. (2024). Convergence of flow-based generative models via proximal gradient descent in Wasserstein space. IEEE Transactions on Information Theory. https://doi.org/10.1109/TIT.2024.3422412
Chicago: Cheng, X., J. Lu, Y. Tan, and Y. Xie. “Convergence of flow-based generative models via proximal gradient descent in Wasserstein space.” IEEE Transactions on Information Theory, January 1, 2024. https://doi.org/10.1109/TIT.2024.3422412.
ICMJE: Cheng X, Lu J, Tan Y, Xie Y. Convergence of flow-based generative models via proximal gradient descent in Wasserstein space. IEEE Transactions on Information Theory. 2024 Jan 1;
MLA: Cheng, X., et al. “Convergence of flow-based generative models via proximal gradient descent in Wasserstein space.” IEEE Transactions on Information Theory, Jan. 2024. Scopus, doi:10.1109/TIT.2024.3422412.
NLM: Cheng X, Lu J, Tan Y, Xie Y. Convergence of flow-based generative models via proximal gradient descent in Wasserstein space. IEEE Transactions on Information Theory. 2024 Jan 1;
