Convergence of Flow-Based Generative Models via Proximal Gradient Descent in Wasserstein Space
Flow-based generative models offer advantages in computing both the data generation and the likelihood, and have recently shown competitive empirical performance. Compared to the accumulating theoretical studies of related score-based diffusion models, analysis of flow-based models, which are deterministic in both the forward (data-to-noise) and reverse (noise-to-data) directions, remains sparse. In this paper, we provide a theoretical guarantee for generating the data distribution by a progressive flow model, the so-called JKO flow model, which implements the Jordan-Kinderlehrer-Otto (JKO) scheme in a normalizing flow network. Leveraging the exponential convergence of proximal gradient descent (GD) in Wasserstein space, we prove a Kullback-Leibler (KL) guarantee of data generation by a JKO flow model of O(ε²) when using N ≲ log(1/ε) JKO steps (N residual blocks in the flow), where ε is the error in the per-step first-order condition. The only assumption on the data density is a finite second moment, and the theory extends to data distributions without density and to the case of inversion errors in the reverse process, where we obtain mixed KL-W₂ error guarantees. The non-asymptotic convergence rate of the JKO-type W₂-proximal GD is proved for a general class of convex objective functionals that includes the KL divergence as a special case, which can be of independent interest.
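To make the mechanism concrete, the following minimal LaTeX sketch writes out one JKO step, i.e., one proximal GD step in Wasserstein-2 space. The notation is ours, not verbatim from the paper: the step size h, the objective functional F, the iterate ρ_k, and the target π are assumed labels.

% Minimal sketch, assuming generic JKO notation (h, F, \rho_k, \pi are our labels).
\documentclass{article}
\usepackage{amsmath,amssymb}
\begin{document}
One step of the Jordan--Kinderlehrer--Otto (JKO) scheme, i.e., proximal
gradient descent in Wasserstein-2 space:
\[
  \rho_{k+1} \;=\; \operatorname*{arg\,min}_{\rho}\;
  F(\rho) + \frac{1}{2h}\, W_2^2(\rho, \rho_k),
  \qquad k = 0, 1, \dots, N-1,
\]
where, in the forward (data-to-noise) direction, one may take
$F(\rho) = \mathrm{KL}(\rho \,\|\, \pi)$ with $\pi$ the standard normal.
Exponential convergence of these iterates is what yields the
$O(\varepsilon^2)$ KL guarantee from $N \lesssim \log(1/\varepsilon)$
steps, with $\varepsilon$ the per-step first-order error.
\end{document}

In the JKO flow model, each such proximal step is parametrized by one residual block of the normalizing flow, so N steps correspond to the N blocks mentioned in the abstract.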