Monotone Generative Modeling via a Gromov-Monge Embedding
Generative adversarial networks are popular for generative tasks; however, they often require careful architecture selection and extensive empirical tuning, and they are prone to mode collapse. To overcome these challenges, we propose a novel model that identifies the low-dimensional structure of the underlying data distribution, maps it into a low-dimensional latent space while preserving the underlying geometry, and then optimally transports a reference measure to the embedded distribution. We prove three key properties of our method: (1) the encoder preserves the geometry of the underlying data; (2) the generator is c-cyclically monotone, where c is an intrinsic embedding cost employed by the encoder; and (3) the discriminator's modulus of continuity improves with the geometric preservation of the data. Numerical experiments demonstrate that our approach generates high-quality images and is robust to both mode collapse and training instability.
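To make the pipeline in the abstract concrete, the following is a minimal sketch, not the authors' implementation, of the two ingredients it describes: an encoder trained with a Gromov-Monge-style distortion loss that matches pairwise distances between data space and latent space, and a generator trained to transport a Gaussian reference toward the embedded codes under a quadratic cost. All architecture choices, dimensions, and helper names (gm_distortion, Encoder, Generator) are assumptions made for illustration.

```python
import torch
import torch.nn as nn
from scipy.optimize import linear_sum_assignment


def gm_distortion(x, z):
    # Gromov-Monge-style quadratic distortion: compare the pairwise distance
    # matrices of a data batch and its latent embedding; small values mean
    # the encoder preserves the geometry of the data.
    dx = torch.cdist(x, x, p=2)
    dz = torch.cdist(z, z, p=2)
    return ((dx - dz) ** 2).mean()


class Encoder(nn.Module):
    def __init__(self, d_in, d_lat):
        super().__init__()
        self.net = nn.Sequential(nn.Linear(d_in, 256), nn.ReLU(),
                                 nn.Linear(256, d_lat))

    def forward(self, x):
        return self.net(x)


class Generator(nn.Module):
    def __init__(self, d_ref, d_lat):
        super().__init__()
        self.net = nn.Sequential(nn.Linear(d_ref, 256), nn.ReLU(),
                                 nn.Linear(256, d_lat))

    def forward(self, u):
        return self.net(u)


# Toy training step on random stand-in data (assumed dimensions).
d_in, d_lat = 784, 16
enc, gen = Encoder(d_in, d_lat), Generator(d_lat, d_lat)
opt = torch.optim.Adam(list(enc.parameters()) + list(gen.parameters()), lr=1e-3)

x = torch.randn(64, d_in)     # stand-in for a data batch
u = torch.randn(64, d_lat)    # reference (Gaussian) samples

z = enc(x)
g = gen(u)

# Discrete Monge transport between the pushed-forward reference samples and
# the embedded data: solve an optimal assignment under squared cost, then
# minimize the matched squared distances.
with torch.no_grad():
    cost = (torch.cdist(g, z) ** 2).cpu().numpy()
row, col = linear_sum_assignment(cost)
transport = ((g[row] - z[col]) ** 2).sum(dim=1).mean()

loss = gm_distortion(x, z) + transport
opt.zero_grad()
loss.backward()
opt.step()
```

In this sketch the distortion term plays the role of the intrinsic embedding cost, and the assignment-based transport term is a simple empirical surrogate for transporting the reference measure to the embedded distribution; the paper's actual objective and architecture may differ.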