Workshop on New Directions in Optimization, Statistics and Machine Learning

Generative Modeling by Estimating Gradients of the Data Distribution

Existing generative models are typically based on explicit representations of probability distributions (e.g., autoregressive models or VAEs) or implicit sampling procedures (e.g., GANs). We propose an alternative approach based on directly modeling the vector field of gradients of the data distribution (scores). Our framework allows flexible energy-based model architectures and requires neither sampling during training nor adversarial training methods. Using annealed Langevin dynamics, our method produces samples comparable to those of GANs on the MNIST, CelebA, and CIFAR-10 datasets, achieving a new state-of-the-art inception score of 8.91 on CIFAR-10. Finally, I will discuss challenges in evaluating bias and generalization in generative models.
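
As a rough sketch of the sampling procedure described above, the snippet below implements annealed Langevin dynamics in PyTorch: starting from noise, it runs Langevin updates while annealing through a decreasing sequence of noise scales. The score-network interface score_net(x, sigma), the noise schedule, and the hyperparameters eps and T are illustrative assumptions rather than the talk's exact setup; a closed-form Gaussian score stands in for a trained network so the example runs end to end.

import math
import torch

@torch.no_grad()
def annealed_langevin_sample(score_net, shape, sigmas, eps=2e-5, T=100):
    # Start from noise and refine it through decreasing noise scales.
    x = torch.rand(shape)
    for sigma in sigmas:  # sigmas sorted from largest to smallest
        # Scale the step size so the signal-to-noise ratio of each
        # Langevin step stays roughly constant across noise levels.
        alpha = eps * (sigma / sigmas[-1]) ** 2
        for _ in range(T):
            z = torch.randn_like(x)
            x = x + 0.5 * alpha * score_net(x, sigma) + math.sqrt(alpha) * z
    return x

# Toy check (an assumption, for illustration only): if the data are N(0, I),
# the noise-perturbed score has the closed form -x / (1 + sigma^2), so the
# sampler should return approximately standard Gaussian samples.
gaussian_score = lambda x, sigma: -x / (1.0 + sigma ** 2)
sigmas = [0.01 ** (i / 9) for i in range(10)]  # geometric schedule, 1.0 down to 0.01
samples = annealed_langevin_sample(gaussian_score, (1000, 2), sigmas)
print(samples.std())  # should be close to 1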

Date & Time

April 15, 2020 | 2:00pm – 3:00pm

Location

Virtual

Speakers

Stefano Ermon

Affiliation

Stanford University
