Previous Conferences & Workshops

Apr 16, 2020

Workshop on New Directions in Optimization, Statistics and Machine Learning

Tradeoffs between Robustness and Accuracy
Percy Liang
3:15pm|Virtual

Standard machine learning produces models that are highly accurate on average but that degrade dramatically when the test distribution deviates from the training distribution. While one can train robust models, this often comes at the expense of...
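
One common way to "train robust models," as the abstract mentions, is adversarial training. The sketch below is an illustrative FGSM-style training loss in PyTorch, not the speaker's method; the function name and the eps value are assumptions for illustration.

```python
import torch
import torch.nn.functional as F

def fgsm_adversarial_loss(model, x, y, eps=0.1):
    """Perturb each input in the direction that most increases the loss
    (FGSM), then compute the training loss on the perturbed batch.
    Training this way typically trades some clean accuracy for robustness."""
    x = x.clone().detach().requires_grad_(True)
    loss = F.cross_entropy(model(x), y)
    grad, = torch.autograd.grad(loss, x)
    x_adv = (x + eps * grad.sign()).detach()
    return F.cross_entropy(model(x_adv), y)
```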

Apr 16, 2020

Joint IAS/Princeton University Number Theory Seminar

Local-global compatibility in the crystalline case
3:00pm|https://theias.zoom.us/j/959183254

Let F be a CM field. Scholze constructed Galois representations associated to classes in the cohomology of locally symmetric spaces for GL_n/F with p-torsion coefficients. These Galois representations are expected to satisfy local-global...

Apr 16, 2020

Workshop on New Directions in Optimization, Statistics and Machine Learning

Modularity, Attention and Credit Assignment: Efficient information dispatching in neural computations
Anirudh Goyal
2:00pm|Virtual

Physical processes in the world often have a modular structure, with complexity emerging through combinations of simpler subsystems. Machine learning seeks to uncover and use regularities in the physical world. Although these regularities manifest...
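
As a loose illustration of the attention-based dispatching named in the title (not the speaker's actual architecture), the toy PyTorch module below routes each input across a set of small independent modules via learned attention weights; all names and sizes are assumptions.

```python
import torch
import torch.nn as nn

class ModularRouter(nn.Module):
    """Toy attention-based dispatching: a learned score decides how much
    each independent module contributes to the output."""

    def __init__(self, dim, n_modules=4):
        super().__init__()
        self.scorer = nn.Linear(dim, n_modules)
        self.experts = nn.ModuleList([nn.Linear(dim, dim) for _ in range(n_modules)])

    def forward(self, x):
        weights = self.scorer(x).softmax(dim=-1)                     # (batch, n_modules)
        outputs = torch.stack([m(x) for m in self.experts], dim=-1)  # (batch, dim, n_modules)
        # Each input dispatches information to modules in proportion to
        # its attention weights over the module set.
        return (outputs * weights.unsqueeze(1)).sum(dim=-1)
```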

Apr 16, 2020

Workshop on New Directions in Optimization, Statistics and Machine Learning

The Peculiar Optimization and Regularization Challenges in Multi-Task Learning and Meta-Learning
Chelsea Finn
12:30pm|Virtual

Despite the success of deep learning, much of that success has come in settings where the goal is to learn one single-purpose function from data. However, in many contexts, we hope to optimize neural networks for multiple, distinct tasks (i.e...
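
To make the setting concrete, here is a minimal multi-task setup in PyTorch, assumed for illustration rather than taken from the talk: a shared trunk with one head per task and a naively weighted sum of per-task losses, which is exactly where the optimization difficulties the talk addresses tend to appear.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class MultiTaskNet(nn.Module):
    """Shared trunk with one output head per task."""

    def __init__(self, in_dim, hidden, n_tasks):
        super().__init__()
        self.trunk = nn.Sequential(nn.Linear(in_dim, hidden), nn.ReLU())
        self.heads = nn.ModuleList([nn.Linear(hidden, 1) for _ in range(n_tasks)])

    def forward(self, x):
        h = self.trunk(x)
        return [head(h) for head in self.heads]

def multitask_loss(preds, targets, weights):
    # Sum of weighted per-task losses; picking the weights and handling
    # conflicting task gradients is a large part of the difficulty.
    return sum(w * F.mse_loss(p, t) for w, p, t in zip(weights, preds, targets))
```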

Apr 16, 2020

Workshop on New Directions in Optimization, Statistics and Machine Learning

Deep equilibrium models via monotone operators
Zico Kolter
11:15am|Virtual

In this talk, I will first introduce our recent work on the Deep Equilibrium Model (DEQ). Instead of stacking nonlinear layers, as is common in deep learning, this approach finds the equilibrium point of the repeated iteration of a single nonlinear...
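
A minimal sketch of the fixed-point idea behind a DEQ-style layer, assuming a simple tanh cell; the talk's monotone-operator parameterization and solvers are not shown, and all names and sizes here are illustrative.

```python
import torch
import torch.nn as nn

class FixedPointLayer(nn.Module):
    """Toy DEQ-style layer: find z* with z* = tanh(W z* + U x + b)
    by repeatedly iterating a single nonlinear layer."""

    def __init__(self, dim, tol=1e-4, max_iter=50):
        super().__init__()
        self.linear_z = nn.Linear(dim, dim, bias=False)
        self.linear_x = nn.Linear(dim, dim)
        self.tol, self.max_iter = tol, max_iter

    def forward(self, x):
        z = torch.zeros_like(x)
        # Plain fixed-point iteration; the monotone-operator view replaces
        # this with operator-splitting methods that guarantee convergence.
        for _ in range(self.max_iter):
            z_next = torch.tanh(self.linear_z(z) + self.linear_x(x))
            if (z_next - z).norm() < self.tol:
                break
            z = z_next
        return z
```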

Apr 16, 2020

Workshop on New Directions in Optimization, Statistics and Machine Learning

Do Simpler Models Exist and How Can We Find Them?
Cynthia Rudin
10:00am|Virtual

While the trend in machine learning has been towards more complex hypothesis spaces, it is not clear that this extra complexity is necessary or helpful in many domains. In particular, models and their predictions are often made easier to...

Apr 15, 2020

Mathematical Conversations

Vignettes about pure mathematics and machine learning
Jordan Ellenberg
5:30pm|Remote Access Only

Through interactions with engineers and computer scientists over the years, including some current visitors at IAS, I have become pretty sold on the idea that machine learning is rich in questions which are interesting to pure mathematicians and...

Apr 15, 2020

Workshop on New Directions in Optimization, Statistics and Machine Learning

Interpretability for everyone
Been Kim
3:15pm|Virtual

In this talk, I would like to share some of my reflections on the progress made in the field of interpretable machine learning. We will consider where we are going as a field and what we need to be aware of in order to make progress...

Apr 15, 2020

Workshop on New Directions in Optimization, Statistics and Machine Learning

Generative Modeling by Estimating Gradients of the Data Distribution
Stefano Ermon
2:00pm|Virtual

Existing generative models are typically based on explicit representations of probability distributions (e.g., autoregressive models or VAEs) or on implicit sampling procedures (e.g., GANs). We propose an alternative approach based on directly modeling the...
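
The title's idea of estimating gradients of the data distribution can be illustrated from the sampling side: given a learned score function s(x) ≈ ∇_x log p(x), approximate samples follow Langevin dynamics. The sketch below assumes such a score_fn is already trained (e.g., by score matching); names and step sizes are illustrative.

```python
import torch

def langevin_sample(score_fn, shape, n_steps=100, step_size=1e-3):
    """Follow a learned score (gradient of the log data density) with
    injected Gaussian noise to draw an approximate sample."""
    x = torch.randn(shape)  # start from pure noise
    for _ in range(n_steps):
        noise = torch.randn_like(x)
        # Langevin update: x <- x + (eps/2) * score(x) + sqrt(eps) * noise
        x = x + 0.5 * step_size * score_fn(x) + (step_size ** 0.5) * noise
    return x
```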