Previous Conferences & Workshops

Apr 21, 2020

Theoretical Machine Learning Seminar

Assumption-free prediction intervals for black-box regression algorithms
Aaditya Ramdas
12:00pm|https://theias.zoom.us/j/384099138

There has been tremendous progress in designing accurate black-box prediction methods (boosting, random forests, bagging, neural nets, etc.), but for deployment in the real world it is useful to quantify uncertainty beyond making point predictions...
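(For background: below is a minimal, illustrative sketch of split conformal prediction, one standard way to attach distribution-free prediction intervals to an arbitrary black-box regressor. It is not necessarily the construction presented in this talk; the choice of RandomForestRegressor, the 50/50 calibration split, and alpha = 0.1 are assumptions made purely for the example.)

    import numpy as np
    from sklearn.ensemble import RandomForestRegressor
    from sklearn.model_selection import train_test_split

    def split_conformal_intervals(X_train, y_train, X_test, alpha=0.1, seed=0):
        # Split the labeled data into a proper training set and a calibration set.
        X_fit, X_cal, y_fit, y_cal = train_test_split(
            X_train, y_train, test_size=0.5, random_state=seed)
        # Fit any black-box regressor on the proper training set.
        model = RandomForestRegressor(random_state=seed).fit(X_fit, y_fit)
        # Nonconformity scores: absolute residuals on the calibration set.
        scores = np.sort(np.abs(y_cal - model.predict(X_cal)))
        # Finite-sample-corrected (1 - alpha) quantile of the scores.
        n = len(scores)
        k = min(int(np.ceil((n + 1) * (1 - alpha))), n)
        q = scores[k - 1]
        preds = model.predict(X_test)
        # Intervals with marginal coverage >= 1 - alpha under exchangeability.
        return preds - q, preds + q

    # Example usage on synthetic data (assumed, for illustration only).
    rng = np.random.default_rng(0)
    X = rng.normal(size=(500, 3))
    y = X[:, 0] ** 2 + rng.normal(scale=0.3, size=500)
    lo, hi = split_conformal_intervals(X[:400], y[:400], X[400:], alpha=0.1)
    print("empirical coverage:", np.mean((y[400:] >= lo) & (y[400:] <= hi)))

The point of the sketch is that nothing is assumed about the regressor or the data distribution beyond exchangeability of the calibration and test points.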

Apr 21, 2020

Computer Science/Discrete Mathematics Seminar II

Non-commutative optimization: theory, algorithms and applications (or, can we prove P!=NP using gradient descent?)
10:30am|https://theias.zoom.us/j/360043913

This talk aims to summarize a project I was involved in during the past 5 years, with the hope of explaining our most complete understanding so far, as well as challenges and open problems. The main messages of this project are summarized below; I...

Apr 20, 2020

Computer Science/Discrete Mathematics Seminar I

Structure vs Randomness in Complexity Theory
Rahul Santhanam
11:00am|https://theias.zoom.us/j/360043913

The dichotomy between structure and randomness plays an important role in areas such as combinatorics and number theory. I will discuss a similar dichotomy in complexity theory, and illustrate it with three examples of my own work: (i) An...

Apr 20, 2020

Analysis Seminar

A variational approach to the regularity theory for the Monge-Ampère equation
Felix Otto
11:00am|https://theias.zoom.us/j/562592856

We present a purely variational approach to the regularity theory for the Monge-Ampère equation, or rather optimal transportation, introduced with M. Goldman. Following De Giorgi’s philosophy for the regularity theory of minimal surfaces, it is...

Apr 17, 2020

Joint IAS/Princeton/Montreal/Paris/Tel-Aviv Symplectic Geometry Zoominar

Equivariant quantum operations and relations between them
Nicholas Wilkins
9:15am|https://princeton.zoom.us/j/745635914

There is growing interest in looking at operations on quantum cohomology that take into account symmetries in the holomorphic spheres (such as the quantum Steenrod powers, using a Z/p-symmetry). In order to prove relations between them, one needs to...

Apr 16, 2020

Workshop on New Directions in Optimization, Statistics and Machine Learning

Steps towards more human-like learning in machines
Josh Tenenbaum
4:30pm|Virtual

There are several broad insights we can draw from computational models of human cognition in order to build more human-like forms of machine learning. (1) The brain has a great deal of built-in structure, yet still tremendous need and potential for...

Apr 16, 2020

Workshop on New Directions in Optimization, Statistics and Machine Learning

Tradeoffs between Robustness and Accuracy
Percy Liang
3:15pm|Virtual

Standard machine learning produces models that are highly accurate on average but that degrade dramatically when the test distribution deviates from the training distribution. While one can train robust models, this often comes at the expense of...

Apr 16, 2020

Joint IAS/Princeton University Number Theory Seminar

Local-global compatibility in the crystalline case
3:00pm|https://theias.zoom.us/j/959183254

Let F be a CM field. Scholze constructed Galois representations associated to classes in the cohomology of locally symmetric spaces for GL_n/F with p-torsion coefficients. These Galois representations are expected to satisfy local-global...

Apr 16, 2020

Workshop on New Directions in Optimization, Statistics and Machine Learning

Modularity, Attention and Credit Assignment: Efficient information dispatching in neural computations
Anirudh Goyal
2:00pm|Virtual

Physical processes in the world often have a modular structure, with complexity emerging through combinations of simpler subsystems. Machine learning seeks to uncover and use regularities in the physical world. Although these regularities manifest...

Apr 16, 2020

Workshop on New Directions in Optimization, Statistics and Machine Learning

The Peculiar Optimization and Regularization Challenges in Multi-Task Learning and Meta-Learning
Chelsea Finn
12:30pm|Virtual

Despite the success of deep learning, much of this success has come in settings where the goal is to learn one single-purpose function from data. However, in many contexts, we hope to optimize neural networks for multiple, distinct tasks (i.e...