Previous Special Year Seminar

Apr
07
2020

Theoretical Machine Learning Seminar

Interpolation in learning: steps towards understanding when overparameterization is harmless, when it helps, and when it causes harm
Anant Sahai
12:00pm|https://theias.zoom.us/j/384099138

A continuing mystery in understanding the empirical success of deep neural networks has been their ability to achieve zero training error and yet generalize well, even when the training data is noisy and there are many more parameters than data...
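Not from the talk, but a minimal numerical sketch of the phenomenon the abstract points at: a heavily overparameterized random-feature model (all dimensions and the noise level below are illustrative choices) can fit noisy labels exactly and still track the underlying signal on fresh inputs.

```python
# Minimal sketch (illustrative, not the speaker's experiment): minimum-norm
# interpolation of noisy labels with an overparameterized random-feature model.
import numpy as np

rng = np.random.default_rng(0)
n, d, p = 50, 5, 2000                       # n samples, input dim d, p >> n random features

w_star = rng.normal(size=d)                 # ground-truth linear signal
X = rng.normal(size=(n, d))
y = X @ w_star + 0.1 * rng.normal(size=n)   # noisy training labels

W = rng.normal(size=(d, p)) / np.sqrt(d)    # fixed random ReLU features
Phi = np.maximum(X @ W, 0.0)

# The pseudoinverse picks the minimum-norm solution that fits the noisy
# labels exactly -- the interpolator most analyses of this regime study.
theta = np.linalg.pinv(Phi) @ y
print("train MSE:", np.mean((Phi @ theta - y) ** 2))        # essentially zero

X_test = rng.normal(size=(2000, d))
Phi_test = np.maximum(X_test @ W, 0.0)
print("test MSE :", np.mean((Phi_test @ theta - X_test @ w_star) ** 2))
```

How small the test error comes out depends on the dimensions chosen; the qualitative point is that exactly fitting noisy labels need not be catastrophic for generalization.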

Apr
02
2020

Theoretical Machine Learning Seminar

Learning Controllable Representations
12:00pm|https://theias.zoom.us/j/384099138

As deep learning systems become more prevalent in real-world applications, it is essential to allow users to exert more control over the system. Imposing some structure on the learned representations enables users to manipulate, interpret, and even...

Mar
31
2020

Theoretical Machine Learning Seminar

Some Recent Insights on Transfer Learning
12:00pm|https://theias.zoom.us/j/384099138

A common situation in Machine Learning is one where training data is not fully representative of a target population, due to bias in the sampling mechanism or the high cost of sampling the target population; in such situations, we aim to 'transfer'...
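As a concrete illustration of this setup (a standard technique, not the speaker's results): under covariate shift, importance weighting reweights the biased source sample toward the target population before fitting.

```python
# Minimal sketch of importance weighting under covariate shift. The source and
# target densities are assumed to be known Gaussians here; in practice the
# density ratio would itself have to be estimated.
import numpy as np

rng = np.random.default_rng(0)

def gauss_pdf(x, mu, sigma=1.0):
    return np.exp(-0.5 * ((x - mu) / sigma) ** 2) / (sigma * np.sqrt(2 * np.pi))

# Biased source sample centered at -1; the target population is N(+1, 1).
x_src = rng.normal(loc=-1.0, scale=1.0, size=5000)
y_src = x_src ** 2 + 0.1 * rng.normal(size=x_src.size)

# Importance weights w(x) = p_target(x) / p_source(x).
w = gauss_pdf(x_src, 1.0) / gauss_pdf(x_src, -1.0)

# Fit y ~ a*x + b by least squares, unweighted vs. target-weighted.
A = np.stack([x_src, np.ones_like(x_src)], axis=1)
unweighted, *_ = np.linalg.lstsq(A, y_src, rcond=None)
weighted, *_ = np.linalg.lstsq(A * np.sqrt(w)[:, None], y_src * np.sqrt(w), rcond=None)
print("unweighted fit (slope, intercept):     ", unweighted)   # slope near -2
print("target-weighted fit (slope, intercept):", weighted)     # slope near +2
```

The unweighted fit describes the source region around x = -1, while the weighted fit recovers the behavior relevant to the target population around x = +1.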

Mar
26
2020

Theoretical Machine Learning Seminar

Margins, perceptrons, and deep networks
Matus Telgarsky
12:00pm|https://illinois.zoom.us/j/741628827

This talk surveys the role of margins in the analysis of deep networks. As a concrete highlight, it sketches a perceptron-based analysis establishing that shallow ReLU networks can achieve small test error even when they are quite narrow, sometimes...
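For reference, the classical quantities a margin-based, perceptron-style analysis rests on (our notation, not the talk's):

```latex
% Background definitions; this is standard material, not the talk's statement.
For a sample $(x_1, y_1), \dots, (x_n, y_n)$ with $y_i \in \{-1, +1\}$, the margin is
\[
  \gamma \;=\; \max_{\|u\| = 1} \, \min_{1 \le i \le n} \, y_i \langle u, x_i \rangle .
\]
Novikoff's theorem: if $\gamma > 0$ and $\|x_i\| \le R$ for all $i$, then the perceptron,
which updates $w \leftarrow w + y_i x_i$ on every mistake, makes at most $(R/\gamma)^2$
mistakes, independently of $n$ and of the dimension.
```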

Mar
11
2020

Theoretical Machine Learning Seminar

Improved Bounds on Minimax Regret under Logarithmic Loss via Self-Concordance
Blair Bilodeau
4:00pm|Simonyi 101

We study sequential probabilistic prediction on data sequences which are not i.i.d., and even potentially generated by an adversary. At each round, the player assigns a probability distribution to possible outcomes and incurs the log-likelihood of...
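One standard way to formalize this setting (our notation, not necessarily the paper's): $p_t$ is the player's predicted distribution at round $t$, $x_t$ the observed outcome, and $\mathcal{F}$ a reference class of forecasters.

```latex
% Standard definitions of regret and minimax regret under logarithmic loss.
With logarithmic loss $\ell(p, x) = -\log p(x)$, the regret over $T$ rounds is
\[
  \mathrm{Reg}_T(\mathcal{F})
  \;=\; \sum_{t=1}^{T} -\log p_t(x_t)
  \;-\; \inf_{f \in \mathcal{F}} \sum_{t=1}^{T} -\log f_t(x_t),
\]
and the minimax regret is the best worst-case value over (possibly adversarial) sequences,
\[
  \mathcal{R}_T(\mathcal{F})
  \;=\; \inf_{(p_t)} \, \sup_{x_1, \dots, x_T} \, \mathrm{Reg}_T(\mathcal{F}).
\]
```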

Mar
10
2020

Theoretical Machine Learning Seminar

Your Brain on Energy-Based Models: Applying and Scaling EBMs to Problems of Interest to the Machine Learning Community Today
Will Grathwohl
12:00pm|Dilworth Room

In this talk, I will discuss my two recent works on Energy-Based Models. In the first work, I discuss how we can reinterpret standard classification architectures as class conditional energy-based models and train them using recently proposed...
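A toy rendering of that reinterpretation, in the spirit of the abstract rather than the speaker's actual code: treating classifier logits as negative energies yields both the usual softmax posterior p(y|x) and an unnormalized density over inputs.

```python
# Minimal sketch: a classifier's K logits f(x) define energies E(x, y) = -f(x)[y].
# Up to normalization, p(x | y) is proportional to exp(f(x)[y]), and summing over
# classes gives an unnormalized score for p(x); the softmax posterior p(y | x)
# is unchanged. The linear "classifier" below is a stand-in for a real network.
import numpy as np

rng = np.random.default_rng(0)
W, b = rng.normal(size=(2, 3)), rng.normal(size=3)   # toy model: 2-dim inputs, K=3 classes

def logits(x):
    return x @ W + b

def class_posterior(x):
    # Softmax over logits: the ordinary classifier output.
    z = logits(x)
    z = z - z.max()
    return np.exp(z) / np.exp(z).sum()

def unnormalized_log_px(x):
    # logsumexp over logits: the density over inputs the classifier implicitly defines.
    z = logits(x)
    m = z.max()
    return m + np.log(np.exp(z - m).sum())

x = rng.normal(size=2)
print("p(y|x):", class_posterior(x))
print("unnormalized log p(x):", unnormalized_log_px(x))
```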

Mar
05
2020

Theoretical Machine Learning Seminar

Understanding Deep Neural Networks: From Generalization to Interpretability
Gitta Kutyniok
12:00pm|Dilworth Room

Deep neural networks have recently seen an impressive comeback with applications both in the public sector and the sciences. However, despite their outstanding success, a comprehensive theoretical foundation of deep neural networks is still missing...

Mar
03
2020

Theoretical Machine Learning Seminar

What Noisy Convex Quadratics Tell Us about Neural Net Training
12:00pm|White-Levy

I’ll discuss the Noisy Quadratic Model (NQM), the toy problem of minimizing a convex quadratic function with noisy gradient observations. While the NQM is simple enough to have closed-form dynamics for a variety of optimizers, it gives a surprising amount...
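A minimal simulation of this toy problem (the eigenvalues, noise scale, and learning rate below are illustrative assumptions, not values from the talk): SGD on a diagonal noisy quadratic, checked against the closed-form recursion for the second moments.

```python
# Minimal NQM sketch: minimize 0.5 * sum_i h_i * w_i^2 from gradients
# h_i * w_i + noise, and compare simulated SGD to the closed-form dynamics.
import numpy as np

rng = np.random.default_rng(0)
h = np.logspace(-2, 0, 20)      # curvature eigenvalues (diagonal Hessian)
c = 0.01 * h                    # per-coordinate gradient-noise variance
lr, steps, runs = 0.5, 200, 500

# Simulated SGD, averaged over independent runs.
w = np.ones((runs, h.size))
for _ in range(steps):
    g = w * h + rng.normal(size=w.shape) * np.sqrt(c)
    w -= lr * g
sim_risk = 0.5 * np.mean(np.sum(h * w ** 2, axis=1))

# Closed-form recursion for the second moment m_i = E[w_i^2]:
# m_{t+1} = (1 - lr * h_i)^2 * m_t + lr^2 * c_i.
m = np.ones_like(h)
for _ in range(steps):
    m = (1 - lr * h) ** 2 * m + lr ** 2 * c
print("simulated risk  :", sim_risk)
print("closed-form risk:", 0.5 * np.sum(h * m))
```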