Seminars Sorted by Series

Theoretical Machine Learning Seminar

Oct
02
2017

Theoretical Machine Learning Seminar

Hyperparameter optimization: a spectral approach
Elad Hazan
12:30pm

Modern machine learning algorithms often involve complicated models with tunable parameters, called hyperparameters, that must be set prior to training. Hyperparameter tuning/optimization is considered an art. (See e.g. http://www.argmin.net/2016/06/20...

Oct
16
2017

Theoretical Machine Learning Seminar

Keeping IT cool: machine learning for data center cooling
Nevena Lazic
12:30pm

I will describe recent efforts in using machine learning to control cooling in Google data centers, as well as related research in linear quadratic control.

Nov
06
2017

Theoretical Machine Learning Seminar

Naturalizing a programming language
12:30pm

Our goal is to create a convenient natural language interface for performing well-specified but complex actions such as analyzing data, manipulating text, and querying databases. However, existing natural language interfaces for such tasks are quite...

Nov
13
2017

Theoretical Machine Learning Seminar

Towards a better understanding of neural networks: learning dynamics, interpretability and RL generalization
Maithra Raghu
12:30pm|White-Levy Room

With the continuing successes of deep learning, it becomes increasingly important to better understand the phenomena exhibited by these models, ideally through a combination of systematic experiments and theory. In this talk I discuss some of our...

Nov
27
2017

Theoretical Machine Learning Seminar

Beyond log-concavity: provable guarantees for sampling multi-modal distributions using simulated tempering Langevin Monte Carlo
Holden Lee
12:30pm|White-Levy Room

A fundamental problem in Bayesian statistics is sampling from distributions that are only specified up to a partition function (constant of proportionality). In particular, we consider the problem of sampling from a distribution given access to the...
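As background for the title, the basic (unadjusted) Langevin Monte Carlo iteration updates x ← x − η∇U(x) + √(2η)·ξ with ξ ~ N(0, 1), where U is the negative log-density known up to the partition function. A minimal 1-D sketch (step size, iteration count, and target are illustrative, not from the talk):

```python
import math
import random

def langevin_samples(grad_U, x0, step, n_steps, rng):
    """Unadjusted Langevin dynamics: x <- x - step*grad_U(x) + sqrt(2*step)*xi."""
    x, xs = x0, []
    for _ in range(n_steps):
        x = x - step * grad_U(x) + math.sqrt(2 * step) * rng.gauss(0.0, 1.0)
        xs.append(x)
    return xs

# Illustrative target: standard normal, U(x) = x^2/2, so grad_U(x) = x.
rng = random.Random(0)
samples = langevin_samples(grad_U=lambda x: x, x0=3.0, step=0.1,
                           n_steps=5000, rng=rng)
mean = sum(samples) / len(samples)
```

Note this plain iteration is exactly the regime where multi-modal targets cause trouble (chains get stuck in one mode), which is the motivation for combining it with simulated tempering.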

Dec
11
2017

Theoretical Machine Learning Seminar

Learning with little data
12:30pm|White-Levy Room

The current successes of deep neural networks have largely come on classification problems, based on datasets containing hundreds of examples from each category. Humans can easily learn new words or classes of visual objects from very few examples...

Jan
25
2018

Theoretical Machine Learning Seminar

Prediction and Control of Linear Dynamical Systems
Cyril Zhang
12:15pm|White-Levy

Linear dynamical systems (LDSs) are a class of time-series models widely used in robotics, finance, engineering, and meteorology. I will present our "spectral filtering" approach to the identification and control of discrete-time LDSs with multi...
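For background, a discrete-time LDS evolves as x_{t+1} = A x_t + B u_t (plus noise) with observations y_t = C x_t + D u_t. A minimal noiseless simulation sketch, with illustrative matrices not taken from the talk:

```python
# Noiseless discrete-time LDS:  x_{t+1} = A x_t + B u_t,  y_t = C x_t + D u_t

def matvec(M, v):
    """Multiply matrix M (list of rows) by vector v."""
    return [sum(m * x for m, x in zip(row, v)) for row in M]

def vadd(a, b):
    return [x + y for x, y in zip(a, b)]

def simulate_lds(A, B, C, D, x0, inputs):
    """Roll out the system, returning the observations y_1..y_T."""
    x, ys = x0, []
    for u in inputs:
        x = vadd(matvec(A, x), matvec(B, u))          # state update
        ys.append(vadd(matvec(C, x), matvec(D, u)))   # observation
    return ys

# Example: a stable two-state system with a constant scalar input.
A = [[0.9, 0.1], [0.0, 0.8]]
B = [[1.0], [0.5]]
C = [[1.0, 0.0]]
D = [[0.0]]
ys = simulate_lds(A, B, C, D, x0=[0.0, 0.0], inputs=[[1.0]] * 10)
```

System identification, as discussed in the talk, is the inverse problem: recovering a predictor of the y_t sequence from input/output data without knowing A, B, C, D.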

Feb
01
2018

Theoretical Machine Learning Seminar

Two approaches to (Deep) Learning with Differential Privacy
Kunal Talwar
12:15pm|White-Levy Room

Machine learning techniques based on neural networks are achieving remarkable results in a wide variety of domains. Often, the training of models requires large, representative datasets, which may be crowd-sourced and contain sensitive information...

Feb
22
2018

Theoretical Machine Learning Seminar

On the Optimization of Deep Networks: Implicit Acceleration by Overparameterization
12:15pm|White-Levy Room

Conventional wisdom in deep learning states that increasing depth improves expressiveness but complicates optimization. In this talk I will argue that, sometimes, increasing depth can speed up optimization.

The effect of depth on optimization is...

Mar
01
2018

Theoretical Machine Learning Seminar

Small-loss bounds for online learning with partial information
Thodoris Lykouris
12:15pm|White-Levy Room

We consider the problem of adversarial (non-stochastic) online learning with partial information feedback, where at each round, a decision maker selects an action from a finite set of alternatives. We develop a black-box approach for such problems...

Apr
05
2018

Theoretical Machine Learning Seminar

A Compressed Sensing View of Unsupervised Text Embeddings, Bag-of-n-Grams, and LSTMs
Mikhail Khodak
12:15pm|White-Levy Room

Three fundamental factors determine the quality of a statistical learning algorithm: expressiveness, optimization and generalization. The classic strategy for handling these factors is relatively well understood. In contrast, the radically different...

Apr
12
2018

Theoretical Machine Learning Seminar

Stability and Generalization in Adaptive Data Analysis
Vitaly Feldman
12:15pm|White-Levy Room

Datasets are often used multiple times with each successive analysis depending on the outcomes of previous analyses on the same dataset. Standard techniques for ensuring generalization and statistical validity do not account for this adaptive...

Apr
19
2018

Theoretical Machine Learning Seminar

Online Improper Learning with an Approximation Oracle
Zhiyuan Li
12:15pm|White-Levy Room

We revisit the question of reducing online learning to approximate optimization of the offline problem. In this setting, we give two algorithms with near-optimal performance in the full information setting: they guarantee optimal regret and require...

Oct
01
2018

Theoretical Machine Learning Seminar

Structured Learning with Parsimony in Measurements and Computations: Theory, Algorithms, and Applications
Xingguo Li
12:15pm|White-Levy Room

In modern “Big Data” applications, structured learning is the most widely employed methodology. Within this paradigm, the fundamental challenge lies in developing practical, effective algorithmic inference methods. Often (e.g., deep learning)...

Oct
15
2018

Theoretical Machine Learning Seminar

On the Dynamics of Gradient Descent for Training Deep Neural Networks
Wei Hu
12:15pm|White-Levy Room

Deep learning builds upon the mysterious ability of gradient-based methods to solve related non-convex optimization problems. However, a complete theoretical understanding is missing even in the simpler setting of training a deep linear neural...

Oct
22
2018

Theoretical Machine Learning Seminar

Learning in Non-convex Games with an Optimization Oracle
Alon Gonen
12:15pm|Princeton University, CS 302

We consider adversarial online learning in a non-convex setting under the assumption that the learner has access to an offline optimization oracle. In the most general unstructured setting of prediction with expert advice, Hazan and Koren (2016)...

Nov
05
2018

Theoretical Machine Learning Seminar

Scalable natural gradient training of neural networks
12:15pm|Princeton University, CS 302

Natural gradient descent holds the potential to speed up training of neural networks by correcting for the problem geometry and achieving desirable invariance properties. I’ll present Kronecker-Factored Approximate Curvature (K-FAC), a scalable...

Nov
12
2018

Theoretical Machine Learning Seminar

Generalized Framework for Nonlinear Acceleration
Damien Scieur
12:15pm|White-Levy Room

Nonlinear acceleration algorithms, such as BFGS, are widely used in optimization due to their impressive performance even on large-scale problems. However, these methods present a non-negligible number of drawbacks, such as a strong lack of...

Nov
19
2018

Theoretical Machine Learning Seminar

Prediction with a Short Memory
Sham Kakade
12:15pm|White-Levy Room

We consider the problem of predicting the next observation given a sequence of past observations, and consider the extent to which accurate prediction requires complex algorithms that explicitly leverage long-range dependencies. Perhaps surprisingly...

Nov
26
2018

Theoretical Machine Learning Seminar

A La Carte Embedding: Cheap but Effective Induction of Semantic Feature Vectors
Nikunj Saunshi
12:15pm|Princeton University, CS 302

Motivations like domain adaptation, transfer learning, and feature learning have fueled interest in inducing embeddings for rare or unseen words, n-grams, synsets, and other textual features. This paper introduces a la carte embedding, a simple and...

Dec
10
2018

Theoretical Machine Learning Seminar

On Expressiveness and Optimization in Deep Learning
12:15pm|White-Levy Room

Understanding deep learning calls for addressing three fundamental questions: expressiveness, optimization and generalization. Expressiveness refers to the ability of compactly sized deep neural networks to represent functions capable of solving...

Feb
11
2019

Theoretical Machine Learning Seminar

Online Control with Adversarial Disturbances
Naman Agarwal
12:15pm|White-Levy Room

We study the control of a linear dynamical system with adversarial disturbances (as opposed to statistical noise). The objective we consider is one of regret: we desire an online control procedure that can perform nearly as well as a procedure...

Feb
13
2019

Theoretical Machine Learning Seminar

Rahul Kidambi
12:15pm|Princeton University, CS 302

The current era of large scale machine learning powered by Deep Learning methods has brought about tremendous advances, driven by the lightweight Stochastic Gradient Descent (SGD) method. Despite relying on a simple algorithmic primitive, this era...

Feb
18
2019

Theoretical Machine Learning Seminar

Curiosity, Intrinsic Motivation, and Provably Efficient Maximum Entropy Exploration
Karan Singh
12:15pm|Princeton University, CS 302

Suppose an agent is in an unknown Markov environment in the absence of a reward signal. What might we hope that the agent can efficiently learn to do? One natural, intrinsically defined objective is for the agent to learn a policy which...

Mar
04
2019

Theoretical Machine Learning Seminar

FFJORD: Free-form Continuous Dynamics for Scalable Reversible Generative Models
Will Grathwohl
12:15pm|Princeton University, CS 302

A promising class of generative models maps points from a simple distribution to a complex distribution through an invertible neural network. Likelihood-based training of these models requires restricting their architectures to allow cheap...

Mar
06
2019

Theoretical Machine Learning Seminar

Exponentiated Gradient Meets Gradient Descent
Will Grathwohl
1:30pm|Princeton University, CS 302

(Stochastic) gradient descent and the multiplicative update method are probably the two most popular algorithms in machine learning. We introduce and study a new regularization which provides a unification of the additive and multiplicative updates...
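For reference, on the probability simplex the two updates being contrasted are additive (gradient descent) versus multiplicative (exponentiated gradient). A minimal sketch of both rules, with an illustrative learning rate and loss vectors not taken from the talk:

```python
import math

def exponentiated_gradient_step(w, grad, eta):
    """Multiplicative update: w_i <- w_i * exp(-eta * grad_i), then
    renormalize so the iterate stays on the probability simplex."""
    w_new = [wi * math.exp(-eta * gi) for wi, gi in zip(w, grad)]
    z = sum(w_new)
    return [wi / z for wi in w_new]

def gradient_descent_step(w, grad, eta):
    """Additive update (shown unprojected): w_i <- w_i - eta * grad_i."""
    return [wi - eta * gi for wi, gi in zip(w, grad)]

# Example: three experts, with per-round losses acting as gradients.
w = [1 / 3] * 3
for losses in [[1.0, 0.0, 0.5], [0.8, 0.1, 0.4]]:
    w = exponentiated_gradient_step(w, losses, eta=0.5)
# Weight shifts toward the expert with the smallest cumulative loss.
```

The multiplicative rule keeps iterates strictly positive and normalized by construction, whereas the additive rule needs an explicit projection back onto the simplex; unifying the two regularizations is the subject of the talk.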

Mar
11
2019

Theoretical Machine Learning Seminar

A Theoretical Analysis of Contrastive Unsupervised Representation Learning
Orestis Plevrakis
1:15pm|White Levy Room

Recent empirical works have successfully used unlabeled data to learn feature representations that are broadly useful in downstream classification tasks. Several of these methods are reminiscent of the well-known word2vec embedding algorithm...

Apr
08
2019

Theoretical Machine Learning Seminar

Fast minimization of structured convex quartics
Brian Bullins
12:15pm|White-Levy Room

Recent progress in optimization theory has shown how we may harness second-order, i.e., Hessian, information to achieve faster rates for both convex and non-convex optimization problems. Given this observation, it is natural to ask what sort...

Oct
02
2019

Theoretical Machine Learning Seminar

Rethinking Control
Elad Hazan
12:00pm|Dilworth Room

Linear dynamical systems are a continuous subclass of reinforcement learning models that are widely used in robotics, finance, engineering, and meteorology. Classical control, since the works of Kalman, has focused on dynamics with Gaussian i.i.d...

Oct
08
2019

Theoretical Machine Learning Seminar

Unsupervised Ensemble Learning
12:00pm|White-Levy

In various applications, one is given the advice or predictions of several classifiers of unknown reliability, over multiple questions or queries. This scenario is different from standard supervised learning where classifier accuracy can be assessed...

Oct
09
2019

Theoretical Machine Learning Seminar

Designing Fast and Robust Learning Algorithms
12:00pm|Dilworth Room

Most people interact with machine learning systems on a daily basis. Such interactions often happen in strategic environments where people have incentives to manipulate the learning algorithms. As machine learning plays a more prominent role in our...

Oct
23
2019

Theoretical Machine Learning Seminar

Optimization Landscape and Two-Layer Neural Networks
12:00pm|Dilworth Room

Modern machine learning often optimizes a nonconvex objective using simple algorithms such as gradient descent. One way of explaining the success of such simple algorithms is to analyze the optimization landscape and show that all local minima are...