2019-2020 Seminars

Feb 11, 2020

Theoretical Machine Learning Seminar

Geometric Insights into the Convergence of Non-linear TD Learning
12:00pm|Dilworth Room

While there are convergence guarantees for temporal difference (TD) learning when using linear function approximators, the situation for nonlinear models is far less understood, and divergent examples are known. We take a first step towards...
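The setting the abstract refers to can be made concrete with a minimal sketch of semi-gradient TD(0) under a nonlinear value approximator; the random-walk environment, the tiny tanh network, and all hyperparameters below are illustrative assumptions, not taken from the talk.

```python
# Minimal sketch of semi-gradient TD(0) with a nonlinear value approximator.
# Environment, network, and hyperparameters are illustrative assumptions;
# the talk concerns when updates of this form provably converge.
import numpy as np

rng = np.random.default_rng(0)
n_states, gamma, alpha = 5, 0.9, 0.05

# One-hot state features and a two-layer value network v(s) = w2 . tanh(W1 phi(s)).
phi = np.eye(n_states)
W1 = rng.normal(scale=0.1, size=(8, n_states))
w2 = rng.normal(scale=0.1, size=8)

def value(s):
    return w2 @ np.tanh(W1 @ phi[s])

s = 0
for t in range(10_000):
    s_next = (s + rng.choice([-1, 1])) % n_states   # random-walk transition
    r = 1.0 if s_next == n_states - 1 else 0.0      # reward on reaching the last state
    delta = r + gamma * value(s_next) - value(s)    # TD error

    # Semi-gradient: differentiate v(s) only, not the bootstrapped target.
    h = np.tanh(W1 @ phi[s])
    grad_w2 = h
    grad_W1 = np.outer(w2 * (1.0 - h**2), phi[s])
    w2 += alpha * delta * grad_w2
    W1 += alpha * delta * grad_W1
    s = s_next
```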

Feb 11, 2020

Computer Science/Discrete Mathematics Seminar II

Proofs, Circuits, Communication, and Lower Bounds in Complexity Theory
Robert Robere
10:30am|Simonyi Hall 101

Many of the central problems in computational complexity revolve around proving lower bounds on the amount of resources used in various computational models. In this talk we will continue our survey of the connections between three central models...

Feb 10, 2020

Computer Science/Discrete Mathematics Seminar I

Paths and cycles in expanders
11:00am|Simonyi Hall 101

Expanders have grown to be one of the most central and well-studied notions in modern graph theory. It is thus only natural to study the extremal properties of expanding graphs. In this talk we will adopt the following (rather relaxed) definition of...

Feb 06, 2020

Special Computer Science/Discrete Mathematics Seminar

Explicit rigid matrices in P^NP via rectangular PCPs
Prahladh Harsha
2:00pm|Simonyi Hall 101

An n×n matrix M over GF(2) is said to be (r, δ)-rigid if every matrix M' within Hamming distance δn² of M has rank at least r. A long-standing open problem is to construct explicit rigid matrices. In a recent remarkable result, Alman and...
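Spelled out in symbols, the rigidity condition in the abstract reads as follows (a restatement for reference, not the talk's exact parameterization):

```latex
% Matrix rigidity over GF(2), as defined in the abstract:
% M is (r, \delta)-rigid if no matrix within Hamming distance \delta n^2
% of M has rank below r.
\[
  M \in \mathbb{F}_2^{n \times n} \text{ is } (r,\delta)\text{-rigid}
  \quad\Longleftrightarrow\quad
  \forall M' \in \mathbb{F}_2^{n \times n}:\;
  \bigl|\{(i,j) : M_{ij} \neq M'_{ij}\}\bigr| \le \delta n^2
  \;\Longrightarrow\; \operatorname{rank}(M') \ge r.
\]
```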

Feb 06, 2020

Theoretical Machine Learning Seminar - PCTS Seminar Series: Deep Learning for Physics

Understanding Machine Learning via Exactly Solvable Statistical Physics Models
Lenka Zdeborova
11:45am|Jadwin Hall PCTS Seminar Room 407 (Princeton University)

Please Note: The seminars are not open to the general public, but only to active researchers.

Register here for this event: https://docs.google.com/forms/d/e/1FAIpQLScJ-BUVgJod6NGrreI26pedg8wGEyPhh3WMDskE1hIac_Yp3Q/viewform

The affinity between...

Feb 04, 2020

Theoretical Machine Learning Seminar

Algorithm and Hardness for Kernel Matrices in Numerical Linear Algebra and Machine Learning
12:00pm|Dilworth Room

For a function K : R^d x R^d -> R and a set of points P = {x_1, ..., x_n} in R^d, the K graph G_P of P is the complete graph on n nodes in which the weight of the edge between nodes i and j is K(x_i, x_j). In this paper, we initiate the study of when...
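As a concrete illustration of the object in the abstract, the sketch below builds the weight matrix of the K graph G_P for a Gaussian kernel; the kernel choice, the dimensions, and the random point set are assumptions made only for illustration.

```python
# Build the K graph G_P: a complete graph on n points whose edge (i, j) has
# weight K(x_i, x_j). The Gaussian kernel and the random point set are
# illustrative assumptions; the talk studies such matrices for general K.
import numpy as np

def k_graph_weights(P, K):
    """Return the n x n matrix W with W[i, j] = K(P[i], P[j])."""
    n = len(P)
    W = np.zeros((n, n))
    for i in range(n):
        for j in range(n):
            W[i, j] = K(P[i], P[j])
    return W

gaussian = lambda x, y: np.exp(-np.sum((x - y) ** 2))  # K(x, y) = exp(-||x - y||^2)

rng = np.random.default_rng(0)
P = rng.normal(size=(100, 3))        # n = 100 points in d = 3 dimensions
W = k_graph_weights(P, gaussian)     # dense kernel matrix: n^2 entries
```

Even for this small instance the full weight matrix has n² entries; the algorithmic questions in this line of work concern when computations on such kernel matrices can be done faster than quadratic time.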

Feb 04, 2020

Computer Science/Discrete Mathematics Seminar II

Proofs, Circuits, Communication, and Lower Bounds in Complexity Theory
10:30am|Simonyi Hall 101

Many of the central problems in computational complexity revolve around proving lower bounds on the amount of resources used in various computational models. In this series of talks, we will discuss three standard objects in computational complexity...

Feb 03, 2020

Computer Science/Discrete Mathematics Seminar I

MIP* = RE
Henry Yuen
11:00am|Simonyi Hall 101

MIP* (pronounced “M-I-P star”) denotes the class of problems that admit interactive proofs with quantum entangled provers. It has been an outstanding question to characterize the complexity of MIP*. Most notably, there was no known computable upper...

Jan 28, 2020

Theoretical Machine Learning Seminar

What Noisy Convex Quadratics Tell Us about Neural Net Training
12:00pm|Dilworth Room

I’ll discuss the Noisy Quadratic Model, the toy problem of minimizing a convex quadratic function with noisy gradient observations. While the NQM is simple enough to have closed-form dynamics for a variety of optimizers, it gives a surprising amount...
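As a rough picture of the model described above, the snippet below runs plain SGD on a diagonal convex quadratic with additive gradient noise; the dimension, curvature spectrum, noise scale, and step size are all illustrative assumptions rather than values from the talk.

```python
# Noisy quadratic model, minimal version: minimize f(x) = 0.5 * sum_i h_i * x_i^2
# from noisy gradient observations g = H x + noise. Curvatures, noise scale,
# and step size are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(0)
d = 100
h = np.logspace(0, -3, d)            # spread of per-dimension curvatures h_i
x = np.ones(d)                       # initial iterate
lr, sigma, steps = 0.5, 0.1, 10_000

for t in range(steps):
    grad = h * x + sigma * rng.normal(size=d)   # noisy gradient observation
    x -= lr * grad                              # plain SGD step

loss = 0.5 * np.sum(h * x**2)
# With a fixed step size the loss settles toward a noise floor rather than
# zero -- one of the qualitative behaviors the NQM makes explicit.
print(f"final loss: {loss:.4f}")
```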