BEGIN:VCALENDAR
VERSION:2.0
PRODID:-//18a41c55e5f61d2a42b10cd7e21d5df0.ias.edu//NONSGML kigkonsult.se i
Calcreator 2.29.14//
CALSCALE:GREGORIAN
UID:d3a745f7-a8a4-400f-904e-50d465ffe54d
X-WR-TIMEZONE:UTC
X-WR-CALNAME:IAS Mathematics
BEGIN:VEVENT
UID:b359c4d6-f44d-48b8-9cee-0088a0740ef0
DTSTAMP:20200401T030202Z
CREATED:20200107T195151Z
DESCRIPTION:Topic: Weak solutions to the Navier--Stokes inequality with arb
itrary energy profiles\n\nSpeaker: Wojciech Ożański \, University of South
ern California\; Member\, School of Mathematics\n\nIn the talk we will foc
us on certain constructions of weak solutions to the Navier--Stokes inequa
lity (NSI)\, \[ u \cdot \left( u_t - \nu \Delta u + (u\cdot \nabla ) u + \nab
la p \right) \leq 0\] on $\mathbb R^3$. Such vector fields satisfy both th
e strong energy inequality and the local energy inequality (but not necess
arily solve the Navier--Stokes equations). Given $T>0$ and a nonincreasing
energy profile $e : [0\,T] \to [0\,\infty )$ we will construct a weak sol
ution to the NSI that is localised in space and whose energy profile $\| u
(t)\|_{L^2 (\mathbb R^3 )}$ stays arbitrarily close to $e(t)$ for all $t\i
n [0\,T]$.\nThe relevance of such solutions is that\, despite not satisfyi
ng the Navier--Stokes equations\, they do satisfy the partial regularity t
heory of Caffarelli\, Kohn & Nirenberg (Comm. Pure Appl. Math.\, 1982). In
fact\, Scheffer's constructions of weak solutions to the Navier--Stokes i
nequality (Comm. Math. Phys.\, 1985 & 1987) admit a finite-time blow-up on
a Cantor set\, which shows that the Caffarelli\, Kohn & Nirenberg theory
is sharp for such solutions. We will discuss the main ideas of his constru
ctions\, as well as present a stronger result: a construction of weak solu
tions to the NSI which follow a prescribed energy profile at the times lea
ding to the blow-up on the Cantor set.
DTSTART:20200113T220000Z
DTEND:20200113T230000Z
LAST-MODIFIED:20200110T153010Z
LOCATION:Simonyi Hall 101
SUMMARY:Analysis Seminar
URL:https://www.ias.edu/node/107631
END:VEVENT
BEGIN:VEVENT
UID:96ea417d-f84f-440e-890d-e6e3749b2013
DTSTAMP:20200401T030202Z
CREATED:20191203T181501Z
DESCRIPTION:Topic: Compositional inductive biases in human function learnin
g\n\nSpeaker: Samuel J. Gershman\, Harvard University\n\nVideo: https://vi
deo.ias.edu/MachineLearning/2020/0114-SamuelJGershman\n\nThis talk present
s evidence that humans learn complex functions by harnessing compositional
ity: complex structure is decomposed into simpler building blocks. I forma
lize this idea in the framework of Bayesian nonparametric regression using
a grammar over Gaussian process kernels\, and compare this approach with
other structure learning approaches. People consistently chose composition
al (over non-compositional) extrapolations and interpolations of functions
. Experiments designed to elicit priors over functional patterns revealed
an inductive bias for compositional structure. Compositional functions are
perceived as subjectively more predictable than noncompositional function
s\, and exhibited other signatures of predictability\, such as enhanced me
morability. Taken together\, these results support the view that the human
intuitive theory of functions is inherently compositional.
DTSTART:20200114T170000Z
DTEND:20200114T183000Z
LAST-MODIFIED:20200226T220133Z
LOCATION:Dilworth Room
SUMMARY:IAS-PNI Seminar on ML and Neuroscience
URL:https://www.ias.edu/node/106891
END:VEVENT
BEGIN:VEVENT
UID:3e603493-70b6-4205-87a3-0ddbdd93482e
DTSTAMP:20200401T030202Z
CREATED:20200110T162845Z
DESCRIPTION:Topic: Hypocoercivity\n\nSpeaker: George Deligiannidis\, Univer
sity of Oxford\n\nI will talk about an approach to proving exponential mix
ing for some kinetic\, non-diffusive stochastic processes that have rece
ntly become popular in the computational statistics community.
DTSTART:20200115T230000Z
DTEND:20200116T003000Z
LAST-MODIFIED:20200110T162950Z
LOCATION:Dilworth Room
SUMMARY:Mathematical Conversations
URL:https://www.ias.edu/node/107716
END:VEVENT
BEGIN:VEVENT
UID:39a17f1a-5813-4353-a677-fe434bce9422
DTSTAMP:20200401T030202Z
CREATED:20200115T151835Z
DESCRIPTION:
DTSTART:20200116T150000Z
DTEND:20200116T170000Z
LAST-MODIFIED:20200115T151835Z
LOCATION:Simonyi Hall 101
SUMMARY:Working Seminar on Nonabelian Hodge Theory
URL:https://www.ias.edu/node/107876
END:VEVENT
BEGIN:VEVENT
UID:7e63e77a-e2de-4b93-b4f9-cc776fc238b6
DTSTAMP:20200401T030202Z
CREATED:20200107T145412Z
DESCRIPTION:Topic: Foundations of Intelligent Systems with (Deep) Function
Approximators\n\nSpeaker: Simon Du\, Member\, School of Mathematics\n\nFun
ction approximators\, like deep neural networks\, play a crucial role in b
uilding machine-learning based intelligent systems. This talk covers three
core problems of function approximators: understanding function approxima
tors\, designing new function approximators\, and applying function approx
imators. In the first part of the talk\, I will talk about understanding de
ep neural networks. I will show that the over-parameterized neural network
is equivalent to a new type of kernel\, Neural Tangent Kernel. Using this
equivalence\, we show that gradient descent provably finds the global opt
imum when optimizing an over-parameterized neural network\, and the learne
d neural network also generalizes well. In the second part of the talk\, I
will focus on designing new function approximators with strong empirical
performance. We transform (fully-connected\, graph\, convolutional) neural
networks to (fully-connected\, graph\, convolutional) Neural Tangent Kern
els which achieve superior performance on standard benchmarks. In the last
part of the talk\, I will focus on applying function approximation in the
reinforcement learning setting. I will give an exponential sample complex
ity lower bound of using function approximation to model the optimal polic
y. Then I will discuss which additional structures admit statistically
efficient algorithms.
DTSTART:20200116T170000Z
DTEND:20200116T183000Z
LAST-MODIFIED:20200109T164907Z
LOCATION:Dilworth Room
SUMMARY:Seminar on Theoretical Machine Learning
URL:https://www.ias.edu/node/107466
END:VEVENT
BEGIN:VEVENT
UID:32f6dcf2-bd60-4a88-b770-1dd0fc60c6ca
DTSTAMP:20200401T030202Z
CREATED:20200103T193001Z
DESCRIPTION:Topic: Inverse problems for quantum graphs\n\nSpeaker: Pavel Ku
rasov\, Stockholm University\n\nTo solve the inverse spectral problem for
the Schrödinger equation on a metric graph one needs to determine:\n• the
metric graph\;\n• the potential in the Schrödinger equation\;\n• the verte
x conditions (connecting the edges together).\nThe inverse problem is solv
ed completely in the case of trees under mild restrictions on the vertex c
onditions. The main tool is a combination of the boundary control and M-fu
nction approaches to inverse problems. These two approaches are essentiall
y equivalent in the case of a single interval\, but their different features
may be effectively exploited to solve different partial inverse problems
for trees. The bunch cutting procedure allows one to reduce the tree step-
by-step by removing edges and vertices close to the boundary.\nTo solve th
e inverse problem for graphs with cycles we propose to use magnetic bounda
ry control and magnetic M-functions where spectral data for a fixed potent
ial are considered as functions of the magnetic fluxes through graph cycle
s. To solve the inverse problem we use cycle opening procedure mapping spe
ctral data for arbitrary graphs with cycles to spectral data for trees on
the same edge set. The graph and potential are reconstructed assuming\, so f
ar\, standard vertex conditions.
DTSTART:20200117T203000Z
DTEND:20200117T213000Z
LAST-MODIFIED:20200226T220233Z
LOCATION:Simonyi Hall 101
SUMMARY:Analysis - Mathematical Physics
URL:https://www.ias.edu/node/107406
END:VEVENT
BEGIN:VEVENT
UID:c96fbd12-148f-497d-ab1f-fe09461aa4de
DTSTAMP:20200401T030202Z
CREATED:20191122T190002Z
DESCRIPTION:Topic: Approximating CSPs on expanding structures\, and applica
tions to codes\n\nSpeaker: Madhur Tulsiani\, Toyota Technological Institut
e at Chicago\n\nVideo: https://video.ias.edu/csdm/2020/0121-MadhurTulsiani
\n\nI will discuss some recent results showing that the sum-of-squares SDP
hierarchy can be used to find approximately optimal solutions to k-CSPs\,
provided that the instance satisfies certain expansion properties. These
properties can be shown to follow from (k-1)-dimensional spectral expansio
n\, but are in fact weaker and also present (for example) in instances whe
re each constraint corresponds to a length-k walk in an expanding graph. I
will also discuss applications to unique and list decoding of direct sum
and direct product codes.\n\nBased on joint works with Vedat Levi Alev\, F
ernando Granha Jeronimo\, Dylan Quintana and Shashank Shrivastava.
DTSTART:20200121T153000Z
DTEND:20200121T173000Z
LAST-MODIFIED:20200226T220428Z
LOCATION:Simonyi Hall 101
SUMMARY:Computer Science/Discrete Mathematics Seminar II
URL:https://www.ias.edu/node/106571
END:VEVENT
BEGIN:VEVENT
UID:8830dab2-9a89-4f78-8402-f6c1c881b226
DTSTAMP:20200401T030202Z
CREATED:20191203T181501Z
DESCRIPTION:Topic: The Blessings of Multiple Causes\n\nSpeaker: David M. Bl
ei\, Columbia University\n\nVideo: https://video.ias.edu/machinelearning/2
020/01/21-DavidBlei\n\nCausal inference from observational data is a vital
problem\, but it\ncomes with strong assumptions. Most methods require tha
t we observe\nall confounders\, variables that affect both the causal vari
ables and\nthe outcome variables. But whether we have observed all confoun
ders is\na famously untestable assumption. We describe the deconfounder\,
a way\nto do causal inference with weaker assumptions than the classical\n
methods require.\n\nHow does the deconfounder work? While traditional caus
al methods\nmeasure the effect of a single cause on an outcome\, many mode
rn\nscientific studies involve multiple causes\, different variables whose
\neffects are simultaneously of interest. The deconfounder uses the\ncorre
lation among multiple causes as evidence for unobserved\nconfounders\, com
bining unsupervised machine learning and predictive\nmodel checking to per
form causal inference. We demonstrate the\ndeconfounder on real-world data
and simulation studies\, and describe\nthe theoretical requirements for t
he deconfounder to provide unbiased\ncausal estimates.\n\nThis is joint wo
rk with Yixin Wang.
DTSTART:20200121T170000Z
DTEND:20200121T183000Z
LAST-MODIFIED:20200226T220331Z
LOCATION:Dilworth Room
SUMMARY:Seminar on Theoretical Machine Learning
URL:https://www.ias.edu/node/106896
END:VEVENT
BEGIN:VEVENT
UID:1e2485de-777c-45ca-b588-ad5f021411d2
DTSTAMP:20200401T030202Z
CREATED:20200108T185801Z
DESCRIPTION:Topic: What is a Motive?\n\nSpeaker: Pierre Deligne\, Professor
Emeritus\, School of Mathematics\n\nVideo: https://video.ias.edu/mathconv
ersations/2020/0122-PierreDeligne\n\nI will explain the source of Grothend
ieck's philosophy of motives\, and describe some applications.
DTSTART:20200122T230000Z
DTEND:20200123T003000Z
LAST-MODIFIED:20200226T220518Z
LOCATION:Dilworth Room
SUMMARY:Mathematical Conversations
URL:https://www.ias.edu/node/107666
END:VEVENT
BEGIN:VEVENT
UID:91bf0d52-d071-4f56-adbf-1bce71c9a462
DTSTAMP:20200401T030202Z
CREATED:20190607T211502Z
DESCRIPTION:
DTSTART:20200123T150000Z
DTEND:20200123T170000Z
LAST-MODIFIED:20191007T174501Z
LOCATION:Simonyi Hall 101
SUMMARY:Working Seminar on Nonabelian Hodge Theory
URL:https://www.ias.edu/node/101076
END:VEVENT
BEGIN:VEVENT
UID:79e28e87-46e4-4651-9765-fa5fb448624e
DTSTAMP:20200401T030202Z
CREATED:20191025T210001Z
DESCRIPTION:Topic: In Defense of Uniform Convergence: Generalization via De
randomization\n\nSpeaker: Daniel M. Roy\, University of Toronto\; Member\,
School of Mathematics\n\nVideo: https://video.ias.edu/machinelearning/202
0/0123-DanielRoy\n\n
DTSTART:20200123T170000Z
DTEND:20200123T183000Z
LAST-MODIFIED:20200226T220739Z
LOCATION:Dilworth Room
SUMMARY:Seminar on Theoretical Machine Learning
URL:https://www.ias.edu/node/105961
END:VEVENT
BEGIN:VEVENT
UID:fbd0ceea-2d7f-4bf6-a3e3-16a52cc1782b
DTSTAMP:20200401T030202Z
CREATED:20190607T212006Z
DESCRIPTION:Topic: Motivic Euler products in motivic statistics\n\nSpeaker:
Margaret Bilu\, New York University\n\nThe Grothendieck group of varietie
s over a field k is the quotient of the free abelian group of isomorphism
classes of varieties over k by the so-called cut-and-paste relations. It m
oreover has a ring structure coming from the product of varieties. Many pr
oblems in number theory have a natural\, more geometric counterpart involv
ing elements of this ring. Thus\, Poonen's Bertini theorem over finite fie
lds has a motivic analog due to Vakil and Wood\, which expresses the motiv
ic density of smooth hypersurface sections as the degree goes to infinity
in terms of a special value of Kapranov's zeta function. I will report on
joint work with Sean Howe\, providing a broad generalization of Vakil and
Wood's result\, which implies in particular a motivic analog of Poonen's B
ertini theorem with Taylor conditions\, as well as motivic analogs of many
generalizations and variants of Poonen's theorem. A key ingredient for th
is is a notion of motivic Euler product which allows us to write down cand
idate motivic probabilities.
DTSTART:20200123T213000Z
DTEND:20200123T223000Z
LAST-MODIFIED:20200117T145545Z
LOCATION:Princeton University\, Fine Hall 214
SUMMARY:Joint IAS/Princeton University Number Theory Seminar
URL:https://www.ias.edu/node/101311
END:VEVENT
BEGIN:VEVENT
UID:be49d724-77f1-4477-a0c3-8139dff5ea6f
DTSTAMP:20200401T030202Z
CREATED:20190607T202005Z
DESCRIPTION:Topic: Equality Alone Does not Simulate Randomness\n\nSpeaker:
Marc Vinyals\, Technion\n\nVideo: https://video.ias.edu/csdm/2020/0127-Mar
cVinyals\n\nRandomness can provide an exponential saving in the amount of
communication needed to solve a distributed problem\, and the canonical ex
ample of this is the equality function. However\, in many examples where r
andomness helps\, having an efficient way to do hashing would be enough to
solve the problem efficiently. Is hashing all there is to randomness?\n\n
In this talk we show that hashing is not enough. More precisely\, we exhib
it a function that can be solved efficiently using randomized protocols bu
t not if we only allow access to an oracle that computes the equality func
tion\, which models hashing.\n\nJoint work with Arkadev Chattopadhyay and
Shachar Lovett.
DTSTART:20200127T160000Z
DTEND:20200127T170000Z
LAST-MODIFIED:20200226T221027Z
LOCATION:Simonyi Hall 101
SUMMARY:Computer Science/Discrete Mathematics Seminar I
URL:https://www.ias.edu/node/100631
END:VEVENT
BEGIN:VEVENT
UID:0fcfe83c-21e1-4ffa-afd8-5ba2e74679b6
DTSTAMP:20200401T030202Z
CREATED:20170413T190001Z
DESCRIPTION:Topic: Knotted 3-balls in the 4-sphere\n\nSpeaker: David Gabai\
, Princeton University\n\nVideo: https://video.ias.edu/members/2020/0127-D
avidGabai\n\nWe give the first examples of codimension-1 knotting in the 4
-sphere\, i.e. there is a 3-ball B1 with boundary the standard linear 2-sp
here\, which is not isotopic rel boundary to the standard linear 3-ball B0
. Actually\, there is an infinite family of distinct isotopy classes of su
ch balls. This implies that there exist inequivalent fiberings of the unkn
ot in the 4-sphere\, in contrast to the situation in dimension 3. Also\, th
ere exist diffeomorphisms of S1 x B3 homotopic rel boundary to the identity
\, but not isotopic rel boundary to the identity.\n\nJoint work with Ryan B
udney.
DTSTART:20200127T190000Z
DTEND:20200127T200000Z
LAST-MODIFIED:20200226T220933Z
LOCATION:Simonyi Hall 101
SUMMARY:Members' Seminar
URL:https://www.ias.edu/node/73406
END:VEVENT
BEGIN:VEVENT
UID:c9ba42b3-a951-4cd3-b67e-32415390b889
DTSTAMP:20200401T030202Z
CREATED:20190607T210002Z
DESCRIPTION:Topic: Symplectic embeddings\, integrable systems and billiards
\n\nSpeaker: Vinicius Ramos\, Instituto Nacional de Matemática Pura e Apli
cada\n\nVideo: https://video.ias.edu/sympdyngeo/2020/0127-ViniciusRamos\n
\nSymplectic embedding problems are at the core of symplectic topology. Ma
ny results have been found involving balls\, ellipsoids and polydisks. Mor
e recently\, there has been progress on problems involving Lagrangian prod
ucts and related domains. In this talk\, I explain what is known about sym
plectic embeddings of these domains. There are rigid and flexible phenomen
a and for some problems\, the transition between the two happens at a surpr
ising place. In order to get to the results\, we will use the Arnold-Liouv
ille theorem and billiard dynamics.
DTSTART:20200127T203000Z
DTEND:20200127T213000Z
LAST-MODIFIED:20200226T220856Z
LOCATION:Simonyi Hall 101
SUMMARY:Symplectic Dynamics/Geometry Seminar
URL:https://www.ias.edu/node/100866
END:VEVENT
BEGIN:VEVENT
UID:5c6b960e-6e52-4abc-b309-5802c50badae
DTSTAMP:20200401T030202Z
CREATED:20190607T203002Z
DESCRIPTION:Topic: Pseudo-deterministic algorithms\n\nSpeaker: Toniann Pita
ssi\, University of Toronto\; Visiting Professor\, School of Mathematics\n
\nA pseudodeterministic algorithm for a search problem (introduced by Gold
wasser and Gat) is a randomized algorithm that must output the *same* corr
ect answer with high probability over all choices of randomness. In this t
alk I will give several motivating examples of search problems that illust
rate the importance of the notion\, and connections with other well-studie
d topics in TCS (i.e.\, explicit constructions of combinatorial objects kn
own to exist by the probabilistic method\, lower bounds for random kSAT\,
inapproximability lower bounds for SOS-based algorithms\, and random Resol
ution). After reviewing some known results\, we will prove a pseudodetermi
nistic query lower bound for the (complete) problem of finding a '1' in a v
ector that is promised to have at least a constant fraction of 1's. Our pr
oof uses Huang's Sensitivity Theorem as well as Nullstellensatz/SOS degree
lower bounds and brings up a number of open problems.\nThis work is joint
with Shafi Goldwasser\, Russell Impagliazzo and Rahul Santhanam.
DTSTART:20200128T153000Z
DTEND:20200128T173000Z
LAST-MODIFIED:20200127T143507Z
LOCATION:Simonyi Hall 101
SUMMARY:Computer Science/Discrete Mathematics Seminar II
URL:https://www.ias.edu/node/100731
END:VEVENT
BEGIN:VEVENT
UID:96390a75-744e-4253-b20c-4597d896a40e
DTSTAMP:20200401T030202Z
CREATED:20191203T181501Z
DESCRIPTION:Topic: What Noisy Convex Quadratics Tell Us about Neural Net Tr
aining\n\nSpeaker: Roger Grosse\, University of Toronto\; Member\, School
of Mathematics\n\nI’ll discuss the Noisy Quadratic Model\, the toy problem
of minimizing a convex quadratic function with noisy gradient observation
s. While the NQM is simple enough to have closed-form dynamics for a varie
ty of optimizers\, it gives a surprising amount of insight into neural net
training phenomena. First\, we’ll look at the problem of adapting learnin
g rates using meta-descent (i.e. differentiating through the training dyna
mics). The NQM illuminates why short-horizon meta-descent objectives vastl
y underestimate the optimal learning rate. Second\, we’ll study how the be
havior of various neural net optimizers depends on the batch size. When is
it beneficial to use preconditioning? Momentum? Parameter averaging? Lear
ning rate schedules? The NQM can generate predictions in seconds which see
m to capture the qualitative behavior of large-scale classification conv n
ets and transformers.
DTSTART:20200128T170000Z
DTEND:20200128T183000Z
LAST-MODIFIED:20200123T155625Z
LOCATION:Dilworth Room
SUMMARY:Seminar on Theoretical Machine Learning
URL:https://www.ias.edu/node/106906
END:VEVENT
BEGIN:VEVENT
UID:2b11411f-e789-4b9f-8f47-892b91db72c5
DTSTAMP:20200401T030202Z
CREATED:20190607T211502Z
DESCRIPTION:
DTSTART:20200130T150000Z
DTEND:20200130T170000Z
LAST-MODIFIED:20191007T174501Z
LOCATION:Simonyi Hall 101
SUMMARY:Working Seminar on Nonabelian Hodge Theory
URL:https://www.ias.edu/node/101081
END:VEVENT
BEGIN:VEVENT
UID:55ebe469-af7c-4ac3-a5e0-de9a63dd32d2
DTSTAMP:20200401T030202Z
CREATED:20190607T212006Z
DESCRIPTION:Topic: Eisenstein series and the cubic moment for PGL(2)\n\nSpe
aker: Paul Nelson\, ETH Zürich\n\nVideo: https://video.ias.edu/puias/2020/
0130-PaulNelson\n\nWe will discuss how to study the cubic moment of any fa
mily of automorphic L-functions on PGL(2) using regularized diagonal perio
ds of Eisenstein series\, following a strategy suggested by Michel--Venkat
esh. Applications include generalizations to the setting of number fields
of some results of Conrey--Iwaniec and Petrow--Young\, improved estimates
for representation numbers of ternary quadratic forms over number fields\,
and improvements to the prime geodesic theorem on arithmetic hyperbolic 3
-folds.
DTSTART:20200130T213000Z
DTEND:20200130T223000Z
LAST-MODIFIED:20200226T221145Z
LOCATION:Simonyi Hall 101
SUMMARY:Joint IAS/Princeton University Number Theory Seminar
URL:https://www.ias.edu/node/101316
END:VEVENT
BEGIN:VEVENT
UID:2030ac12-fc69-4ca3-98b7-fcdb663e4a9f
DTSTAMP:20200401T030202Z
CREATED:20190607T202005Z
DESCRIPTION:Topic: MIP* = RE\n\nSpeaker: Henry Yuen\, University of Toront
o\n\nVideo: https://video.ias.edu/csdm/2020/0203-HenryYuen\n\nMIP* (pronou
nced “M-I-P star”) denotes the class of problems that admit interactive pr
oofs with quantum entangled provers. It has been an outstanding question t
o characterize the complexity of MIP*. Most notably\, there was no known c
omputable upper bound on MIP*.\n\nWe show that MIP* is equal to the class
RE\, the set of recursively enumerable languages. In particular\, this sho
ws that MIP* contains uncomputable problems. Through a series of known con
nections\, this also yields a negative answer to Connes’ Embedding Problem
from operator algebras. In this talk\, I will explain the connection betw
een Connes' Embedding Problem\, quantum information theory\, and computati
onal complexity. I will then give an overview of our approach\, which invo
lves reducing the Halting Problem to the problem of approximating the enta
ngled value of nonlocal games.\n\nJoint work with Zhengfeng Ji\, Anand Nat
arajan\, Thomas Vidick\, and John Wright.
DTSTART:20200203T160000Z
DTEND:20200203T170000Z
LAST-MODIFIED:20200226T221536Z
LOCATION:Simonyi Hall 101
SUMMARY:Computer Science/Discrete Mathematics Seminar I
URL:https://www.ias.edu/node/100636
END:VEVENT
BEGIN:VEVENT
UID:6e2f798f-7209-4f09-9d6d-28e77852abb9
DTSTAMP:20200401T030202Z
CREATED:20170413T190001Z
DESCRIPTION:Topic: Coarse dynamics and partially hyperbolic diffeomorphisms
in 3-manifolds\n\nSpeaker: Rafael Potrie\, Universidad de la República\,
Uruguay\; von Neumann Fellow\, School of Mathematics\n\nVideo: https://vid
eo.ias.edu/members/2020/0203-RafaelPotrie\n\nThe purpose of this talk is t
o introduce the classification problem of partially hyperbolic diffeomorph
isms in dimension 3 (including the concept of partially hyperb
olic diffeomorphisms and its relevance). The main goal will be to explain
the importance of understanding foliations in 3-manifolds as well a
s pseudo-Anosov dynamics and hyperbolic geometry to this task. The talk wi
ll touch on work by many people\, but focus on recent work joint with T. B
arthelme\, S. Fenley and S. Frankel.
DTSTART:20200203T190000Z
DTEND:20200203T200000Z
LAST-MODIFIED:20200226T221407Z
LOCATION:Simonyi Hall 101
SUMMARY:Members' Seminar
URL:https://www.ias.edu/node/73411
END:VEVENT
BEGIN:VEVENT
UID:459d9b79-28ee-4205-9b37-d9922b136f4e
DTSTAMP:20200401T030202Z
CREATED:20190607T210002Z
DESCRIPTION:Topic: Counting embedded curves in symplectic 6-manifolds\n\nSp
eaker: Aleksander Doan\, Columbia University\n\nVideo: https://video.ias.e
du/sympdyngeo/2020/0203-AleksanderDoan\n\nThe number of embedded pseudo-ho
lomorphic curves in a symplectic manifold typically depends on the choice
of an almost complex structure on the manifold and so does not lead to a s
ymplectic invariant. However\, I will discuss two instances in which such
naive counting does define a symplectic invariant\, which turns out to be
related to the Gopakumar-Vafa conjecture inspired by string theory. The ta
lk is based on joint work with Thomas Walpuski.
DTSTART:20200203T203000Z
DTEND:20200203T213000Z
LAST-MODIFIED:20200226T221329Z
LOCATION:Simonyi Hall 101
SUMMARY:Symplectic Dynamics/Geometry Seminar
URL:https://www.ias.edu/node/100871
END:VEVENT
BEGIN:VEVENT
UID:68e80a0f-cfa6-481f-a185-0bcc6b451f6d
DTSTAMP:20200401T030202Z
CREATED:20190607T213002Z
DESCRIPTION:Topic: When do interacting organisms gravitate to the vertices
of a regular simplex?\n\nSpeaker: Robert McCann\, University of Toronto\n
\nVideo: https://video.ias.edu/analysis/2020/0203-RobertMcCann\n\nFlocking
and swarming models which seek to explain pattern formation in mathematic
al biology often assume that organisms interact through a force which is a
ttractive over large distances yet repulsive at short distances. Suppose t
his force is given as a difference of power laws and normalized so that it
s unique minimum occurs at unit separation. For a range of exponents corre
sponding to mild repulsion and strong attraction\, we show that the minimu
m energy configuration is uniquely attained - apart from translations and
rotations - by equidistributing the organisms over the vertices of a regul
ar top-dimensional simplex (i.e. an equilateral triangle in two dimensions
and regular tetrahedron in three). If the attraction is not assumed to be
strong\, we show these configurations are at least local energy minimizer
s in the relevant d1 metric from optimal transportation\, as are all of th
e other uncountably many unbalanced configurations with the same support.
These therefore form stable attractors for the associated first- and seco
nd-order dynamics. We infer the existence of phase transitions. An ingredi
ent from the proof with independent interest is the establishment of a sim
ple isodiametric variance bound which characterizes regular simplices: it
shows that among probability measures on R^n whose supports have at most un
it diameter\, the variance around the mean is maximized precisely by those
measures which assign mass 1/(n + 1) to each vertex of a (unit-diameter)
regular simplex.
DTSTART:20200203T220000Z
DTEND:20200203T230000Z
LAST-MODIFIED:20200226T221246Z
LOCATION:Simonyi Hall 101
SUMMARY:Analysis Seminar
URL:https://www.ias.edu/node/101436
END:VEVENT
BEGIN:VEVENT
UID:8f04bea9-15d0-4b71-8bbb-ec04a6a31777
DTSTAMP:20200401T030202Z
CREATED:20190607T203002Z
DESCRIPTION:Topic: Proofs\, Circuits\, Communication\, and Lower Bounds in
Complexity Theory\n\nSpeaker: Robert Robere\, Member\, School of Mathemati
cs\n\nVideo: https://video.ias.edu/csdm/2020/0204-RobertRobere\n\nMany of
the central problems in computational complexity revolve around proving lo
wer bounds on the amount of resources used in various computational models
. In this series of talks\, we will discuss three standard objects in comp
utational complexity --- propositional proofs\, boolean circuits\, and com
munication protocols --- and further describe a three-way connection betwe
en them. Much recent work has profitably exploited this connection (in the
context of so-called ''lifting theorems'') to translate lower bounds from
one model to the other\; we will explore several examples of such theorem
s and suggest some future directions.
DTSTART:20200204T153000Z
DTEND:20200204T173000Z
LAST-MODIFIED:20200226T221708Z
LOCATION:Simonyi Hall 101
SUMMARY:Computer Science/Discrete Mathematics Seminar II
URL:https://www.ias.edu/node/100726
END:VEVENT
BEGIN:VEVENT
UID:6fb47922-e0cc-4679-814d-db7c26cb78d2
DTSTAMP:20200401T030202Z
CREATED:20191203T182005Z
DESCRIPTION:Topic: Algorithm and Hardness for Kernel Matrices in Numerical
Linear Algebra and Machine Learning\n\nSpeaker: Zhao Song\, Member\, Schoo
l of Mathematics\n\nVideo: https://video.ias.edu/machinelearning/2020/0204
-ZhaoSong\n\nFor a function K : R^d x R^d -> R\, and a set P = {x_1\, ...\
, x_n} in d dimensions\, the K graph G_P of P is the complete graph on n no
des where the weight between nodes i and j is given by K(x_i\, x_j). In th
is paper\, we initiate the study of when efficient spectral graph theory i
s possible on these graphs. We investigate whether or not it is possible t
o solve the following problems in n^{1+o(1)} time for a K-graph G_P when d
\n\n1. Multiply a given vector by the adjacency matrix or Laplacian matrix
of G_P\n\n2. Find a spectral sparsifier of G_P\n\n3. Solve a Laplacian sy
stem in G_P's Laplacian matrix\n\nFor each of these problems\, we consider
all functions of the form K(u\,v) = f( || u - v ||_2^2 ) for a function f
: R -> R. We provide algorithms and comparable hardness results for many
such K\, including Gaussian kernel\, Neural tangent kernels and so on. For
example\, in dimension d = Omega( log n )\, for each of these problems\,
we show that there is a parameter associated with the function f for which
low parameter values imply n^{1+o(1)} time algorithms\, and high paramete
r values imply the nonexistence of subquadratic time algorithms assuming S
trong Exponential Time Hypothesis\, given natural assumptions on f.\n\nAs
part of our results\, we also show that exponential dependence on the dime
nsion d in the celebrated fast multipole method of Greengard and Rokhlin c
annot be improved\, assuming Strong Exponential Time Hypothesis\, for a br
oad class of functions f. To the best of our knowledge\, this is the first
formal limitation proven about fast multipole methods (which is one of th
e top-10 algorithms of the 20th century).\n\nJoint work with Josh Alman (Harva
rd University)\, Timothy Chu (Carnegie Mellon University)\, and Aaron Schi
ld (University of Washington)
DTSTART:20200204T170000Z
DTEND:20200204T183000Z
LAST-MODIFIED:20200226T221621Z
LOCATION:Dilworth Room
SUMMARY:Seminar on Theoretical Machine Learning
URL:https://www.ias.edu/node/106911
END:VEVENT
BEGIN:VEVENT
UID:09053bb5-b666-4155-bcce-b769e05e1954
DTSTAMP:20200401T030202Z
CREATED:20190607T211502Z
DESCRIPTION:Topic: Anosov flows in 3-manifolds and the fundamental group\n
\nSpeaker: Rafael Potrie\, Universidad de la República\, Uruguay\; von Neu
mann Fellow\, School of Mathematics\n\nThe goal of the talk is to explain
the statement and proof of a beautiful result\, due to Margulis (1967) and la
ter extended by Plante and Thurston (1972)\, which imposes restrictions on t
he growth of the fundamental group of 3-manifolds that support Anosov flows.
DTSTART:20200205T230000Z
DTEND:20200206T003000Z
LAST-MODIFIED:20200130T192144Z
LOCATION:Dilworth Room
SUMMARY:Mathematical Conversations
URL:https://www.ias.edu/node/100976
END:VEVENT
BEGIN:VEVENT
UID:4f189f24-3da4-4a31-98c0-ab89d5f8b615
DTSTAMP:20200401T030202Z
CREATED:20190607T211502Z
DESCRIPTION:
DTSTART:20200206T150000Z
DTEND:20200206T170000Z
LAST-MODIFIED:20191007T174501Z
LOCATION:Simonyi Hall 101
SUMMARY:Working Seminar on Nonabelian Hodge Theory
URL:https://www.ias.edu/node/101086
END:VEVENT
BEGIN:VEVENT
UID:1b4907ed-3191-41d1-b5a3-21164910c238
DTSTAMP:20200401T030202Z
CREATED:20191021T182005Z
DESCRIPTION:Topic: Understanding Machine Learning via Exactly Solvable Stat
istical Physics Models\n\nSpeaker: Lenka Zdeborova\, CEA Saclay and CNRS
\n\nPlease Note: The seminars are not open to the general public\, b
ut only to active researchers.\n\nRegister here for this event: https://do
cs.google.com/forms/d/e/1FAIpQLScJ-BUVgJod6NGrreI26pedg8wGEyPhh3WMDskE1hIa
c_Yp3Q/viewform\n\nThe affinity between statistical physics and machine le
arning has a long history\; this is reflected even in the machine learning t
erminology that is in part adopted from physics. I will describe the main
lines of this long-lasting friendship in the context of current theoretica
l challenges and open questions about deep learning. Theoretical physics o
ften proceeds in terms of solvable synthetic models\; I will describe the
related line of work on solvable models of simple feed-forward neural netw
orks. I will highlight a path forward to capture the subtle interplay betw
een the structure of the data\, the architecture of the network\, and the
learning algorithm.
DTSTART:20200206T164500Z
DTEND:20200206T173000Z
LAST-MODIFIED:20200130T200722Z
LOCATION:Jadwin Hall PCTS Seminar Room 407 (Princeton University)
SUMMARY:Seminar on Theoretical Machine Learning - PCTS Seminar Series: Deep
Learning for Physics
URL:https://www.ias.edu/node/105746
END:VEVENT
BEGIN:VEVENT
UID:34545ae1-974d-41ed-8eaa-1a541d89839e
DTSTAMP:20200401T030202Z
CREATED:20200128T143445Z
DESCRIPTION:Topic: Explicit rigid matrices in P^NP via rectangular PCPs\n\n
Speaker: Prahladh Harsha\, Tata Institute of Fundamental Research\n\nVideo
: https://video.ias.edu/csdm/2020/0206-PrahladhHarsha\n\nAn n x n matrix M ov
er GF(2) is said to be (r\,\delta)-rigid if every matrix M' within \delta
n^2 Hamming distance from M has rank at least r. A long standing open prob
lem is to construct explicit rigid matrices. In a recent remarkable result
\, Alman and Chen gave explicit constructions of rigid matrices using an N
P oracle. In this talk\, we will give a simplified and improved constructi
on of their result that yields explicit (r\,0.01)-rigid matrices in P^NP where r = 2^{(log n)^{1-o(1)}}. We obtain our improvement by considering PC
Ps where the query and randomness satisfy a certain 'rectangular property'
.\n\nJoint work with Amey Bhangale\, Orr Paradise and Avishay Tal
DTSTART:20200206T190000Z
DTEND:20200206T200000Z
LAST-MODIFIED:20200226T221754Z
LOCATION:Simonyi 101
SUMMARY:Computer Science/Discrete Mathematics - Special Seminar
URL:https://www.ias.edu/node/108236
END:VEVENT
BEGIN:VEVENT
UID:559d04bd-3882-45a9-b30f-875476ef611d
DTSTAMP:20200401T030202Z
CREATED:20200130T195606Z
DESCRIPTION:Topic: Dynamics of Generalization in Overparameterized Neural N
etworks\n\nSpeaker: Andrew Saxe\, Oxford University\n\nPlease Note: The se
minars are not open to the general public\, but only to active researchers
.\n\nRegister here for this event: https://docs.google.com/forms/d/e/1FAIp
QLScJ-BUVgJod6NGrreI26pedg8wGEyPhh3WMDskE1hIac_Yp3Q/viewform\n\nWhat inter
play of dynamics\, architecture\, and data make good generalization possib
le in overparameterized neural networks? Approaches from statistical physi
cs have shed light on this question by considering a variety of simple lim
iting cases. I will describe results emerging from two simple models: deep
linear neural networks and nonlinear student-teacher networks. In these m
odels\, good generalization from limited data arises from aspects of train
ing dynamics and initialization. Finally\, I will briefly tour open proble
ms facing practitioners that seem amenable to analysis with similar method
s.
DTSTART:20200206T190000Z
DTEND:20200206T200000Z
LAST-MODIFIED:20200130T195928Z
LOCATION:Jadwin Hall PCTS Seminar Room 407 (Princeton University)
SUMMARY:Seminar on Theoretical Machine Learning - PCTS Seminar Series: Deep
Learning for Physics
URL:https://www.ias.edu/node/108606
END:VEVENT
BEGIN:VEVENT
UID:a386aa66-d380-469c-8de9-45ad92b16a41
DTSTAMP:20200401T030202Z
CREATED:20190607T212006Z
DESCRIPTION:Topic: Supersingular main conjectures\, Sylvester's conjecture
and Goldfeld's conjecture\n\nSpeaker: Daniel Kriz\, Massachusetts Institut
e of Technology\n\nIn this talk\, I formulate and prove a new Rubin-type I
wasawa main conjecture for imaginary quadratic fields in which p is inert
or ramified\, as well as a Perrin-Riou type Heegner point main conjecture
for certain supersingular CM elliptic curves. These main conjectures and t
heir proofs are related to p-adic L-functions that I have previously const
ructed\, and have applications to two classical problems of arithmetic. Fi
rst\, I prove the 1879 conjecture of Sylvester stating that if p = 4\,7\,8
mod 9\, then x^3 + y^3 = p has a solution with x\,y rational numbers. Sec
ond\, combined with previous Selmer distribution results\, I show that 100
% of squarefree d = 5\,6\,7 mod 8 are congruent numbers\, thus establishin
g Goldfeld's conjecture for the family y^2 = x^3 - d^2x\, and solving the
congruent number problem in 100% of cases.
DTSTART:20200206T213000Z
DTEND:20200206T223000Z
LAST-MODIFIED:20200102T162004Z
LOCATION:Princeton University\, Fine Hall 214
SUMMARY:Joint IAS/Princeton University Number Theory Seminar
URL:https://www.ias.edu/node/101321
END:VEVENT
BEGIN:VEVENT
UID:1a728f06-b2a3-472c-b906-41a8ac8af6ec
DTSTAMP:20200401T030202Z
CREATED:20190607T202005Z
DESCRIPTION:Topic: Paths and cycles in expanders\n\nSpeaker: Michael Krivel
evich\, Tel Aviv University\n\nVideo: https://video.ias.edu/csdm/2020/0210
-MichaelKrivelevich\n\nExpanders have grown to be one of the most central
and studied notions in modern graph theory. It is thus only natural to res
earch extremal properties of expanding graphs. In this talk we will adopt
the following (rather relaxed) definition of expanders. For a constant alp
ha>0\, a graph G on n vertices is called an alpha-expander if the external
neighborhood of every vertex subset U of size |U|
DTSTART:20200210T160000Z
DTEND:20200210T170000Z
LAST-MODIFIED:20200227T015358Z
LOCATION:Simonyi Hall 101
SUMMARY:Computer Science/Discrete Mathematics Seminar I
URL:https://www.ias.edu/node/100641
END:VEVENT
BEGIN:VEVENT
UID:aba2fc8a-b2e5-4a9a-b394-431aa7b6a093
DTSTAMP:20200401T030202Z
CREATED:20170413T190001Z
DESCRIPTION:Topic: Spectra of metric graphs and crystalline measures\n\nSpe
aker: Peter Sarnak\, Professor\, School of Mathematics\n\nVideo: https://v
ideo.ias.edu/members/2020/0210-PeterSarnak\n\nThe geometric optics trace f
ormula gives the singular support of wave trace on a compact Riemannian ma
nifold. In the case of a one-dimensional singular manifold\, that is\, a metric (or quantum) graph\, this formula is exact and yields a crystalline measure generalizing the Poisson and related Summation Formulae. We examine the additive structure of the spectra of such metric graphs. The resulting measures are exotic and resolve a number of problems about crystalline measures. A key ingredient in the analysis is the Diophantine theory of a t
orus (uniform versions of Conjectures of Lang and generalizations).\n\nJoi
nt work with P. Kurasov.
DTSTART:20200210T190000Z
DTEND:20200210T200000Z
LAST-MODIFIED:20200226T221841Z
LOCATION:Simonyi Hall 101
SUMMARY:Members' Seminar
URL:https://www.ias.edu/node/73416
END:VEVENT
BEGIN:VEVENT
UID:5358c55f-e511-4595-8ce6-ab0dd5d3e98a
DTSTAMP:20200401T030202Z
CREATED:20190607T210002Z
DESCRIPTION:Topic: Floer homotopy without spectra\n\nSpeaker: Mohammed Abou
zaid\, Columbia University\n\nI will explain a direct way for defining the
Floer homotopy groups of a (framed) manifold flow category in the sense o
f Cohen Jones and Segal\, which does not require any sophisticated tools f
rom homotopy theory (in particular\, the notion of a spectrum is not requi
red for the definition). The key point is to work on the geometric topolog
y side of the Pontryagin-Thom construction. Time permitting\, I will also
discuss various generalisations which are relevant to Floer theory\, as well as joint work in progress with Blumberg for building a spectrum from t
he new point of view (should you feel that this is needed).
DTSTART:20200210T203000Z
DTEND:20200210T213000Z
LAST-MODIFIED:20200206T204829Z
LOCATION:Fine Hall 214\, Princeton University
SUMMARY:Symplectic Dynamics/Geometry Seminar
URL:https://www.ias.edu/node/100876
END:VEVENT
BEGIN:VEVENT
UID:45eb21f7-e584-430c-b942-9646c3037663
DTSTAMP:20200401T030202Z
CREATED:20190607T213002Z
DESCRIPTION:Topic: On dynamical spectral rigidity and determination\n\nSpea
ker: Jacopo De Simoi\, University of Toronto\n\nVideo: https://video.ias.e
du/analysis/2020/0210-JacopoDeSimoi\n\nGiven a planar domain with sufficie
ntly regular boundary\, one can study periodic orbits of the associated bi
lliard problem. Periodic orbits have a rich and quite intricate structure
and it is natural to ask how much information about the domain is encoded
in the set of lengths of such orbits. The quantum analog of this question
is the celebrated Laplace inverse problem\, or “Can one hear the shape of
a drum?” For a class of smooth convex domains we prove dynamical spectral
rigidity: in this class it is not possible to deform a domain without pert
urbing the length of at least one orbit. In a class of analytic dispersing
open billiards we show marked spectral determination: knowing all lengths
of all periodic orbits of such systems together with some combinatorial information allows one to completely reconstruct the domain. Such results are part of an ongoing joint project with V. Kaloshin and other collaborators (Q. Wei\, M. Leguil and P. Bálint).
DTSTART:20200210T220000Z
DTEND:20200210T230000Z
LAST-MODIFIED:20200226T221928Z
LOCATION:Simonyi Hall 101
SUMMARY:Analysis Seminar
URL:https://www.ias.edu/node/101441
END:VEVENT
BEGIN:VEVENT
UID:e9c68b53-f653-4e8f-9e48-c5ca5bd56d86
DTSTAMP:20200401T030202Z
CREATED:20190607T203002Z
DESCRIPTION:Topic: Proofs\, Circuits\, Communication\, and Lower Bounds in
Complexity Theory\n\nSpeaker: Robert Robere\, Member\, School of Mathemat
ics\n\nVideo: https://video.ias.edu/csdm/2020/0211-RobertRobere\n\nMany of
the central problems in computational complexity revolve around proving l
ower bounds on the amount of resources used in various computational model
s. In this talk we will continue our survey of the connections between thr
ee central models (proofs\, communication\, circuits) from last week\, ske
tching several examples of the so-called 'lifting theorems' for proving lo
wer bounds.
DTSTART:20200211T153000Z
DTEND:20200211T173000Z
LAST-MODIFIED:20200227T015455Z
LOCATION:Simonyi Hall 101
SUMMARY:Computer Science/Discrete Mathematics Seminar II
URL:https://www.ias.edu/node/100721
END:VEVENT
BEGIN:VEVENT
UID:e028445e-1b94-4711-b3c6-db7bd82512c9
DTSTAMP:20200401T030202Z
CREATED:20191203T182005Z
DESCRIPTION:Topic: Geometric Insights into the convergence of Non-linear TD
Learning\n\nSpeaker: Joan Bruna\, New York University\; Member\, School o
f Mathematics\n\nWhile there are convergence guarantees for temporal diffe
rence (TD) learning when using linear function approximators\, the situati
on for nonlinear models is far less understood\, and divergent examples ar
e known. We take a first step towards extending theoretical convergence gu
arantees to TD learning with nonlinear function approximation. More precis
ely\, we consider the expected learning dynamics of the TD(0) algorithm fo
r value estimation. As the step-size converges to zero\, these dynamics ar
e defined by a nonlinear ODE which depends on the geometry of the space of
function approximators\, the structure of the underlying Markov chain\, a
nd their interaction. We find a set of function approximators that include
s ReLU networks and has geometry amenable to TD learning regardless of env
ironment\, so that the solution performs about as well as linear TD in the
worst case. Then\, we show how environments that are more reversible indu
ce dynamics that are better for TD learning and prove global convergence t
o the true value function for well-conditioned function approximators. Fin
ally\, we generalize a divergent counterexample to a family of divergent p
roblems to demonstrate how the interaction between approximator and enviro
nment can go wrong and to motivate the assumptions needed to prove converg
ence.
DTSTART:20200211T170000Z
DTEND:20200211T183000Z
LAST-MODIFIED:20200205T135927Z
LOCATION:Dilworth Room
SUMMARY:Seminar on Theoretical Machine Learning
URL:https://www.ias.edu/node/106916
END:VEVENT
BEGIN:VEVENT
UID:10390c4c-cc7a-42e2-92a9-34c864a5a6c5
DTSTAMP:20200401T030202Z
CREATED:20190607T211502Z
DESCRIPTION:Topic: p-adic numbers in cryptography and Rocky Horror\n\nSpeak
er: Mark Goresky\, Visitor\, Institute for Advanced Study\n\nThis is a sha
meless repeat of a Math Conversations talk I gave about four years ago\, and ma
ybe four years before that as well\, explaining 2-adic shift registers.
DTSTART:20200212T230000Z
DTEND:20200213T003000Z
LAST-MODIFIED:20200207T134450Z
LOCATION:Dilworth Room
SUMMARY:Mathematical Conversations
URL:https://www.ias.edu/node/100981
END:VEVENT
BEGIN:VEVENT
UID:5ac5089b-0d76-4428-ae19-a55463324472
DTSTAMP:20200401T030202Z
CREATED:20190607T211502Z
DESCRIPTION:
DTSTART:20200213T150000Z
DTEND:20200213T170000Z
LAST-MODIFIED:20191007T174501Z
LOCATION:Simonyi Hall 101
SUMMARY:Working Seminar on Nonabelian Hodge Theory
URL:https://www.ias.edu/node/101091
END:VEVENT
BEGIN:VEVENT
UID:ef66b80b-20cc-4771-9664-ac325c1bd190
DTSTAMP:20200401T030202Z
CREATED:20191025T210001Z
DESCRIPTION:Topic: The Lottery Ticket Hypothesis: On Sparse\, Trainable Neu
ral Networks\n\nSpeaker: Jonathan Frankle\, Massachusetts Institute of Tec
hnology\n\nWe recently proposed the 'Lottery Ticket Hypothesis\,' which co
njectures that the dense neural networks we typically train have much smal
ler subnetworks capable of training in isolation to the same accuracy star
ting from the original initialization. This hypothesis raises questions ab
out the nature of overparameterization and the importance of initializatio
n for training neural networks in practice. In this talk\, I will discuss
existing work and the latest developments on the 'Lottery Ticket Hypothesi
s\,' including the empirical evidence for these claims on small vision tas
ks\, changes necessary to scale these ideas to ImageNet\, and the relation
ship between these subnetworks and their 'stability' to the noise of stoch
astic gradient descent. This research is entirely empirical\, although it
has exciting implications for theory. (This is joint work with Gintare Kar
olina Dziugaite\, Daniel M. Roy\, Michael Carbin\, and Alex Renda.)
DTSTART:20200213T170000Z
DTEND:20200213T183000Z
LAST-MODIFIED:20200210T204847Z
LOCATION:Dilworth Room
SUMMARY:Seminar on Theoretical Machine Learning
URL:https://www.ias.edu/node/105976
END:VEVENT
BEGIN:VEVENT
UID:35b6802a-6c74-487e-a883-10bf2d681243
DTSTAMP:20200401T030202Z
CREATED:20190607T212006Z
DESCRIPTION:Topic: Moduli spaces of shtukas over function fields\n\nSpeaker
: Jared Weinstein\, Boston University\n\nVideo: https://video.ias.edu/puia
s/2020/0213-JaredWeinstein\n\nWe present some work in progress\, on moduli
spaces of Drinfeld shtukas. These spaces are the function field analogues of Shimura varieties. In fact they are more versatile\; there are r-legge
d versions for any r. Tate's conjecture predicts some interesting relation
s between shtuka spaces and function field arithmetic. For instance\, ther
e should be a notion of modularity for the r-fold product of an elliptic c
urve. We verify these predictions in a few cases. This is partly joint wor
k with Noam Elkies.
DTSTART:20200213T213000Z
DTEND:20200213T223000Z
LAST-MODIFIED:20200227T015556Z
LOCATION:Simonyi Hall 101
SUMMARY:Joint IAS/Princeton University Number Theory Seminar
URL:https://www.ias.edu/node/101326
END:VEVENT
BEGIN:VEVENT
UID:6fa27ad8-daac-43dc-b893-fbcab94133f0
DTSTAMP:20200401T030202Z
CREATED:20200113T155056Z
DESCRIPTION:Topic: Stable shock formation for the compressible Euler equati
ons\n\nSpeaker: Vlad Vicol\, New York University\n\nWe will discuss a rece
nt result with Tristan Buckmaster (Princeton) and Steve Shkoller (UC Davis
) which establishes the finite-time shock formation for the 3d isentropic
Euler equations with vorticity. We prove that for an open set of Sobolev-c
lass initial data\, there exist smooth solutions to the Euler equations wh
ich form a generic stable shock in finite time. The blow up time and locat
ion can be explicitly computed\, and solutions at the blow up time are smo
oth except for a single point\, where they are of cusp-type with Hölder $C^{1/3}$ regularity. Our proof is based on modulated self-similar variables that enforce a number of constraints on the blow u
p profile\, necessary to establish the stability in self-similar variables
of the generic shock profile.
DTSTART:20200214T203000Z
DTEND:20200214T213000Z
LAST-MODIFIED:20200212T133550Z
LOCATION:Simonyi Hall 101
SUMMARY:Analysis - Mathematical Physics
URL:https://www.ias.edu/node/107811
END:VEVENT
BEGIN:VEVENT
UID:17b45471-eb67-44cc-a25b-ae1118056823
DTSTAMP:20200401T030202Z
CREATED:20200113T154704Z
DESCRIPTION:Topic: Regularity of the free boundary for the two-phase Bernou
lli problem\n\nSpeaker: Guido De Philippis\, New York University\n\nI will
illustrate a recent result obtained in collaboration with L. Spolaor and
B. Velichkov concerning the regularity of the free boundaries in the two-phase Bernoulli problem. The new main point is the analysis of the free bo
undary close to branch points\, where we show that it is given by the unio
n of two C^1 graphs. This completes the analysis started by Alt\, Caffarelli\, and Friedman in the '80s.
DTSTART:20200214T220000Z
DTEND:20200214T230000Z
LAST-MODIFIED:20200212T133908Z
LOCATION:Simonyi Hall 101
SUMMARY:Analysis - Mathematical Physics
URL:https://www.ias.edu/node/107806
END:VEVENT
BEGIN:VEVENT
UID:24b2a995-d73f-4c38-a8b0-96798a3f4a47
DTSTAMP:20200401T030202Z
CREATED:20190607T203002Z
DESCRIPTION:Topic: An invitation to invariant theory\n\nSpeaker: Viswambhar
a Makam\, Member\, School of Mathematics\n\nVideo: https://video.ias.edu/c
sdm/2020/0217-ViswambharaMakam\n\nThis (mostly expository) talk will be ab
out invariant theory viewed through the lens of computational complexity.
Invariant theory is the study of symmetries\, captured by group actions\,
by polynomials that are “invariant.” Invariant theory has had a deep and l
asting influence in mathematics. In fact\, Lie theory and algebraic geomet
ry\, differential algebra\, and algebraic combinatorics are all offspring o
f invariant theory. In our century\, exciting connections between invarian
t theory and fundamental problems in complexity (such as P vs NP) have bee
n uncovered thanks to the Geometric Complexity Theory program.\n\nI will d
iscuss various topics such as degree bounds\, null cones\, orbits and thei
r closures\, and also mention more recent results on non-commutative ident
ity testing. There will be many examples\, and the goal is to familiarize
the audience with the problems\, goals and techniques of interest as well
as the connections with complexity. This talk is intended to provide motiv
ation\, background\, and context for the talk next week. No special backgr
ound will be assumed.
DTSTART:20200218T153000Z
DTEND:20200218T173000Z
LAST-MODIFIED:20200227T015643Z
LOCATION:Simonyi Hall 101
SUMMARY:Computer Science/Discrete Mathematics Seminar II
URL:https://www.ias.edu/node/100716
END:VEVENT
BEGIN:VEVENT
UID:96bb3060-3b75-4a12-9ae6-e364af9da15d
DTSTAMP:20200401T030202Z
CREATED:20191203T183001Z
DESCRIPTION:Topic: Compositional generalization in minds and machines\n\nSp
eaker: Brenden Lake\, New York University\n\nPeople learn in fast and flex
ible ways that elude the best artificial neural networks. Once a person le
arns how to “dax\,” they can effortlessly understand how to “dax twice” or
“dax vigorously” thanks to their compositional skills. In this talk\, we
examine how people and machines generalize compositionally in language-lik
e instruction learning tasks. Artificial neural networks have long been cr
iticized for lacking systematic compositionality (Fodor & Pylyshyn\, 1988\;
Marcus\, 1998)\, but new architectures have been tackling increasingly am
bitious language tasks. In light of these developments\, we reevaluate the
se classic criticisms and find that artificial neural nets still fail spec
tacularly when systematic compositionality is required. We then show how p
eople succeed in similar few-shot learning tasks and find they utilize thr
ee inductive biases that can be incorporated into models. Finally\, we sho
w how more structured neural nets can acquire compositional skills and hum
an-like inductive biases through meta learning.
DTSTART:20200218T210000Z
DTEND:20200218T223000Z
LAST-MODIFIED:20200218T141735Z
LOCATION:Princeton Neurosciences Institute: Room A32
SUMMARY:IAS-PNI Seminar on ML and Neuroscience
URL:https://www.ias.edu/node/106921
END:VEVENT
BEGIN:VEVENT
UID:098acd09-f15c-43a2-baf2-5f3242764815
DTSTAMP:20200401T030202Z
CREATED:20190607T211502Z
DESCRIPTION:Topic: Regularization effect of gradient flow dynamics\n\nSpeak
er: Yaoyu Zhang\, Member\, Institute for Advanced Study\n\nI will introduc
e a math problem from deep learning regarding the regularization effect of
gradient flow dynamics for underdetermined problems.
DTSTART:20200219T230000Z
DTEND:20200220T003000Z
LAST-MODIFIED:20200213T135640Z
LOCATION:Dilworth Room
SUMMARY:Mathematical Conversations
URL:https://www.ias.edu/node/100986
END:VEVENT
BEGIN:VEVENT
UID:442f8339-d51a-4cf4-93e2-ac6b5efbe843
DTSTAMP:20200401T030202Z
CREATED:20190607T211502Z
DESCRIPTION:
DTSTART:20200220T150000Z
DTEND:20200220T170000Z
LAST-MODIFIED:20191007T174501Z
LOCATION:Simonyi Hall 101
SUMMARY:Working Seminar on Nonabelian Hodge Theory
URL:https://www.ias.edu/node/101096
END:VEVENT
BEGIN:VEVENT
UID:3997c6bd-fd8e-44a8-b7fd-74db1d087e44
DTSTAMP:20200401T030202Z
CREATED:20191025T210001Z
DESCRIPTION:Topic: Geometric deep learning for functional protein design\n
\nSpeaker: Michael Bronstein\, Imperial College London\n\nProtein-based dr
ugs are becoming some of the most important drugs of the 21st century. The
typical mechanism of action of these drugs is a strong protein-protein int
eraction (PPI) between surfaces with complementary geometry and chemistry.
Over the past three decades\, large amounts of structural data on PPIs have been collected\, creating opportunities for differentiable learning on t
he surface geometry and chemical properties of natural PPIs. Since the sur
face of these proteins has a non-Euclidean structure\, it is a natural fit
for geometric deep learning\, a novel class of machine learning technique
s generalising successful neural architectures to manifolds and graphs. In
the talk\, I will show how geometric deep learning methods can be used to
address various problems in functional protein design such as interface s
ite prediction\, pocket classification\, and search for surface motifs.
DTSTART:20200220T170000Z
DTEND:20200220T183000Z
LAST-MODIFIED:20200214T155336Z
LOCATION:Dilworth Room
SUMMARY:Seminar on Theoretical Machine Learning
URL:https://www.ias.edu/node/105981
END:VEVENT
BEGIN:VEVENT
UID:e8990fed-452b-4bf8-9ed0-ef4abfaaf8ef
DTSTAMP:20200401T030202Z
CREATED:20190607T212006Z
DESCRIPTION:Topic: Isolation of L^2 spectrum and application to Gan-Gross-P
rasad conjecture\n\nSpeaker: Yifeng Liu\, Yale University\n\nIn this talk\
, we will introduce a new technique of isolating the cuspidal part\, or mo
re general cuspidal components\, from the $L^2$ spectrum on the spectral s
ide of the trace formula. As an application\, we prove the global Gan-Gros
s-Prasad and Ichino-Ikeda conjectures for unitary groups in all stable cas
es.\n\nThis is a joint work with Raphaël Beuzart-Plessis\, Wei Zhang\, and
Xinwen Zhu.
DTSTART:20200220T213000Z
DTEND:20200220T223000Z
LAST-MODIFIED:20200214T155635Z
LOCATION:Princeton University\, Fine Hall 214
SUMMARY:Joint IAS/Princeton University Number Theory Seminar
URL:https://www.ias.edu/node/101331
END:VEVENT
BEGIN:VEVENT
UID:d7b21aa5-1c46-49ff-b297-46ad9465a041
DTSTAMP:20200401T030202Z
CREATED:20190607T202005Z
DESCRIPTION:Topic: Strong Average-Case Circuit Lower Bounds from Non-trivia
l Derandomization\n\nSpeaker: Lijie Chen\, Massachusetts Institute of Tech
nology\n\nVideo: https://video.ias.edu/csdm/2020/0224-LijieChen\n\nWe prov
e that\, unconditionally\, for all constants a\, NQP = NTIME[n^polylog(n)]
cannot be (1/2 + 2^(-log^a n) )-approximated by 2^(log^a n)-size ACC^0 ci
rcuits. Previously\, it was even open whether E^NP can be (1/2+1/sqrt(n))-
approximated by AC^0[2] circuits. As a straightforward application\, we ob
tain an infinitely often non-deterministic pseudorandom generator for poly
-size ACC^0 circuits with seed length 2^{log^eps n}\, for all eps > 0.\n\n
More generally\, we establish a connection showing that\, for a typical ci
rcuit class C\, non-trivial nondeterministic algorithms estimating the acc
eptance probability of a given S-size C circuit with an additive error 1/S
imply strong (1/2 + 1/n^{omega(1)}) average-case lower bounds for nondete
rministic time classes against C circuits. The existence of such (determin
istic) algorithms is much weaker than the widely believed conjecture Promi
seBPP = PromiseP.\n\nOur new results build on a line of recent works\, inc
luding [Murray and Williams\, STOC 2018]\, [Chen and Williams\, CCC 2019]\
, and [Chen\, FOCS 2019]. In particular\, it strengthens the corresponding
(1/2 + 1/polylog(n))-inapproximability average-case lower bounds in [Chen
\, FOCS 2019]. The two important technical ingredients are techniques from
Cryptography in NC^0 [Applebaum et al.\, SICOMP 2006]\, and Probabilistically
Checkable Proofs of Proximity with NC^1-computable proofs.\n\nThis is joi
nt work with Hanlin Ren from Tsinghua University.
DTSTART:20200224T160000Z
DTEND:20200224T170000Z
LAST-MODIFIED:20200227T015752Z
LOCATION:Simonyi Hall 101
SUMMARY:Computer Science/Discrete Mathematics Seminar I
URL:https://www.ias.edu/node/100646
END:VEVENT
BEGIN:VEVENT
UID:38c0717b-1049-4548-a023-c671163cb555
DTSTAMP:20200401T030202Z
CREATED:20170413T190001Z
DESCRIPTION:Topic: Direct and dual Information Bottleneck frameworks for De
ep Learning\n\nSpeaker: Tali Tishby\, The Hebrew University of Jerusalem\n
\nVideo: https://video.ias.edu/members/2020/0224-TaliTishby\n\nThe Informa
tion Bottleneck (IB) is an information theoretic framework for optimal rep
resentation learning. It stems from the problem of finding minimal suffici
ent statistics in supervised learning\, but has insightful implications fo
r Deep Learning. In particular\, it's the only theory that gives concrete p
redictions on the different representations in each layer and their potent
ial computational benefit. I will review the theory and its new version\,
the dual Information Bottleneck\, related to the variational Information B
ottleneck which is gaining practical popularity. In particular\, I will di
scuss the implications of the critical points (phase transitions) of the I
B and dual IB and their importance for topological transformations of the
consecutive successively refinable representations.\n\nBased on joint work
s with Ravid Schwartz Ziv\, Noga Zaslavsky\, and Zoe Piran.
DTSTART:20200224T190000Z
DTEND:20200224T200000Z
LAST-MODIFIED:20200227T021126Z
LOCATION:Simonyi Hall 101
SUMMARY:Members' Seminar
URL:https://www.ias.edu/node/73426
END:VEVENT
BEGIN:VEVENT
UID:5ec09145-cd9f-42f6-b131-9f17b837ec11
DTSTAMP:20200401T030202Z
CREATED:20190607T210002Z
DESCRIPTION:Topic: Classification of n-component links with Khovanov homolo
gy of rank 2^n\n\nSpeaker: Boyu Zhang\, Princeton University\n\nSuppose L
is a link with n components and the rank of Kh(L\;Z/2) is 2^n. We show th
at L can be obtained by disjoint unions and connected sums of Hopf links a
nd unknots. This result gives a positive answer to a question asked by Bat
son-Seed\, and generalizes the unlink detection theorem of Khovanov homolo
gy by Hedden-Ni and Batson-Seed. The proof relies on a new excision formul
a for the singular instanton Floer homology introduced by Kronheimer and M
rowka.\n\nThis is joint work with Yi Xie.
DTSTART:20200224T203000Z
DTEND:20200224T213000Z
LAST-MODIFIED:20200227T020925Z
LOCATION:Simonyi Hall 101
SUMMARY:Symplectic Dynamics/Geometry Seminar
URL:https://www.ias.edu/node/100881
END:VEVENT
BEGIN:VEVENT
UID:88ea15f5-ac21-4a2e-815c-f411c3693654
DTSTAMP:20200401T030202Z
CREATED:20190607T213002Z
DESCRIPTION:Topic: 'Observable events' and 'typical trajectories' in finite
and infinite dimensional dynamical systems\n\nSpeaker: Lai-Sang Young\, N
ew York University\; Distinguished Visiting Professor\, School of Mathemat
ics and Natural Sciences\n\nVideo: https://video.ias.edu/analysis/2020/022
4-Lai-SangYoung\n\nSome words in the title are between quotation marks bec
ause it is a matter of interpretation. For dynamical systems on finite dim
ensional spaces\, one often equates observable events with positive Lebesg
ue measure sets\, and invariant distributions that reflect the large-time
behaviors of positive Lebesgue measure sets of initial conditions (such as
Liouville measure for Hamiltonian systems) are considered to be especiall
y important. I will begin by introducing these concepts for general dynami
cal systems\, describing a simple dynamical picture that one might hope to
be true. This picture does not always hold\, unfortunately\, but a small
amount of random noise will bring it about. In the second part of my talk
I will consider infinite dimensional systems such as semi-flows arising fr
om dissipative evolutionary PDEs\, and discuss the extent to which the ide
as above can be generalized to infinite dimensions\, proposing a notion of
'typical solutions'.
DTSTART:20200224T220000Z
DTEND:20200224T230000Z
LAST-MODIFIED:20200227T020836Z
LOCATION:Simonyi Hall 101
SUMMARY:Analysis Seminar
URL:https://www.ias.edu/node/101446
END:VEVENT
BEGIN:VEVENT
UID:516223ed-cb3a-451f-84ff-5ec2cb058891
DTSTAMP:20200401T030202Z
CREATED:20190607T203002Z
DESCRIPTION:Topic: Is the variety of singular tuples of matrices a null con
e?\n\nSpeaker: Viswambhara Makam\, Member\, School of Mathematics\n\nVideo
: https://video.ias.edu/csdm/2020/0225-ViswambharaMakam\n\nThe following m
ulti-determinantal algebraic variety plays an important role in algebra an
d computational complexity theory: SING_{n\,m}\, consisting of all m-tuple
s of n x n complex matrices which span only singular matrices. In particul
ar\, an efficient deterministic algorithm testing membership in SING_{n\,m
} will imply super-polynomial circuit lower bounds\, a holy grail of the t
heory of computation.\n\nA sequence of recent works suggests such efficien
t algorithms for membership in a general class of algebraic varieties\, n
amely the null cones of linear group actions. Can this be used for the pro
blem above? To address this we will use ideas and techniques from various
algebraic\, geometric\, and combinatorial fields. As always\, no backgroun
d will be assumed.\n\nThis is joint work with Avi Wigderson.
DTSTART:20200225T153000Z
DTEND:20200225T173000Z
LAST-MODIFIED:20200227T022052Z
LOCATION:Simonyi Hall 101
SUMMARY:Computer Science/Discrete Mathematics Seminar II
URL:https://www.ias.edu/node/100711
END:VEVENT
BEGIN:VEVENT
UID:8e3a7251-7836-4b24-8880-7c62fb271220
DTSTAMP:20200401T030202Z
CREATED:20191203T183001Z
DESCRIPTION:Topic: Learning from Multiple Biased Sources\n\nSpeaker: Clayto
n Scott\, University of Michigan\n\nWhen high-quality labeled training dat
a are unavailable\, an alternative is to learn from training sources that
are biased in some way. This talk will cover my group’s recent work on thr
ee problems where a learner has access to multiple biased sources. First\,
we consider the problem of classification given multiple training data se
ts corrupted by label noise\, and describe a weighted empirical risk minim
ization strategy where the weights are optimized according to the degree o
f corruption of each source. Second\, we consider the Sim-to-real problem
in reinforcement learning\, where the learning agent has access to multipl
e biased simulators from which it can learn before being deployed in the r
eal world. We present a novel theoretical framework for Sim-to-real\, and
an algorithm whose real-world sample complexity is smaller than what is cu
rrently achievable when learning without access to simulators. Finally\, w
e consider the problem of clustering when observations are not iid\, but a
re organized into groups of realizations coming from the same (unknown) cl
uster. We discuss identifiability of this problem\, and present a practica
l algorithm for inferring the components of nonparametric mixture models f
rom paired observations under very general nonparametric assumptions on th
e underlying data distribution (in particular\, the mixture components can
have substantial overlap).
DTSTART:20200225T170000Z
DTEND:20200225T183000Z
LAST-MODIFIED:20200227T021938Z
LOCATION:Dilworth Room
SUMMARY:Seminar on Theoretical Machine Learning
URL:https://www.ias.edu/node/106926
END:VEVENT
BEGIN:VEVENT
UID:c142bd48-4c56-4fce-bac8-02b1cc111b30
DTSTAMP:20200401T030202Z
CREATED:20190607T211502Z
DESCRIPTION:Topic: Euler flow with odd symmetry\n\nSpeaker: Hyunju Kwon \,
Member\, School of Mathematics\n\nI’ll introduce the incompressible Euler
equations and talk about the solution’s behavior when the vorticity has od
d symmetry.
DTSTART:20200226T230000Z
DTEND:20200227T003000Z
LAST-MODIFIED:20200219T145628Z
LOCATION:Dilworth Room
SUMMARY:Mathematical Conversations
URL:https://www.ias.edu/node/100991
END:VEVENT
BEGIN:VEVENT
UID:c06b5c17-ea46-445a-b8c3-e0ac5c32e092
DTSTAMP:20200401T030202Z
CREATED:20190607T211502Z
DESCRIPTION:
DTSTART:20200227T150000Z
DTEND:20200227T170000Z
LAST-MODIFIED:20191007T174501Z
LOCATION:Simonyi Hall 101
SUMMARY:Working Seminar on Nonabelian Hodge Theory
URL:https://www.ias.edu/node/101101
END:VEVENT
BEGIN:VEVENT
UID:ed2bcddc-f365-4ff6-99e3-49e18a68424a
DTSTAMP:20200401T030202Z
CREATED:20191025T210001Z
DESCRIPTION:Topic: Preference Modeling with Context-Dependent Salient Featu
res\n\nSpeaker: Laura Balzano\, University of Michigan\; Member\, School o
f Mathematics\n\nVideo: https://video.ias.edu/machinelearning/2020/0227-La
uraBalzano\n\nThis talk considers the preference modeling problem and addr
esses the fact that pairwise comparison data often reflects irrational cho
ice\, e.g. intransitivity. Our key observation is that two items compared
in isolation from other items may be compared based on only a salient subs
et of features. Formalizing this idea\, I will introduce our proposal for
a “salient feature preference model” and discuss sample complexity results
for learning the parameters of our model and the underlying ranking with
maximum likelihood estimation. I will also provide empirical results that
support our theoretical bounds\, illustrate how our model explains systema
tic intransitivity\, and show in this setting that our model is able to re
cover both pairwise comparisons and rankings for unseen pairs or items. Fi
nally I will share results on two data sets: the UT Zappos50K data set and
comparison data about the compactness of legislative districts in the US.
\n\nThis is joint work with Amanda Bower at the University of Michigan.
DTSTART:20200227T170000Z
DTEND:20200227T183000Z
LAST-MODIFIED:20200228T191108Z
LOCATION:Dilworth Room
SUMMARY:Seminar on Theoretical Machine Learning
URL:https://www.ias.edu/node/105986
END:VEVENT
BEGIN:VEVENT
UID:3f95e570-181e-408e-86b5-3619ba9554c3
DTSTAMP:20200401T030202Z
CREATED:20200226T162657Z
DESCRIPTION:Topic: Spectral Independence in High-dimensional Expanders and
Applications to the Hardcore Model\n\nSpeaker: Kuikui Liu\, University of
Washington\n\nVideo: https://video.ias.edu/csdm/2020/0227-KuikuiLiu\n\nWe
say a probability distribution µ is spectrally independent if an associate
d correlation matrix has a bounded largest eigenvalue for the distribution
and all of its conditional distributions. We prove that if µ is spectrall
y independent\, then the corresponding high dimensional simplicial complex
is a local spectral expander. Using a line of recent works on mixing time
of high dimensional walks on simplicial complexes [KM17\; DK17\; KO18\; A
L19]\, this implies that the corresponding Glauber dynamics mixes rapidly
and generates (approximate) samples from µ. As an application\, we show th
at natural Glauber dynamics mixes rapidly (in polynomial time) to generate
a random independent set from the hardcore model up to the uniqueness thr
eshold. This improves the quasi-polynomial running time of Weitz’s determi
nistic correlation decay algorithm [Wei06] for estimating the hardcore par
tition function\, also answering a long-standing open problem of mixing ti
me of Glauber dynamics [LV97\; LV99\; DG00\; Vig01\; Eft+16].\n\nJoint wor
k with Nima Anari and Shayan Oveis Gharan.
DTSTART:20200227T193000Z
DTEND:20200227T203000Z
LAST-MODIFIED:20200228T191311Z
LOCATION:Simonyi Hall 101
SUMMARY:Computer Science/Discrete Mathematics Seminar II
URL:https://www.ias.edu/node/109666
END:VEVENT
BEGIN:VEVENT
UID:11781253-7216-4b99-aa43-2596626eb2f4
DTSTAMP:20200401T030202Z
CREATED:20190607T212006Z
DESCRIPTION:Topic: A p-adic monodromy theorem for de Rham local systems\n\n
Speaker: Koji Shimizu\, Member\, School of Mathematics\n\nVideo: https://v
ideo.ias.edu/puias/2020/0227-KojiShimizu\n\nEvery smooth proper algebraic
variety over a p-adic field is expected to have semistable model after pas
sing to a finite extension. This conjecture is open in general\, but its a
nalogue for Galois representations\, the p-adic monodromy theorem\, is kno
wn. In this talk\, we will explain a generalization of this theorem to eta
le local systems on a smooth rigid analytic variety.
DTSTART:20200227T213000Z
DTEND:20200227T223000Z
LAST-MODIFIED:20200228T191411Z
LOCATION:Simonyi 101
SUMMARY:Joint IAS/Princeton University Number Theory Seminar
URL:https://www.ias.edu/node/101336
END:VEVENT
BEGIN:VEVENT
UID:8475578b-47a1-4fd2-9e2c-53947d3d6390
DTSTAMP:20200401T030202Z
CREATED:20200204T183212Z
DESCRIPTION:Topic: Rectifiability is necessary and sufficient\n\nSpeaker: S
vitlana Mayboroda\, University of Minnesota\n\nWe shall discuss optimal co
nditions on the geometry of the domain responsible for solvability of the
Dirichlet problem\, or absolute continuity of the harmonic measure with re
spect to the Lebesgue measure. In rough terms\, the question is: do Browni
an travelers see pieces of the boundary of the domain according to their L
ebesgue size\, or\, if not\, what do they see? In 1916 F. and M. Riesz pro
ved that in a simply connected planar domain rectifiability is sufficient\
, and over the years this result has been extended to higher dimensions. T
he centerpoint of our discussion will be a recently proved converse to F.&
M. Riesz theorem: rectifiability is also necessary for the absolute conti
nuity of harmonic measure with respect to the Lebesgue measure for $n-1$ d
imensional sets in $R^n$. We shall also touch upon a more general setting
of domains with lower dimensional boundaries.
DTSTART:20200228T203000Z
DTEND:20200228T213000Z
LAST-MODIFIED:20200225T212531Z
LOCATION:Simonyi 101
SUMMARY:Analysis - Mathematical Physics
URL:https://www.ias.edu/node/108871
END:VEVENT
BEGIN:VEVENT
UID:e351c53e-7672-42e2-bfd8-c2df2b23afa2
DTSTAMP:20200401T030202Z
CREATED:20200204T184237Z
DESCRIPTION:Topic: Dimerization and Néel order in different quantum spin
chains through a shared (classical) loop representation\n\nSpeaker: Michae
l Aizenman\, Princeton University\n\nThe spin-S quantum spin chain\, with
a projection-based antiferromagnetic interaction\, and the antiferromagnet
ic XXZ spin-1/2 chain exhibit different forms of translation symmetry brea
king. Yet they are related to a common system of random loops\, which is s
imilar to the collection of the boundaries of a classical\, two dimensiona
l\, Q-state random cluster model. We use these relations to establish the
exact conditions for symmetry breaking\, in a manner similar to the recent
proofs of the discontinuity of the phase transition for Q > 4. The quantu
m transition in each case is from a gapless ground state to a pair of gap
ped and extensively distinct ground states.\n\nJoint work with H. Duminil-
Copin and S. Warzel.
DTSTART:20200228T220000Z
DTEND:20200228T230000Z
LAST-MODIFIED:20200220T201629Z
LOCATION:Simonyi 101
SUMMARY:Analysis - Mathematical Physics
URL:https://www.ias.edu/node/108876
END:VEVENT
BEGIN:VEVENT
UID:5f65ba6f-ad11-4441-a438-c898ebe7f220
DTSTAMP:20200401T030202Z
CREATED:20190607T202005Z
DESCRIPTION:Topic: An Improved Cutting Plane Method for Convex Optimization
\, Convex-Concave Games and its Applications\n\nSpeaker: Zhao Song\, Membe
r\, School of Mathematics\n\nVideo: https://video.ias.edu/csdm/2020/0302-Z
haoSong\n\nGiven a separation oracle for a convex set $K \subset \mathbb{R
}^n$ that is contained in a box of radius $R$\, the goal is to either comp
ute a point in $K$ or prove that $K$ does not contain a ball of radius $\e
psilon$. We propose a new cutting plane algorithm that uses an optimal $O(
n \log (\kappa))$ evaluations of the oracle and an additional $O(n^2)$ tim
e per evaluation\, where $\kappa = nR/\epsilon$.\n\n- This improves upon V
aidya's $O( \text{SO} \cdot n \log (\kappa) + n^{\omega+1} \log (\kappa))$
time algorithm [Vaidya\, FOCS 1989a] in terms of polynomial dependence on
$n$\, where $\omega$ is the exponent of matrix multiplication.\n- This impro
ves upon Lee-Sidford-Wong's $O( \text{SO
} \cdot n \log (\kappa) + n^3 \log^{O(1)} (\kappa))$ time algorithm [Lee\,
Sidford and Wong\, FOCS 2015] in terms of dependence on $\kappa$.\n\nFor
many important applications in economics\, $\kappa = \Omega(\exp(n))$ and
this leads to a significant difference between $\log(\kappa)$ and poly$(\l
og (\kappa))$. We also provide evidence that the $n^2$ time per evaluation
cannot be improved and thus our running time is optimal. A bottleneck of pre
vious cutting plane methods is to compute leverage scores\, a measure of the
 relative importance of past constraints. Our result is achieved by a n
ovel multi-layered data structure for leverage score maintenance\, which i
s a sophisticated combination of diverse techniques such as random project
ion\, batched low-rank update\, inverse maintenance\, polynomial interpola
tion\, and fast rectangular matrix multiplication. Interestingly\, our met
hod requires a combination of different fast rectangular matrix multiplica
tion algorithms. Our algorithm not only works for the classical convex opt
imization setting\, but also generalizes to convex-concave games. We apply
our algorithm to improve the runtimes of many interesting problems\, e.g.
\, Linear Arrow-Debreu Markets\, Fisher Markets\, and Walrasian equilibriu
m.\n\nThis is joint work with Haotian Jiang\, Yin Tat Lee\, and Sam Chiu
-wai Wong.
DTSTART:20200302T160000Z
DTEND:20200302T170000Z
LAST-MODIFIED:20200401T000928Z
LOCATION:Simonyi Hall 101
SUMMARY:Computer Science/Discrete Mathematics Seminar I
URL:https://www.ias.edu/node/100651
END:VEVENT
BEGIN:VEVENT
UID:4d02b6ef-52f5-4e86-a379-f5400bb512c7
DTSTAMP:20200401T030202Z
CREATED:20170413T190001Z
DESCRIPTION:Topic: Lower Bounds in Complexity Theory\, Communication Comple
xity\, and Sunflowers\n\nSpeaker: Toniann Pitassi\, University of Toronto\
; Visiting Professor\, School of Mathematics\n\nVideo: https://video.ias.e
du/members/2020/0302-ToniannPitassi\n\nIn this talk I will discuss the Sun
flower Lemma and similar lemmas that prove (in various contexts) that a se
t/distribution can be partitioned into a structured part and a 'random-loo
king' part. I will introduce communication complexity as a key model for u
nderstanding computation and more generally for reasoning about informatio
n bottlenecks.\n\nI will describe our recent Lifting theorems which prove
in a very general way how efficient communication protocols can be well-ap
proximated by 'simple' protocols. Like Razborov's theorem\, the main compo
nent in the proof is a theorem quite similar to the Sunflower Lemma. We wi
ll show how to use Lifting to reprove Razborov's theorem as well as other
state-of-the-art circuit lower bounds. Time permitting\, we will present o
ther applications of Lifting (such as lower bounds for refuting random CNF
formulas and the resolution of the Alon-Saks-Seymour conjecture) and chal
lenges to proving nonmonotone circuit lower bounds.
DTSTART:20200302T190000Z
DTEND:20200302T200000Z
LAST-MODIFIED:20200401T001044Z
LOCATION:Simonyi Hall 101
SUMMARY:Members' Seminar
URL:https://www.ias.edu/node/73431
END:VEVENT
BEGIN:VEVENT
UID:89890c42-8e29-4645-a5f5-00c9ba2485f9
DTSTAMP:20200401T030202Z
CREATED:20190607T210002Z
DESCRIPTION:Topic: Twisted Calabi-Yau algebras and categories\n\nSpeaker: I
nbar Klang\, Columbia University\n\nThis talk will begin with a discussion
of the string topology category of a manifold M\; this was shown by Cohen
and Ganatra to be equivalent as a Calabi-Yau category to the wrapped Fuka
ya category of T*M. In joint work with Ralph Cohen\, we generalize the Cal
abi-Yau condition from chain complexes to spectra. I'll define our notion
of a twisted Calabi-Yau ring spectrum and discuss examples of interest.
DTSTART:20200302T203000Z
DTEND:20200302T213000Z
LAST-MODIFIED:20200227T193116Z
LOCATION:Princeton University\, Fine 224
SUMMARY:Symplectic Dynamics/Geometry Seminar
URL:https://www.ias.edu/node/100886
END:VEVENT
BEGIN:VEVENT
UID:82ef2508-f52a-4a20-8797-5d0685d5def7
DTSTAMP:20200401T030202Z
CREATED:20190607T213002Z
DESCRIPTION:Topic: Constancy of the dimension for non-smooth spaces with Ri
cci curvature bounded below via regularity of Lagrangian flows\n\nSpeaker:
Elia Bruè\, Scuola Normale Superiore di Pisa\n\nAfter a brief introductio
n to the theory of RCD(K\,N) spaces\, i.e. metric measure spaces with Ricc
i curvature bounded below by K and dimension bounded above by N\, I will p
resent the constancy of the dimension theorem in this setting. This result
generalizes the one obtained by Colding and Naber for Ricci limits. Its p
roof relies on a new regularity result for flow maps of Sobolev velocity f
ields.\n\nThis is based on joint work with Daniele Semola.
DTSTART:20200302T220000Z
DTEND:20200302T230000Z
LAST-MODIFIED:20200219T210506Z
LOCATION:Simonyi Hall 101
SUMMARY:Analysis Seminar
URL:https://www.ias.edu/node/101451
END:VEVENT
BEGIN:VEVENT
UID:c83f737b-d3ea-430c-8572-a198e662801f
DTSTAMP:20200401T030202Z
CREATED:20190607T203002Z
DESCRIPTION:Topic: An introduction to Boolean Function Analysis\n\nSpeaker:
Dor Minzer\, Member\, School of Mathematics\n\nVideo: https://video.ias.e
du/csdm/2020/0303-DorMinzer\n\nWe will discuss some of the basic principle
s and results in the study of Boolean-valued functions over the discrete h
ypercube using discrete Fourier analysis. In particular\, we will talk abo
ut basic concepts\, the hypercontractive inequality and the KKL theorem. T
ime permitting\, we will discuss the Fourier-Entropy Conjecture and mentio
n some recent progress towards it.\n\nThe talk is self-contained and no sp
ecial background will be assumed.
DTSTART:20200303T153000Z
DTEND:20200303T173000Z
LAST-MODIFIED:20200401T001126Z
LOCATION:Simonyi Hall 101
SUMMARY:Computer Science/Discrete Mathematics Seminar II
URL:https://www.ias.edu/node/100706
END:VEVENT
BEGIN:VEVENT
UID:18ad5036-c175-4e8e-9c8b-b3af8b4ac3c0
DTSTAMP:20200401T030202Z
CREATED:20190611T180002Z
DESCRIPTION:Topic: What Noisy Convex Quadratics Tell Us about Neural Net Tr
aining\n\nSpeaker: Roger Grosse\, University of Toronto\; Member\, School
of Mathematics\n\nI’ll discuss the Noisy Quadratic Model\, the toy problem
of minimizing a convex quadratic function with noisy gradient observation
s. While the NQM is simple enough to have closed-form dynamics for a varie
ty of optimizers\, it gives a surprising amount of insight into neural net
training phenomena. First\, we’ll look at the problem of adapting learnin
g rates using meta-descent (i.e. differentiating through the training dyna
mics). The NQM illuminates why short-horizon meta-descent objectives vastl
y underestimate the optimal learning rate. Second\, we’ll study how the be
havior of various neural net optimizers depends on the batch size. When is
it beneficial to use preconditioning? Momentum? Parameter averaging? Lear
ning rate schedules? The NQM can generate predictions in seconds which see
m to capture the qualitative behavior of large-scale classification conv n
ets and transformers.
DTSTART:20200303T170000Z
DTEND:20200303T183000Z
LAST-MODIFIED:20200226T153652Z
LOCATION:White-Levy
SUMMARY:Seminar on Theoretical Machine Learning
URL:https://www.ias.edu/node/101591
END:VEVENT
BEGIN:VEVENT
UID:84409797-42da-4780-9b12-3ef10aaae633
DTSTAMP:20200401T030202Z
CREATED:20191212T132004Z
DESCRIPTION:
DTSTART:20200304T130000Z
DTEND:20200304T220000Z
LAST-MODIFIED:20191212T132004Z
LOCATION:White Levy
SUMMARY:Workshop on Machine Learning\, Theory and Method in the Social Scie
nces
URL:https://www.ias.edu/node/107096
END:VEVENT
BEGIN:VEVENT
UID:30165cbc-c2fc-4248-8b64-d409c259509e
DTSTAMP:20200401T030202Z
CREATED:20190607T211502Z
DESCRIPTION:Topic: Rationality of algebraic varieties\n\nSpeaker: Alexander
Perry\, Member\, School of Mathematics\n\nI will survey what is known abo
ut the rationality of algebraic varieties\, including recent progress and
open questions. There will be a surprising connection to whiskey.
DTSTART:20200304T230000Z
DTEND:20200305T003000Z
LAST-MODIFIED:20200228T151339Z
LOCATION:Dilworth Room
SUMMARY:Mathematical Conversations
URL:https://www.ias.edu/node/100996
END:VEVENT
BEGIN:VEVENT
UID:fd1017ff-b323-46ba-b5b0-161f7cba7392
DTSTAMP:20200401T030202Z
CREATED:20191212T132004Z
DESCRIPTION:
DTSTART:20200305T130000Z
DTEND:20200305T220000Z
LAST-MODIFIED:20191212T132004Z
LOCATION:White Levy
SUMMARY:Workshop on Machine Learning\, Theory and Method in the Social Scie
nces
URL:https://www.ias.edu/node/107101
END:VEVENT
BEGIN:VEVENT
UID:6147b5f8-4d55-4c58-aad4-725f35024afd
DTSTAMP:20200401T030202Z
CREATED:20191218T164502Z
DESCRIPTION:
DTSTART:20200305T130000Z
DTEND:20200305T163000Z
LAST-MODIFIED:20191218T164502Z
LOCATION:West Lecture
SUMMARY:Workshop on Machine Learning\, Theory and Method in the Social Scie
nces/ Alternate
URL:https://www.ias.edu/node/107236
END:VEVENT
BEGIN:VEVENT
UID:4f5f8f39-709e-4232-b93f-cece7ccbbf0e
DTSTAMP:20200401T030202Z
CREATED:20190607T211502Z
DESCRIPTION:
DTSTART:20200305T150000Z
DTEND:20200305T170000Z
LAST-MODIFIED:20191007T174501Z
LOCATION:Simonyi Hall 101
SUMMARY:Working Seminar on Nonabelian Hodge Theory
URL:https://www.ias.edu/node/101106
END:VEVENT
BEGIN:VEVENT
UID:3eed0c20-1d0b-4ee8-be01-4a5108e79ba3
DTSTAMP:20200401T030202Z
CREATED:20191025T210001Z
DESCRIPTION:Topic: Understanding Deep Neural Networks: From Generalization
to Interpretability\n\nSpeaker: Gitta Kutyniok\, Technische Universität Be
rlin\n\nVideo: https://video.ias.edu/machinelearning/2020/0305-GittaKutyni
ok\n\nDeep neural networks have recently seen an impressive comeback with
applications both in the public sector and the sciences. However\, despite
their outstanding success\, a comprehensive theoretical foundation of dee
p neural networks is still missing.\n\nFor deriving a theoretical understa
nding of deep neural networks\, one main goal is to analyze their generali
zation ability\, i.e. their performance on unseen data sets. In case of gr
aph convolutional neural networks\, which are today heavily used\, for ins
tance\, for recommender systems\, already the generalization capability to
signals on graphs unseen in the training set\, typically coined transfera
bility\, has not been thoroughly analyzed yet. As one answer to this quest
ion\, in this talk we will show that spectral graph convolutional neural n
etworks are indeed transferable\, thereby also debunking a common misconce
ption about this type of graph convolutional neural networks.\n\nIf such t
heoretical approaches fail or if one is just given a trained neural networ
k without knowledge of how it was trained\, interpretability approaches be
come necessary. Those aim to 'break open the black box' in the sense of id
entifying those features from the input\, which are most relevant for the
observed output. Aiming to derive a theoretically founded approach to this
problem\, we introduced a novel approach based on rate-distortion theory
coined Rate-Distortion Explanation (RDE)\, which not only provides state-o
f-the-art explanations\, but in addition allows first theoretical insights
into the complexity of such problems. In this talk we will discuss this a
pproach and show that it also gives a precise mathematical meaning to the
previously vague term of relevant parts of the input.
DTSTART:20200305T170000Z
DTEND:20200305T183000Z
LAST-MODIFIED:20200401T001224Z
LOCATION:Dilworth Room
SUMMARY:Seminar on Theoretical Machine Learning
URL:https://www.ias.edu/node/105991
END:VEVENT
BEGIN:VEVENT
UID:4093e040-3c85-4996-95e2-888e779a3851
DTSTAMP:20200401T030202Z
CREATED:20190607T212006Z
DESCRIPTION:Topic: Euler system\, Eisenstein congruences and the p-adic Lan
glands correspondence\n\nSpeaker: Eric Urban\, Columbia University\n\nI wi
ll discuss how the use of the p-adic Langlands correspondence for GL_2(Q_p
) allows one to study Eisenstein congruences of various weights\, levels\, a
nd slopes in order to construct Euler systems. I will discuss the GL(2) case
 in some detail and give some hints on how to treat the symplectic case in t
he ordinary setting.
DTSTART:20200305T213000Z
DTEND:20200305T223000Z
LAST-MODIFIED:20200225T155302Z
LOCATION:Princeton University\, Fine Hall 214
SUMMARY:Joint IAS/Princeton University Number Theory Seminar
URL:https://www.ias.edu/node/101341
END:VEVENT
BEGIN:VEVENT
UID:99afefc5-8aff-4622-9ae6-9efeba5b8f82
DTSTAMP:20200401T030202Z
CREATED:20190607T202005Z
DESCRIPTION:Topic: Learning from Censored and Dependent Data\n\nSpeaker: Co
nstantinos Daskalakis \, Massachusetts Institute of Technology\; Member\,
School of Mathematics\n\nVideo: https://video.ias.edu/csdm/2020/0309-Const
antinosDaskalakis\n\nMachine Learning is invaluable for extracting insight
s from large volumes of data. A key assumption enabling many methods\, how
ever\, is having access to training data comprising independent observatio
ns from the entire distribution of relevant data. In practice\, data is co
mmonly missing due to measurement limitations\, legal restrictions\, or da
ta collection and sharing practices. Moreover\, observations are commonly
collected on a network\, a spatial or a temporal domain and may be intrica
tely dependent. Training on data that is censored or dependent is known to
lead to Machine Learning models that are biased.\n\nIn this talk\, we ove
rview recent work on learning from censored and dependent data. We propose
a learning framework which is broadly applicable\, and instantiate this f
ramework to obtain computationally and statistically efficient methods for
linear\, and logistic regression from censored or dependent samples\, in
high dimensions. Our findings are enabled through connections to Statistic
al Physics\, Concentration and Anti-concentration of measure\, and propert
ies of Stochastic Gradient Descent\, and advance some classical challenges
in Statistics and Econometrics. (I will overview works with Dagan\, Dikka
la\, Gouleakis\, Ilyas\, Jayanti\, Kontonis\, Panageas\, Rohatgi\, Tzamos\
, Zampetakis)
DTSTART:20200309T150000Z
DTEND:20200309T160000Z
LAST-MODIFIED:20200401T001326Z
LOCATION:Simonyi Hall 101
SUMMARY:Computer Science/Discrete Mathematics Seminar I
URL:https://www.ias.edu/node/100656
END:VEVENT
BEGIN:VEVENT
UID:0637734a-ced9-4633-938f-c918d5e19768
DTSTAMP:20200401T030202Z
CREATED:20170413T190001Z
DESCRIPTION:Topic: Towards a mathematical model of the brain\n\nSpeaker: La
i-Sang Young\, New York University\; Distinguished Visiting Professor\, Sc
hool of Mathematics & Natural Sciences\n\nVideo: https://video.ias.edu/mem
bers/2020/0309-Lai-SangYoung\n\nStriving to make contact with mathematics
and to be consistent with neuroanatomy at the same time\, I propose an ide
alized picture of the cerebral cortex consisting of a hierarchical network
of brain regions each further subdivided into interconnecting layers not
unlike those in artificial neural networks. Each layer is idealized as a 2
D sheet of neurons\, spatially homogeneous with primarily local interactio
ns\, a setup reminiscent of that in statistical mechanics. Zooming into lo
cal circuits\, one gets into the domain of dynamical systems. Here the dyn
amics are characterized by competition and balance between two 'opposing'
groups of agents (Excitatory and Inhibitory neurons)\, trying to negotiate
local dynamic equilibria in response to spatially inhomogeneous external
stimuli. I will illustrate some of these ideas using a biologically realis
tic model of the monkey visual cortex\, built and benchmarked to reproduce
visual phenomena with the ultimate aim of explaining cortical mechanisms.
\n\nThe modeling work discussed is in collaboration with Bob Shapley and L
ogan Chariker.
DTSTART:20200309T180000Z
DTEND:20200309T190000Z
LAST-MODIFIED:20200401T001514Z
LOCATION:Simonyi Hall 101
SUMMARY:Members' Seminar
URL:https://www.ias.edu/node/73436
END:VEVENT
BEGIN:VEVENT
UID:e231c4ce-43f0-4072-af32-6b0b039335e3
DTSTAMP:20200401T030202Z
CREATED:20190607T210002Z
DESCRIPTION:Topic: Packing and squeezing Lagrangian tori\n\nSpeaker: Richar
d Hind\, University of Notre Dame\n\nVideo: https://video.ias.edu/sympdyng
eo/2020/0309-RichardHind\n\nWe will ask how many Lagrangian tori\, say wit
h an integral area class\, can be `packed' into a given symplectic manifol
d. Similarly\, given an arrangement of such tori\, like the integral produ
ct tori in Euclidean space\, one can ask about the symplectic size of the
complement. The talk will describe some constructions of balls and Lagrang
ian tori which show the size is larger than expected.\n\nThis is based on
joint work with Ely Kerman.
DTSTART:20200309T193000Z
DTEND:20200309T203000Z
LAST-MODIFIED:20200401T001440Z
LOCATION:Simonyi Hall 101
SUMMARY:Symplectic Dynamics/Geometry Seminar
URL:https://www.ias.edu/node/100891
END:VEVENT
BEGIN:VEVENT
UID:efee79ca-1ce1-46ef-9999-c7f79af7d9fa
DTSTAMP:20200401T030202Z
CREATED:20190607T213002Z
DESCRIPTION:Topic: Higher order rectifiability and Reifenberg parametrizati
ons\n\nSpeaker: Silvia Ghinassi\, Member\, School of Mathematics\n\nVideo:
https://video.ias.edu/analysis/2020/0309-SilviaGhinassi\n\nWe provide geo
metric sufficient conditions for Reifenberg flat sets of any integer dimen
sion in Euclidean space to be parametrized by a Lipschitz map with Hölder
derivatives. The conditions use a Jones type square function and all state
ments are quantitative in that the Hölder and Lipschitz constants of the p
arametrizations depend on such a function. We use these results to prove s
ufficient conditions for higher order rectifiability of sets and measures.
Key tools for the proof come from Guy David and Tatiana Toro’s parametriz
ation of Reifenberg flat sets in the Hölder and Lipschitz categories. If t
ime allows\, we will discuss some related work in progress and an example
that shows that the conditions are not necessary.
DTSTART:20200309T210000Z
DTEND:20200309T220000Z
LAST-MODIFIED:20200401T001409Z
LOCATION:Simonyi Hall 101
SUMMARY:Analysis Seminar
URL:https://www.ias.edu/node/101456
END:VEVENT
BEGIN:VEVENT
UID:cbcdcb17-26ed-4cfb-a902-9c4dc7c7e436
DTSTAMP:20200401T030202Z
CREATED:20190607T203002Z
DESCRIPTION:Topic: Introduction to high dimensional expanders \n\nSpeaker:
Irit Dinur\, Weizmann Institute of Science\; Visiting Professor\, School o
f Mathematics\n\nVideo: https://video.ias.edu/csdm/2020/0310-IritDinur\n\n
High dimensional expansion generalizes edge and spectral expansion in grap
hs to hypergraphs (viewed as higher dimensional simplicial complexes). It
is a tool that allows analysis of PCP agreement tests\, mixing of Markov c
hains\, and construction of new error correcting codes. My talk will be de
voted to proving some nice relations between local and global expansion of
these objects.
DTSTART:20200310T143000Z
DTEND:20200310T163000Z
LAST-MODIFIED:20200401T001557Z
LOCATION:Simonyi Hall 101
SUMMARY:Computer Science/Discrete Mathematics Seminar II
URL:https://www.ias.edu/node/100701
END:VEVENT
BEGIN:VEVENT
UID:e916bf36-5d22-4e6d-8f1e-f7b122777400
DTSTAMP:20200401T030202Z
CREATED:20200310T135432Z
DESCRIPTION:Topic: Your Brain on Energy-Based Models: Applying and Scaling
EBMs to Problems of Interest to the Machine Learning Community Today\n\nSp
eaker: Will Grathwohl\, University of Toronto\n\nVideo: https://video.ias.
edu/machinelearning/2020/0310-WillGrathwohl\n\nIn this talk\, I will discu
ss my two recent works on Energy-Based Models. In the first work\, I discu
ss how we can reinterpret standard classification architectures as class c
onditional energy-based models and train them using recently proposed meth
ods for large-scale EBM training. We find that adding EBM training in this
way provides many benefits while negligibly affecting discriminative perf
ormance\, contrary to other hybrid generative/discriminative modeling appr
oaches. These benefits include improved calibration\, out-of-distribution
detection\, and robustness to adversarial examples.\n\nWhile methods for t
raining EBMs at scale have improved drastically\, they still lag behind ot
her classes of generative models such as flows and GANs. Further\, there h
as been little work on evaluating unnormalized models and comparing them w
ith other model classes. My next work addresses these issues with the Stei
n Discrepancy (SD). The SD is a measure of distance between two distributi
ons that is defined using samples from one distribution and an unnormalize
d model for the other. I explore how the Stein discrepancy can be estimate
d in practice at scale and demonstrate applications related to goodness-of
-fit testing and unnormalized model evaluation. Next\, I show how my appro
ach for SD estimation can be turned into a GAN-like training objective for
EBMs which scales to high-dimensional data more gracefully than previous
approaches.
DTSTART:20200310T160000Z
DTEND:20200310T173000Z
LAST-MODIFIED:20200401T001623Z
LOCATION:Dilworth Room
SUMMARY:Seminar on Theoretical Machine Learning
URL:https://www.ias.edu/node/110101
END:VEVENT
BEGIN:VEVENT
UID:08f50863-0c7b-4280-aba0-7dd62817d353
DTSTAMP:20200401T030202Z
CREATED:20200310T151124Z
DESCRIPTION:Topic: Improved Bounds on Minimax Regret under Logarithmic Loss
 via Self-Concordance\n\nSpeaker: Blair Bilodeau\, University of Toronto\n
\nWe study sequential probabilistic prediction on data sequences which are
not i.i.d.\, and even potentially generated by an adversary. At each roun
d\, the player assigns a probability distribution to possible outcomes and
incurs the log-likelihood of the observed outcome under said distribution
. Without assumptions on the data-generating process\, one cannot control
the total loss\, and so it is standard to study the regret\, i.e.\, the di
fference between the player’s total loss and the total loss of the single
best forecaster in some class of “experts”. We present a novel approach to
the open problem of bounding the minimax regret under logarithmic loss fo
r arbitrarily large expert classes by exploiting the self-concordance prop
erty of logarithmic loss. Our regret bound depends on the metric entropy o
f the expert class and matches previous best known results for arbitrary e
xpert classes. We observe a polynomial improvement in dependence on the ti
me horizon for high-dimensional classes.
DTSTART:20200311T200000Z
DTEND:20200311T213000Z
LAST-MODIFIED:20200310T151124Z
LOCATION:Simonyi 101
SUMMARY:Seminar on Theoretical Machine Learning
URL:https://www.ias.edu/node/110106
END:VEVENT
BEGIN:VEVENT
UID:e0f0c15e-6107-416c-bd38-2b93c22dbd32
DTSTAMP:20200401T030202Z
CREATED:20200228T182702Z
DESCRIPTION:Topic: Gauge theory and low-dimensional topology\n\nSpeaker: Bo
yu Zhang\, Princeton University\n\nGauge theory studies partial differenti
al equations with a large group of local symmetries\, and it is the geomet
ric language to formulate many fundamental physical phenomena. Starting in
the 1980s\, mathematicians began to unravel surprising connections betwee
n gauge theory and low-dimensional topology\, and new developments continu
e to emerge. In this talk\, we will discuss some applications of gauge the
ory to the study of three and four-dimensional manifolds and to knot theor
y.
DTSTART:20200311T220000Z
DTEND:20200311T233000Z
LAST-MODIFIED:20200304T183741Z
LOCATION:Dilworth Room
SUMMARY:Mathematical Conversations
URL:https://www.ias.edu/node/109816
END:VEVENT
BEGIN:VEVENT
UID:a601a14e-b785-42bb-987e-0090594ab7a8
DTSTAMP:20200401T030202Z
CREATED:20190607T211502Z
DESCRIPTION:
DTSTART:20200312T140000Z
DTEND:20200312T160000Z
LAST-MODIFIED:20191007T174501Z
LOCATION:Simonyi Hall 101
SUMMARY:Working Seminar on Nonabelian Hodge Theory
URL:https://www.ias.edu/node/101111
END:VEVENT
BEGIN:VEVENT
UID:b61b8b07-d922-4326-a7b2-ad9c48fbba65
DTSTAMP:20200401T030202Z
CREATED:20200311T192547Z
DESCRIPTION:Topic: An explicit realization of stable L-packets on (split) c
lassical groups\n\nSpeaker: David Soudry\, Tel Aviv University\n\n**Please
note: All in-person IAS seminars have been cancelled. Seminar will be liv
estreamed. Please follow link: https://youtu.be/vZz1SFWHiog\n\nRecently\,
Cai\, Friedberg\, Ginzburg and Kaplan generalized the doubling method of P
iatetski-Shapiro and Rallis. They found global integrals\, which represent
the standard L-function for pairs of irreducible\, automorphic\, cuspidal
representations \pi - on a (split) classical group G\, and \tau - on GL(n
). The representation \pi need not have any particular model (such as a Wh
ittaker model\, or Bessel model\, etc.). These integrals suggest an explic
it descent map (an inverse to Langlands functorial lift) from GL(n) to G (
appropriate G). I will show that a certain Fourier coefficient applied to
a residual Eisenstein series\, induced from a Speh representation\, corres
ponding to a self-dual \tau\, is equal to the direct sum of irreducible cu
spidal representations \sigma \otimes \sigma'\, on G x G \, where \sigma r
uns over all irreducible cuspidal representations\, which lift to \tau (\s
igma' is the complex conjugate of an outer conjugation of \sigma). This is
a joint work with David Ginzburg.
DTSTART:20200312T203000Z
DTEND:20200312T213000Z
LAST-MODIFIED:20200312T171540Z
LOCATION:Seminar cancelled. Event to be livestreamed. Link posted by 3:30P
M
SUMMARY:Joint IAS/Princeton University Number Theory Seminar
URL:https://www.ias.edu/node/110111
END:VEVENT
BEGIN:VEVENT
UID:200526d6-720f-4efa-91a4-a3f059c3f2dc
DTSTAMP:20200401T030202Z
CREATED:20200313T194133Z
DESCRIPTION:Topic: Feature purification: How adversarial training can perfo
rm robust deep learning\n\nSpeaker: Yuanzhi Li\, Carnegie Mellon Universit
y\n\nVideo: https://video.ias.edu/csdm/2020/0316-YuanzhiLi\n\nTo connect t
o the CSDM Seminar via Zoom\, please do the following:\n1 - If you have Zo
om installed on your device\, enter the following meeting ID: 360043913\,
click Join Meeting.\n2 - If you do not have Zoom installed\, click the fo
llowing link to download and install: https://theias.zoom.us/j/360043913\n
3 - Once installed\, click the very same link to connect to the meeting.\n
\nWhy can deep learning models\, trained on many machine learning tasks\, obta
in nearly perfect predictions on unseen data sampled from the same distributi
on\, yet be extremely vulnerable to small perturbations of the input? How ca
n adversarial training improve the robustness of the neural networ
ks over such perturbations? In this work\, we developed a new principle ca
lled 'feature purification'. Mathematically\, the principle states that f
or a multi-layer ReLU neural network\, let w_i^0 be the weight of the i-th
neuron at initialization\, w_i be its weight after clean training (starti
ng from w_i^0)\, and w_i' as its weight after adversarial training (starti
ng from w_i)\, then for most of the neurons\, there are scaling factors \b
eta_i and vectors v_i such that w_i = \beta_i w_i' + v_i with ||\beta_i w_
i' || > ||v_i||\, while both w_i and w_i' have nearly zero correlations wi
th w_i^0.\n\nConceptually\, the principle says that while both clean train
ing and adversarial training will discover certain features fundamentally
different from the initial values\, clean training actually already discov
ers a big portion of the 'robust features' obtained by adversarial trainin
g. Thus\, instead of needing to learn new 'robust features' or completely
remove 'non-robust features'\, adversarial training can simply robustify t
he neural network by purifying the clean trained features.\n\nIn this work
\, we present both experiments demonstrating the principle\, and a mathema
tical model formally proving it. In particular\, we show that for certain
binary classification tasks\, when we train a two-layer ReLU network using
SGD\, (1). Both clean training and adversarial training will learn featur
es with nearly zero correlations with the (random) initialization. (2). Af
ter clean training\, the network will provably achieve > 99% clean accurac
y but 99% robust accuracy in polynomial samples and running time by simply
'purifying' the clean trained features. Our lower bound that clean traine
d network has\n\nOur work also sheds light on why 'adversarial examples are
not bugs': They are not because neural networks are over-fitting to the da
ta set due to the high complexity of the models with insufficiently many t
raining data\, but rather are an intrinsic property of the gradient-descen
t type training algorithms.
DTSTART:20200316T150000Z
DTEND:20200316T160000Z
LAST-MODIFIED:20200401T001742Z
LOCATION:https://theias.zoom.us/j/360043913
SUMMARY:Computer Science/Discrete Mathematics Seminar I
URL:https://www.ias.edu/node/110126
END:VEVENT
BEGIN:VEVENT
UID:cd96fe2a-aead-4c20-a353-bb86febb2f32
DTSTAMP:20200401T030202Z
CREATED:20190607T203002Z
DESCRIPTION:Topic: Sharp Thresholds and Extremal Combinatorics\n\nSpeaker:
Dor Minzer\, Member\, Institute for Advanced Study\n\nVideo: https://video
.ias.edu/csdm/2020/0317-DorMinzer\n\nTo connect to the CSDM Seminar via Zo
om\, please do the following:\n1 - If you have Zoom installed on your devi
ce\, enter the following meeting ID: 360043913\, click Join Meeting.\n2 -
If you do not have Zoom installed\, click the following link to download
and install: https://theias.zoom.us/j/360043913\n3 - Once installed\, clic
k the very same link to connect to the meeting.\n\nConsider the p-biased d
istribution over $\{0\,1\}^n$\, in which each coordinate is independently sa
mpled according to a $p$-biased bit. A sharp-threshold result studies the
behavior of Boolean functions over the hypercube under different p-biased
measures\, and in particular whether the function experiences a phase tran
sition between two\, close p's. While the theory of sharp-thresholds is we
ll understood for p's that are bounded away from 0 and 1\, it is much less
so for values of p that are close to 0 or 1.\n\nIn this talk\, we will fi
rst discuss classical sharp-threshold results\, and demonstrate an applica
tion of them in Extremal Combinatorics [Dinur-Friedgut]. We will then disc
uss newer sharp-threshold results. Time permitting\, we will mention appli
cations to two problems in extremal combinatorics: the Erdos matching conj
ecture\, and the problem of determining the largest family of vectors in [
m]^n that avoids a fixed\, constant-sized intersection.\n\nBased on joint
works with Peter Keevash\, Noam Lifshitz and Eoin Long.
DTSTART:20200317T143000Z
DTEND:20200317T163000Z
LAST-MODIFIED:20200401T001815Z
LOCATION:https://theias.zoom.us/j/360043913
SUMMARY:Computer Science/Discrete Mathematics Seminar II
URL:https://www.ias.edu/node/100696
END:VEVENT
BEGIN:VEVENT
UID:db3ab2c6-292e-49d0-a104-f88cc3d39d0c
DTSTAMP:20200401T030202Z
CREATED:20191203T183001Z
DESCRIPTION:Topic: *Cancelled\n\n
DTSTART:20200317T160000Z
DTEND:20200317T173000Z
LAST-MODIFIED:20200316T174943Z
LOCATION:
SUMMARY:Seminar on Theoretical Machine Learning
URL:https://www.ias.edu/node/106931
END:VEVENT
BEGIN:VEVENT
UID:e621c2cb-7f46-464e-a3e4-21ff07bf315f
DTSTAMP:20200401T030202Z
CREATED:20190607T211502Z
DESCRIPTION:
DTSTART:20200319T140000Z
DTEND:20200319T160000Z
LAST-MODIFIED:20200316T134755Z
LOCATION:
SUMMARY:Working Seminar on Nonabelian Hodge Theory
URL:https://www.ias.edu/node/101116
END:VEVENT
BEGIN:VEVENT
UID:117dddbd-f97b-4b6f-8a73-779671a11826
DTSTAMP:20200401T030202Z
CREATED:20191025T210001Z
DESCRIPTION:Topic: *Cancelled\n\n
DTSTART:20200319T160000Z
DTEND:20200319T173000Z
LAST-MODIFIED:20200316T175419Z
LOCATION:
SUMMARY:Seminar on Theoretical Machine Learning
URL:https://www.ias.edu/node/106001
END:VEVENT
BEGIN:VEVENT
UID:f9157e23-081a-4c70-8ff9-6fad299d0fa4
DTSTAMP:20200401T030202Z
CREATED:20200204T185406Z
DESCRIPTION:Topic: *Cancelled\n\n
DTSTART:20200320T190000Z
DTEND:20200320T200000Z
LAST-MODIFIED:20200316T175347Z
LOCATION:Simonyi 101
SUMMARY:Analysis - Mathematical Physics
URL:https://www.ias.edu/node/108881
END:VEVENT
BEGIN:VEVENT
UID:c0e35492-ae1b-4923-8929-a9156789a9ab
DTSTAMP:20200401T030202Z
CREATED:20200204T185902Z
DESCRIPTION:Topic: *Cancelled\n\n
DTSTART:20200320T210000Z
DTEND:20200320T220000Z
LAST-MODIFIED:20200316T175516Z
LOCATION:Simonyi 101
SUMMARY:Analysis - Mathematical Physics
URL:https://www.ias.edu/node/108886
END:VEVENT
BEGIN:VEVENT
UID:12b1df70-fe8e-4649-bc67-b9d0a1937b83
DTSTAMP:20200401T030202Z
CREATED:20190607T202005Z
DESCRIPTION:Topic: Optimal tiling of the Euclidean space using symmetric bodie
s\n\nSpeaker: Mark Braverman\, Princeton University\n\nWe say a body B til
es the space R^n if the shifted bodies (B+z)\, for z\in Z^n\, are all disj
oint and cover R^n.\n\nIn this talk\, we consider the problem of finding t
he least surface area a tiling body B can have\, for bodies B symmetric un
der coordinate permutation. This problem arises naturally in the study of
parallel repetition theorems\, which are an important component in many ha
rdness of approximation results. For general bodies\, the isoperimetric ine
quality implies that any tiling body B must have surface area at least \sq
rt{n}\, and somewhat surprisingly this bound is asymptotically tight.\n\nW
ith the symmetry requirement\, the trivial upper bound is O(n) for the uni
t cube\, and the trivial lower bound is still \sqrt{n}. In this talk we sh
ow that the answer for the symmetric case is \Theta(n/\sqrt{\log n}).\n\nW
e will discuss connections to the parallel repetition theorem. Specificall
y\, while the strong parallel repetition conjecture was refuted by Raz in
2008\, our result suggests that there might be important special cases whe
re it still applies.\n\nJoint work with Dor Minzer
DTSTART:20200323T150000Z
DTEND:20200323T160000Z
LAST-MODIFIED:20200323T105516Z
LOCATION:https://theias.zoom.us/j/360043913
SUMMARY:Computer Science/Discrete Mathematics Seminar I
URL:https://www.ias.edu/node/100666
END:VEVENT
BEGIN:VEVENT
UID:0b06fdb1-7b9d-4175-a947-dd0001df7336
DTSTAMP:20200401T030202Z
CREATED:20190607T210002Z
DESCRIPTION:Topic: *Cancelled\n\n
DTSTART:20200323T193000Z
DTEND:20200323T203000Z
LAST-MODIFIED:20200316T175934Z
LOCATION:Simonyi Hall 101
SUMMARY:Symplectic Dynamics/Geometry Seminar
URL:https://www.ias.edu/node/100901
END:VEVENT
BEGIN:VEVENT
UID:3b6a2ee9-ffc1-46c5-93cd-6b70af3df077
DTSTAMP:20200401T030202Z
CREATED:20190607T214502Z
DESCRIPTION:Topic: *Cancelled\n\n
DTSTART:20200323T210000Z
DTEND:20200323T220000Z
LAST-MODIFIED:20200316T180024Z
LOCATION:Simonyi Hall 101
SUMMARY:Analysis Seminar
URL:https://www.ias.edu/node/101466
END:VEVENT
BEGIN:VEVENT
UID:4dbf99a2-9537-46dc-b26b-1d705161ab8c
DTSTAMP:20200401T030202Z
CREATED:20190607T203002Z
DESCRIPTION:Topic: High dimensional expansion and agreement testing\n\nSpea
ker: Irit Dinur\, Weizmann Institute of Science\; Visiting Professor\, Sch
ool of Mathematics\n\nVideo: https://video.ias.edu/csdm/2020/0324-IritDinu
r\n\nIn this talk I will describe the notion of 'agreement tests' that are
motivated by PCPs but stand alone as a combinatorial property-testing que
stion. I will show that high dimensional expanders support agreement tests
\, thereby derandomizing direct product tests in a very strong way.
DTSTART:20200324T143000Z
DTEND:20200324T163000Z
LAST-MODIFIED:20200401T001920Z
LOCATION:https://theias.zoom.us/j/360043913
SUMMARY:Computer Science/Discrete Mathematics Seminar II
URL:https://www.ias.edu/node/100691
END:VEVENT
BEGIN:VEVENT
UID:d8169b15-3f58-4eae-b15a-1ea3838fc20a
DTSTAMP:20200401T030202Z
CREATED:20200219T154157Z
DESCRIPTION:Topic: *Cancelled\n\n
DTSTART:20200325T220000Z
DTEND:20200325T233000Z
LAST-MODIFIED:20200319T133854Z
LOCATION:
SUMMARY:Mathematical Conversations
URL:https://www.ias.edu/node/109436
END:VEVENT
BEGIN:VEVENT
UID:a62a071b-2004-43bf-af6b-a0a344b1f074
DTSTAMP:20200401T030202Z
CREATED:20190607T211502Z
DESCRIPTION:Topic: *Cancelled\n\n
DTSTART:20200326T140000Z
DTEND:20200326T160000Z
LAST-MODIFIED:20200319T133931Z
LOCATION:Simonyi Hall 101
SUMMARY:Working Seminar on Nonabelian Hodge Theory
URL:https://www.ias.edu/node/101121
END:VEVENT
BEGIN:VEVENT
UID:dc5a79fc-fb60-4fe2-b7af-0967d18182a9
DTSTAMP:20200401T030202Z
CREATED:20191025T210001Z
DESCRIPTION:Topic: Margins\, perceptrons\, and deep networks\n\nSpeaker: Ma
tus Telgarsky\, University of Illinois\n\nVideo: https://video.ias.edu/mac
hinelearning/2020/0326-MatusTelgarsky\n\nThis talk surveys the role of mar
gins in the analysis of deep networks. As a concrete highlight\, it sketch
es a perceptron-based analysis establishing that shallow ReLU networks can
achieve small test error even when they are quite narrow\, sometimes even
logarithmic in the sample size and inverse target error. The analysis and
bounds depend on a certain nonlinear margin quantity due to Nitanda and S
uzuki\, and can lead to tight upper and lower sample complexity bounds.\n
\nJoint work with Ziwei Ji.
DTSTART:20200326T160000Z
DTEND:20200326T173000Z
LAST-MODIFIED:20200401T001950Z
LOCATION:https://illinois.zoom.us/j/741628827
SUMMARY:Seminar on Theoretical Machine Learning
URL:https://www.ias.edu/node/106006
END:VEVENT
BEGIN:VEVENT
UID:5c2f0682-d6b4-4fb9-be9d-6645d52c79cb
DTSTAMP:20200401T030202Z
CREATED:20200324T130757Z
DESCRIPTION:Topic: Wiles defect for Hecke algebras that are not complete in
tersections\n\nSpeaker: Chandrashekhar Khare\, University of California\n
\nIn his work on modularity theorems\, Wiles proved a numerical criterion
for a map of rings R->T to be an isomorphism of complete intersections. In
addition to proving modularity theorems\, this numerical criterion also i
mplies a connection between the order of a certain Selmer group and a spec
ial value of an L-function.\n\nIn this talk I will consider the case of a
Hecke algebra acting on the cohomology of a Shimura curve associated to a qua
ternion algebra. In this case\, one has an analogous map of rings R->T whi
ch is known to be an isomorphism\, but in many cases the rings R and T fai
l to be complete intersections. This means that Wiles's numerical criterio
n will fail to hold.\n\nI will describe a method for precisely computing t
he extent to which the numerical criterion fails (i.e. the 'Wiles defect')
at a newform f which gives rise to an augmentation T -> Z_p. The defect t
urns out to be determined entirely by local information at the primes q di
viding the discriminant of the quaternion algebra at which the mod p repre
sentation arising from f is ``trivial''. (For instance if f corresponds to
a semistable elliptic curve\, then the local defect at q is related to th
e ``tame regulator'' of the Tate period of the elliptic curve at q.)\n\nTh
is is joint work with Gebhard Boeckle and Jeffrey Manning.
DTSTART:20200326T203000Z
DTEND:20200326T213000Z
LAST-MODIFIED:20200325T165543Z
LOCATION:https://theias.zoom.us/j/280491607
SUMMARY:Joint IAS/Princeton University Number Theory Seminar
URL:https://www.ias.edu/node/110231
END:VEVENT
BEGIN:VEVENT
UID:779c1492-2bce-45bf-af33-5f14b07219d0
DTSTAMP:20200401T030202Z
CREATED:20200327T115425Z
DESCRIPTION:Topic: Fragmentation pseudo-metrics and Lagrangian submanifolds
\n\nSpeaker: Octav Cornea\, Université de Montréal\n\nVideo: https://vide
o.ias.edu/Symplectic/2020/0327-OctavCornea\n\nThe purpose of the talk is t
o discuss a class of pseudo-metrics that can be defined on the set of obje
cts of a triangulated category whose morphisms are endowed with a notion o
f weight. In case the objects are Lagrangian submanifolds (possibly immers
ed) there are some natural ways to define such pseudo-metrics and\, if t
he class of Lagrangian submanifolds is unobstructed\, these pseudo-metrics
are non-degenerate and extend in a natural way the Hofer distance.\n\nThe
talk is based on joint work with P. Biran and with E. Shelukhin.
DTSTART:20200327T130000Z
DTEND:20200327T140000Z
LAST-MODIFIED:20200401T002049Z
LOCATION:https://zoom.us/j/496042680
SUMMARY:Symplectic Seminar
URL:https://www.ias.edu/node/110276
END:VEVENT
BEGIN:VEVENT
UID:7e03b2b1-6b3d-4350-95ae-18d54f6f21ad
DTSTAMP:20200401T030202Z
CREATED:20190607T202005Z
DESCRIPTION:Topic: CSPs with Global Modular Constraints: Algorithms and Har
dness via Polynomial Representations\n\nSpeaker: Sivakanth Gopi\n\nVideo:
https://video.ias.edu/csdm/2020/0330-SivakanthGopi\n\nA theorist's dream i
s to show that hard instances/obstructions for an (optimal) algorithm can
be used as gadgets to prove tight hardness reductions (which proves optima
lity of the algorithm). An example of such a result is that of Prasad Ragh
avendra who showed that for any constraint satisfaction problem (CSP)\, th
ere is an SDP which achieves the best possible approximation factor assumi
ng UGC. We show that a similar phenomenon occurs in CSPs with global modul
ar constraints.\n\nA global modular constraint is a linear equation (in al
l the variables) modulo M for some fixed constant M. Take any polytime sol
vable Boolean CSP like 2SAT or LIN_2 or HORNSAT. Let's look at LIN_2 for c
oncreteness: it is a system of linear equations modulo 2. By Gaussian eli
mination over F_2\, we can find in polynomial time if it is satisfiable. N
ow suppose we are given an additional linear equation modulo M (for some f
ixed constant M not equal to 2). Can we find in polynomial time if the new
system is satisfiable? Surprisingly\, we show that the answer depends on
the prime factorization of M! For example\, it is possible for M=3\, but n
ot for M=15. We show that for such problems\, the obstructions to a natura
l algorithm and gadgets useful for hardness reduction (assuming ETH) are c
losely connected to complexity (like degree or sparsity) of polynomial rep
resentations of OR mod M. Thus\, showing good lower bounds on the complexi
ty of such polynomials implies good algorithms\, and constructing low-comp
lexity representations implies good hardness results. We also show some connection
s to submodular minimization with global modular constraints.\n\nJoint wor
k with Joshua Brakensiek and Venkatesan Guruswami.
DTSTART:20200330T150000Z
DTEND:20200330T160000Z
LAST-MODIFIED:20200401T002203Z
LOCATION:https://theias.zoom.us/j/360043913
SUMMARY:Computer Science/Discrete Mathematics Seminar I
URL:https://www.ias.edu/node/100671
END:VEVENT
BEGIN:VEVENT
UID:9b31d261-7b90-4f09-91a9-e629de20995f
DTSTAMP:20200401T030202Z
CREATED:20190607T203002Z
DESCRIPTION:Topic: High dimensional expansion and agreement testing\n\nSpea
ker: Irit Dinur\, Weizmann Institute of Science\; Visiting Professor\, Sch
ool of Mathematics\n\nVideo: https://video.ias.edu/csdm/2020/0331-IritDinu
r\n\nIn this talk I will describe the notion of 'agreement tests' that are
motivated by PCPs but stand alone as a combinatorial property-testing que
stion. I will show that high dimensional expanders support agreement tests
\, thereby derandomizing direct product tests in a very strong way.
DTSTART:20200331T143000Z
DTEND:20200331T163000Z
LAST-MODIFIED:20200401T002303Z
LOCATION:https://theias.zoom.us/j/360043913
SUMMARY:Computer Science/Discrete Mathematics Seminar II
URL:https://www.ias.edu/node/100686
END:VEVENT
BEGIN:VEVENT
UID:f63fb3f4-9f6b-41a9-b4d1-311e85386726
DTSTAMP:20200401T030202Z
CREATED:20200326T132006Z
DESCRIPTION:Topic: Some Recent Insights on Transfer Learning\n\nSpeaker: Sa
mory Kpotufe\, Columbia University\; Member\, School of Mathematics\n\nVid
eo: https://video.ias.edu/machinelearning/2020/0331-SamoryKpotufe\n\nA com
mon situation in Machine Learning is one where training data is not fully
representative of a target population due to bias in the sampling mechanis
m or high costs in sampling the target population\; in such situations\, w
e aim to ’transfer’ relevant information from the training data (a.k.a. so
urce data) to the target application. How much information is in the sourc
e data? How much target data should we collect if any? These are all pract
ical questions that depend crucially on 'how far' the source domain is fro
m the target. However\, how to properly measure 'distance' between source
and target domains remains largely unclear.\n\nIn this talk we will argue
that much of the traditional notions of 'distance' (e.g. KL-divergence\, e
xtensions of TV such as D_A discrepancy\, density-ratios\, Wasserstein dis
tance) can yield an over-pessimistic picture of transferability. Instead\,
we show that some new notions of 'relative dimension' between source and
target (which we simply term 'transfer-exponents') capture a continuum fro
m easy to hard transfer. Transfer-exponents uncover a rich set of situatio
ns where transfer is possible even at fast rates\, help answer questions
such as the benefit of unlabeled or labeled target data\, yield a sense o
f optimal vs suboptimal transfer heuristics\, and have interesting implica
tions for related problems such as multi-task learning.\n\nFinally\, trans
fer-exponents provide guidance as to *how* to efficiently sample target da
ta so as to guarantee improvement over source data alone. We illustrate th
ese new insights through various simulations on controlled data\, and on t
he popular CIFAR-10 image dataset.\n\nThe talk is based on work with Guill
aume Martinet\, and ongoing work with Steve Hanneke.
DTSTART:20200331T160000Z
DTEND:20200331T173000Z
LAST-MODIFIED:20200401T002231Z
LOCATION:https://theias.zoom.us/j/384099138
SUMMARY:Seminar on Theoretical Machine Learning
URL:https://www.ias.edu/node/110246
END:VEVENT
BEGIN:VEVENT
UID:9fb2e8af-8fb0-4983-81d0-9fe453c3ec4e
DTSTAMP:20200401T030202Z
CREATED:20191025T210001Z
DESCRIPTION:Topic: Learning Controllable Representations\n\nSpeaker: Richar
d Zemel\, University of Toronto\; Member\, School of Mathematics\n\nAs dee
p learning systems become more prevalent in real-world applications it is
essential to allow users to exert more control over the system. Exerting s
ome structure over the learned representations enables users to manipulate
\, interpret\, and even obfuscate the representations\, and may also impro
ve out-of-distribution generalization. In this talk I will discuss recent
work that makes some steps towards these goals\, aiming to represent the i
nput in a factorized form\, with dimensions of the latent space partitione
d into task-dependent and task-independent components. I will focus on an
approach that utilizes a reversible formulation of a network. Doing this r
eveals a lack of robustness: the system is too invariant to a wide range o
f task-relevant changes\, leading to a novel form of adversarial attacks w
hich we term excessive invariance. Our main contribution is an information
-theoretic objective that encourages the model to develop factorized repre
sentations. This provides the first approach tailored explicitly to overco
me excessive invariance and resulting vulnerabilities. It also facilitates
several applications\, including domain adaptation.
DTSTART:20200402T160000Z
DTEND:20200402T173000Z
LAST-MODIFIED:20200331T175332Z
LOCATION:https://theias.zoom.us/j/384099138
SUMMARY:Seminar on Theoretical Machine Learning
URL:https://www.ias.edu/node/106011
END:VEVENT
BEGIN:VEVENT
UID:ed735d89-cbe4-4b3d-8d54-c2d46627ab79
DTSTAMP:20200401T030202Z
CREATED:20190607T212006Z
DESCRIPTION:Topic: Density conjecture for horizontal families of lattices i
n SL(2)\n\nSpeaker: Mikolaj Fraczyk\, Member\, School of Mathematics\n\nLe
t G be a real semi-simple Lie group with an irreducible unitary representa
tion \pi. The non-temperedness of \pi is measured by the parameter p(\pi)
which is defined as the infimum of p\geq 2 such that \pi has matrix coeffi
cients in L^p(G). Sarnak and Xue conjectured that for any arithmetic latti
ce \Gamma \subset G and principal congruence subgroup \Gamma(q)\subset \Ga
mma\, the multiplicity of \pi in L^2(G/\Gamma(q)) is at most O(V(q)^{2/p(\
pi)+\epsilon}) where V(q) is the covolume of \Gamma(q). In some contexts s
uch estimate is a decent substitute for the Ramanujan conjecture. For G of
real rank 1\, Sarnak and Xue translate the estimate into a Diophantine coun
ting problem which they managed to solve for SL(2\,R) and SL(2\,C).\n\nIn this
talk I will explain how one can get the same multiplicity bounds for fami
lies of pairwise non-commensurable lattices in G=SL(2\,R)\,SL(2\,C) given
as unit groups of maximal orders of quaternion algebras over number fields
(“horizontal families”). Namely: m(\pi\,\Gamma)\n\nThe talk is based on a joi
nt work with Gergely Harcos\, Peter Maga and Djordje Milicevic.
DTSTART:20200402T203000Z
DTEND:20200402T213000Z
LAST-MODIFIED:20200327T154346Z
LOCATION:https://theias.zoom.us/j/959183254
SUMMARY:Joint IAS/Princeton University Number Theory Seminar
URL:https://www.ias.edu/node/101361
END:VEVENT
BEGIN:VEVENT
UID:d218e4e6-d5de-4395-b95c-d879c8fc5b05
DTSTAMP:20200401T030202Z
CREATED:20200327T183132Z
DESCRIPTION:Topic: The Palais-Smale Theorem and the Solution of Hilbert’s 2
3 Problem\n\nSpeaker: Karen Uhlenbeck\n\n
DTSTART:20200406T180000Z
DTEND:20200406T190000Z
LAST-MODIFIED:20200327T183239Z
LOCATION:http://theias.zoom.us/j/119412864
SUMMARY:Members' Seminar
URL:https://www.ias.edu/node/110281
END:VEVENT
BEGIN:VEVENT
UID:4e5e8313-b29e-4791-9cd0-adc03ad54d75
DTSTAMP:20200401T030202Z
CREATED:20190607T203002Z
DESCRIPTION:Topic: To Be Announced\n\nSpeaker: To Be Announced\n\n
DTSTART:20200407T143000Z
DTEND:20200407T163000Z
LAST-MODIFIED:20200331T174735Z
LOCATION:
SUMMARY:Computer Science/Discrete Mathematics Seminar II
URL:https://www.ias.edu/node/100681
END:VEVENT
BEGIN:VEVENT
UID:ae4e54f9-c9e3-4769-b5a4-26ae298ddb57
DTSTAMP:20200401T030202Z
CREATED:20191021T183002Z
DESCRIPTION:Speaker: TBA\n\n
DTSTART:20200407T154500Z
DTEND:20200407T163000Z
LAST-MODIFIED:20191021T183002Z
LOCATION:*Princeton University\, 407 Jadwin Hall\, PCTS Seminar Room*
SUMMARY:PCTS Seminar Series: Deep Learning for Physics
URL:https://www.ias.edu/node/105776
END:VEVENT
BEGIN:VEVENT
UID:00ce1f09-9370-4dfd-af2d-67b46a8519a1
DTSTAMP:20200401T030202Z
CREATED:20191021T183002Z
DESCRIPTION:Topic: TBA\n\nSpeaker: TBA\n\n
DTSTART:20200407T180000Z
DTEND:20200407T190000Z
LAST-MODIFIED:20191021T183002Z
LOCATION:*Princeton University\, 407 Jadwin Hall\, PCTS Seminar Room*
SUMMARY:PCTS Seminar Series: Deep Learning for Physics
URL:https://www.ias.edu/node/105766
END:VEVENT
BEGIN:VEVENT
UID:240fcc6d-0d05-4647-964e-d47847683797
DTSTAMP:20200401T030202Z
CREATED:20191104T190001Z
DESCRIPTION:
DTSTART:20200410T220000Z
DTEND:20200410T220000Z
LAST-MODIFIED:20200110T165016Z
LOCATION:
SUMMARY:IAS School of Mathematics Term II Ends
URL:https://www.ias.edu/node/106181
END:VEVENT
BEGIN:VEVENT
UID:f3553e45-c109-44bd-9b51-2d7c1eb249d4
DTSTAMP:20200401T030202Z
CREATED:20200303T183841Z
DESCRIPTION:Speaker: Stefano Ermon\, Stanford University\n\n
DTSTART:20200415T133000Z
DTEND:20200415T141500Z
LAST-MODIFIED:20200303T183841Z
LOCATION:Wolfensohn Hall
SUMMARY:Workshop on New Directions in Optimization\, Statistics and Machine
Learning
URL:https://www.ias.edu/node/109876
END:VEVENT
BEGIN:VEVENT
UID:4c6f3d36-4d10-4563-8c40-d21fb7e39c4c
DTSTAMP:20200401T030202Z
CREATED:20200303T184115Z
DESCRIPTION:Speaker: Been Kim\, Google Brain\n\n
DTSTART:20200415T141500Z
DTEND:20200415T150000Z
LAST-MODIFIED:20200303T184115Z
LOCATION:Wolfensohn Hall
SUMMARY:Workshop on New Directions in Optimization\, Statistics and Machine
Learning
URL:https://www.ias.edu/node/109881
END:VEVENT
BEGIN:VEVENT
UID:8be117b5-d629-426a-b185-abb79802515d
DTSTAMP:20200401T030202Z
CREATED:20200303T184233Z
DESCRIPTION:Speaker: Coffee break\n\n
DTSTART:20200415T150000Z
DTEND:20200415T153000Z
LAST-MODIFIED:20200303T184233Z
LOCATION:Wolfensohn Hall
SUMMARY:Workshop on New Directions in Optimization\, Statistics and Machine
Learning
URL:https://www.ias.edu/node/109886
END:VEVENT
BEGIN:VEVENT
UID:44ec4343-a87f-4a8d-a777-b2ec6a4281c2
DTSTAMP:20200401T030202Z
CREATED:20200303T184406Z
DESCRIPTION:Speaker: Bin Yu\, UC Berkeley\n\n
DTSTART:20200415T153000Z
DTEND:20200415T161500Z
LAST-MODIFIED:20200303T184406Z
LOCATION:Wolfensohn Hall
SUMMARY:Workshop on New Directions in Optimization\, Statistics and Machine
Learning
URL:https://www.ias.edu/node/109891
END:VEVENT
BEGIN:VEVENT
UID:131da05b-40e8-489a-86b0-0e913f2e82a4
DTSTAMP:20200401T030202Z
CREATED:20200303T184520Z
DESCRIPTION:Speaker: Spotlight talks\n\n
DTSTART:20200415T161500Z
DTEND:20200415T170000Z
LAST-MODIFIED:20200303T184520Z
LOCATION:Wolfensohn Hall
SUMMARY:Workshop on New Directions in Optimization\, Statistics and Machine
Learning
URL:https://www.ias.edu/node/109896
END:VEVENT
BEGIN:VEVENT
UID:8dad0853-7401-422c-92a7-7fda6431f30a
DTSTAMP:20200401T030202Z
CREATED:20200303T184623Z
DESCRIPTION:
DTSTART:20200415T170000Z
DTEND:20200415T183000Z
LAST-MODIFIED:20200303T184623Z
LOCATION:Simons Hall Dining Hall
SUMMARY:Workshop on New Directions in Optimization\, Statistics and Machine
Learning
URL:https://www.ias.edu/node/109901
END:VEVENT
BEGIN:VEVENT
UID:8085eb89-3cf1-4e94-ac83-fae139c581af
DTSTAMP:20200401T030202Z
CREATED:20200303T184746Z
DESCRIPTION:Speaker: John P Cunningham\, Columbia University\n\n
DTSTART:20200415T183000Z
DTEND:20200415T191500Z
LAST-MODIFIED:20200303T184746Z
LOCATION:Wolfensohn Hall
SUMMARY:Workshop on New Directions in Optimization\, Statistics and Machine
Learning
URL:https://www.ias.edu/node/109906
END:VEVENT
BEGIN:VEVENT
UID:eb8dc818-c7e1-4773-854d-d4cf8765719d
DTSTAMP:20200401T030202Z
CREATED:20200303T184848Z
DESCRIPTION:Speaker: Tea break\n\n
DTSTART:20200415T191500Z
DTEND:20200415T194500Z
LAST-MODIFIED:20200303T184848Z
LOCATION:Wolfensohn Hall
SUMMARY:Workshop on New Directions in Optimization\, Statistics and Machine
Learning
URL:https://www.ias.edu/node/109911
END:VEVENT
BEGIN:VEVENT
UID:837242af-8276-40a3-a48b-bf5ffa674256
DTSTAMP:20200401T030202Z
CREATED:20200303T184951Z
DESCRIPTION:Speaker: Richard Zemel\, Member\, School of Mathematics\n\n
DTSTART:20200415T194500Z
DTEND:20200415T201500Z
LAST-MODIFIED:20200304T131008Z
LOCATION:Wolfensohn Hall
SUMMARY:Workshop on New Directions in Optimization\, Statistics and Machine
Learning
URL:https://www.ias.edu/node/109916
END:VEVENT
BEGIN:VEVENT
UID:e2c72c21-840a-4091-ae0e-3348d18a4276
DTSTAMP:20200401T030202Z
CREATED:20200303T185115Z
DESCRIPTION:Speaker: Panel Discussion\n\n
DTSTART:20200415T201500Z
DTEND:20200415T211500Z
LAST-MODIFIED:20200303T185115Z
LOCATION:Wolfensohn Hall
SUMMARY:Workshop on New Directions in Optimization\, Statistics and Machine
Learning
URL:https://www.ias.edu/node/109921
END:VEVENT
BEGIN:VEVENT
UID:0022816a-fd00-4a3a-9a8d-84f8367a3201
DTSTAMP:20200401T030202Z
CREATED:20200303T185247Z
DESCRIPTION:Speaker: Joshua Tenenbaum\, MIT\n\n
DTSTART:20200416T133000Z
DTEND:20200416T141500Z
LAST-MODIFIED:20200303T185247Z
LOCATION:Wolfensohn Hall
SUMMARY:Workshop on New Directions in Optimization\, Statistics and Machine
Learning
URL:https://www.ias.edu/node/109926
END:VEVENT
BEGIN:VEVENT
UID:38639779-98e4-4202-9929-f9af31f831bf
DTSTAMP:20200401T030202Z
CREATED:20200303T185514Z
DESCRIPTION:Speaker: Anirudh Goyal\, University of Montreal\n\n
DTSTART:20200416T141500Z
DTEND:20200416T150000Z
LAST-MODIFIED:20200303T185514Z
LOCATION:Wolfensohn Hall
SUMMARY:Workshop on New Directions in Optimization\, Statistics and Machine
Learning
URL:https://www.ias.edu/node/109936
END:VEVENT
BEGIN:VEVENT
UID:6e744839-c7b3-48aa-928c-bfa1f4892450
DTSTAMP:20200401T030202Z
CREATED:20200303T185635Z
DESCRIPTION:Speaker: Coffee break\n\n
DTSTART:20200416T150000Z
DTEND:20200416T153000Z
LAST-MODIFIED:20200303T185635Z
LOCATION:Wolfensohn Hall
SUMMARY:Workshop on New Directions in Optimization\, Statistics and Machine
Learning
URL:https://www.ias.edu/node/109946
END:VEVENT
BEGIN:VEVENT
UID:2ce183a1-1eb6-4430-a468-1af3f5e3a5dc
DTSTAMP:20200401T030202Z
CREATED:20200303T185742Z
DESCRIPTION:Speaker: Cynthia Rudin\, Duke University\n\n
DTSTART:20200416T153000Z
DTEND:20200416T161500Z
LAST-MODIFIED:20200303T185742Z
LOCATION:Wolfensohn Hall
SUMMARY:Workshop on New Directions in Optimization\, Statistics and Machine
Learning
URL:https://www.ias.edu/node/109956
END:VEVENT
BEGIN:VEVENT
UID:42a64f51-6996-4472-a1de-5b5452f69274
DTSTAMP:20200401T030202Z
CREATED:20191025T210001Z
DESCRIPTION:
DTSTART:20200416T160000Z
DTEND:20200416T173000Z
LAST-MODIFIED:20200331T174823Z
LOCATION:
SUMMARY:Seminar on Theoretical Machine Learning
URL:https://www.ias.edu/node/106021
END:VEVENT
BEGIN:VEVENT
UID:612f4731-4676-4992-a1f3-d4cf6191737e
DTSTAMP:20200401T030202Z
CREATED:20200303T185908Z
DESCRIPTION:Speaker: Spotlight talks\n\n
DTSTART:20200416T161500Z
DTEND:20200416T170000Z
LAST-MODIFIED:20200303T185908Z
LOCATION:Wolfensohn Hall
SUMMARY:Workshop on New Directions in Optimization\, Statistics and Machine
Learning
URL:https://www.ias.edu/node/109966
END:VEVENT
BEGIN:VEVENT
UID:42c5b4bc-9f40-4d79-a2d3-e37f7c744da6
DTSTAMP:20200401T030202Z
CREATED:20200303T190012Z
DESCRIPTION:
DTSTART:20200416T170000Z
DTEND:20200416T183000Z
LAST-MODIFIED:20200303T190012Z
LOCATION:Simons Hall Dining Hall
SUMMARY:Workshop on New Directions in Optimization\, Statistics and Machine
Learning
URL:https://www.ias.edu/node/109976
END:VEVENT
BEGIN:VEVENT
UID:737dbbc0-ce20-48dd-b6dd-a4aa1f9fa416
DTSTAMP:20200401T030202Z
CREATED:20200303T190132Z
DESCRIPTION:Speaker: David Blei\, Columbia University\n\nAbstract: A core p
roblem in statistics and machine learning is to approximate difficult-to-c
ompute probability distributions. This problem is especially important in
Bayesian statistics\, which frames all inference about unknown quantities
as a calculation about a conditional distribution. In this talk I review a
nd discuss innovations in variational inference (VI)\, a method that app
roximates probability distributions through optimization. VI has been used
in myriad applications in machine learning and Bayesian statistics. It te
nds to be faster than more traditional methods\, such as Markov chain Mont
e Carlo sampling.\nAfter quickly reviewing the basics\, I will discuss some
recent research on VI. I first describe stochastic variational inference\,
an approximate inference algorithm for handling massive data sets\, and d
emonstrate its application to probabilistic topic models of millions of ar
ticles. Then I discuss black box variational inference\, a generic algorit
hm for approximating the posterior. Black box inference easily applies to
many models and requires minimal mathematical work to implement. I will de
monstrate black box inference on deep exponential families---a method for
Bayesian deep learning---and describe how it enables powerful tools for pr
obabilistic programming.
DTSTART:20200416T183000Z
DTEND:20200416T191500Z
LAST-MODIFIED:20200303T192117Z
LOCATION:Wolfensohn Hall
SUMMARY:Workshop on New Directions in Optimization\, Statistics and Machine
Learning
URL:https://www.ias.edu/node/109986
END:VEVENT
BEGIN:VEVENT
UID:8bd9dd54-39ba-4cfb-8252-283cbee02da2
DTSTAMP:20200401T030202Z
CREATED:20200303T190353Z
DESCRIPTION:Speaker: Suchi Saria\, Johns Hopkins University\n\n
DTSTART:20200416T191500Z
DTEND:20200416T194500Z
LAST-MODIFIED:20200303T190353Z
LOCATION:Wolfensohn Hall
SUMMARY:Workshop on New Directions in Optimization\, Statistics and Machine
Learning
URL:https://www.ias.edu/node/109996
END:VEVENT
BEGIN:VEVENT
UID:621f4488-5327-4cf6-af10-72b2fc6c142f
DTSTAMP:20200401T030202Z
CREATED:20200303T190553Z
DESCRIPTION:Speaker: Tea break\n\n
DTSTART:20200416T200000Z
DTEND:20200416T203000Z
LAST-MODIFIED:20200303T190748Z
LOCATION:Wolfensohn Hall
SUMMARY:Workshop on New Directions in Optimization\, Statistics and Machine
Learning
URL:https://www.ias.edu/node/110006
END:VEVENT
BEGIN:VEVENT
UID:bb0ab07f-fc69-4a45-aafd-4a37c329f458
DTSTAMP:20200401T030202Z
CREATED:20200303T190719Z
DESCRIPTION:Speaker: Spotlight talks and Poster session\n\n
DTSTART:20200416T203000Z
DTEND:20200416T213000Z
LAST-MODIFIED:20200303T190719Z
LOCATION:Wolfensohn Hall
SUMMARY:Workshop on New Directions in Optimization\, Statistics and Machine
Learning
URL:https://www.ias.edu/node/110011
END:VEVENT
BEGIN:VEVENT
UID:fe7c3f9c-6b72-4fcc-9cd7-5c91dbc62e4e
DTSTAMP:20200401T030202Z
CREATED:20200303T185326Z
DESCRIPTION:Speaker: Percy Liang\, Stanford University\n\n
DTSTART:20200417T133000Z
DTEND:20200417T141500Z
LAST-MODIFIED:20200303T185326Z
LOCATION:Wolfensohn Hall
SUMMARY:Workshop on New Directions in Optimization\, Statistics and Machine
Learning
URL:https://www.ias.edu/node/109931
END:VEVENT
BEGIN:VEVENT
UID:407da070-ccb0-4c6c-8fef-61a253273ff0
DTSTAMP:20200401T030202Z
CREATED:20200303T185558Z
DESCRIPTION:Speaker: Roger Grosse\, Member\, School of Mathematics\n\n
DTSTART:20200417T141500Z
DTEND:20200417T150000Z
LAST-MODIFIED:20200303T185558Z
LOCATION:Wolfensohn Hall
SUMMARY:Workshop on New Directions in Optimization\, Statistics and Machine
Learning
URL:https://www.ias.edu/node/109941
END:VEVENT
BEGIN:VEVENT
UID:409a1a20-e48b-4a8b-b789-b20fbd8d0511
DTSTAMP:20200401T030202Z
CREATED:20200303T185654Z
DESCRIPTION:Speaker: Coffee break\n\n
DTSTART:20200417T150000Z
DTEND:20200417T153000Z
LAST-MODIFIED:20200303T185654Z
LOCATION:Wolfensohn Hall
SUMMARY:Workshop on New Directions in Optimization\, Statistics and Machine
Learning
URL:https://www.ias.edu/node/109951
END:VEVENT
BEGIN:VEVENT
UID:92b400b8-e682-4ef5-81e7-ea72dcd29947
DTSTAMP:20200401T030202Z
CREATED:20200303T185821Z
DESCRIPTION:Speaker: Zico Kolter\, Carnegie Mellon University\n\n
DTSTART:20200417T153000Z
DTEND:20200417T161500Z
LAST-MODIFIED:20200303T185821Z
LOCATION:Wolfensohn Hall
SUMMARY:Workshop on New Directions in Optimization\, Statistics and Machine
Learning
URL:https://www.ias.edu/node/109961
END:VEVENT
BEGIN:VEVENT
UID:ffec2469-2f6d-41cb-994a-c2592c10a531
DTSTAMP:20200401T030202Z
CREATED:20200303T185938Z
DESCRIPTION:Speaker: Poster session\n\n
DTSTART:20200417T161500Z
DTEND:20200417T170000Z
LAST-MODIFIED:20200303T185938Z
LOCATION:Wolfensohn Hall
SUMMARY:Workshop on New Directions in Optimization\, Statistics and Machine
Learning
URL:https://www.ias.edu/node/109971
END:VEVENT
BEGIN:VEVENT
UID:874428a4-238f-4151-825d-235383e627e4
DTSTAMP:20200401T030202Z
CREATED:20200303T190030Z
DESCRIPTION:
DTSTART:20200417T170000Z
DTEND:20200417T183000Z
LAST-MODIFIED:20200303T190030Z
LOCATION:Simons Hall Dining Hall
SUMMARY:Workshop on New Directions in Optimization\, Statistics and Machine
Learning
URL:https://www.ias.edu/node/109981
END:VEVENT
BEGIN:VEVENT
UID:07aa8c97-e939-4f1c-92da-d5285e446aed
DTSTAMP:20200401T030202Z
CREATED:20200303T190224Z
DESCRIPTION:Speaker: Chelsea Finn\, Stanford University and Google AI\n\nAb
stract: Despite the broad success of deep learning\, much of it has come i
n settings where the goal is to learn one single-purpose function
from data. However\, in many contexts\, we hope to optimize neural network
s for multiple\, distinct tasks (i.e. multi-task learning)\, and optimize
so that what is learned from these tasks is transferable to the acquisitio
n of new tasks (e.g. as in meta-learning). In this talk\, I will discuss h
ow the multi-task and meta optimization problems differ from standard prob
lems\, and some of the unique challenges that arise\, both in optimization
and in regularization. This includes (1) a kind of overfitting that is un
ique to meta-learning\, where the optimizer memorizes not the labels\, but
the functions that solve the training tasks\; and (2) challenging optimiz
ation landscapes that are common in multi-task learning settings. In both
cases\, I will present concrete characterizations of the underlying proble
ms\, and steps we can take to mitigate them. By taking these steps\, we ob
serve substantial gains in practice.
DTSTART:20200417T183000Z
DTEND:20200417T191500Z
LAST-MODIFIED:20200303T191944Z
LOCATION:Wolfensohn Hall
SUMMARY:Workshop on New Directions in Optimization\, Statistics and Machine
Learning
URL:https://www.ias.edu/node/109991
END:VEVENT
BEGIN:VEVENT
UID:b553a4cb-e090-4f56-8db6-ec04ff888334
DTSTAMP:20200401T030202Z
CREATED:20200303T190427Z
DESCRIPTION:Speaker: Spotlight talks\n\n
DTSTART:20200417T191500Z
DTEND:20200417T194500Z
LAST-MODIFIED:20200303T190427Z
LOCATION:Wolfensohn Hall
SUMMARY:Workshop on New Directions in Optimization\, Statistics and Machine
Learning
URL:https://www.ias.edu/node/110001
END:VEVENT
BEGIN:VEVENT
UID:9b10d03e-9eed-4170-aadc-560a3f8143cf
DTSTAMP:20200401T030202Z
CREATED:20191025T210001Z
DESCRIPTION:
DTSTART:20200423T160000Z
DTEND:20200423T173000Z
LAST-MODIFIED:20200331T174912Z
LOCATION:
SUMMARY:Seminar on Theoretical Machine Learning
URL:https://www.ias.edu/node/106026
END:VEVENT
BEGIN:VEVENT
UID:35f9c2e5-6ccd-4d38-aa37-05916c542d4b
DTSTAMP:20200401T030202Z
CREATED:20200219T150300Z
DESCRIPTION:Topic: To Be Announced\n\nSpeaker: Kobbi Nissim\, Georgetown U
niversity\n\n
DTSTART:20200427T150000Z
DTEND:20200427T160000Z
LAST-MODIFIED:20200331T174700Z
LOCATION:
SUMMARY:Computer Science/Discrete Mathematics Seminar I
URL:https://www.ias.edu/node/109426
END:VEVENT
BEGIN:VEVENT
UID:abb5e9bb-f3d9-49bc-87ce-cfff5d987122
DTSTAMP:20200401T030202Z
CREATED:20191025T210001Z
DESCRIPTION:
DTSTART:20200430T160000Z
DTEND:20200430T173000Z
LAST-MODIFIED:20200331T174946Z
LOCATION:
SUMMARY:Seminar on Theoretical Machine Learning
URL:https://www.ias.edu/node/106031
END:VEVENT
BEGIN:VEVENT
UID:1c5f9af1-ed31-424c-8bb4-e2e308d9d137
DTSTAMP:20200401T030202Z
CREATED:20200106T152004Z
DESCRIPTION:Topic: To Be Announced\n\nSpeaker: Adebisi Agboola\, University
of California\, Santa Barbara\n\n
DTSTART:20200430T203000Z
DTEND:20200430T213000Z
LAST-MODIFIED:20200331T175022Z
LOCATION:
SUMMARY:Joint IAS/Princeton University Number Theory Seminar
URL:https://www.ias.edu/node/107416
END:VEVENT
BEGIN:VEVENT
UID:429dc49b-83ff-4251-ad15-e3914fac083a
DTSTAMP:20200401T030202Z
CREATED:20191025T211502Z
DESCRIPTION:
DTSTART:20200507T160000Z
DTEND:20200507T173000Z
LAST-MODIFIED:20200331T175057Z
LOCATION:
SUMMARY:Seminar on Theoretical Machine Learning
URL:https://www.ias.edu/node/106036
END:VEVENT
BEGIN:VEVENT
UID:e0337f5d-9fbb-4c91-972e-f0b765774a24
DTSTAMP:20200401T030202Z
CREATED:20200124T160538Z
DESCRIPTION:Topic: Using discrepancy theory to improve the design of random
ized controlled trials\n\nSpeaker: Dan Spielman\, Yale University\n\n
DTSTART:20200511T150000Z
DTEND:20200511T160000Z
LAST-MODIFIED:20200331T174534Z
LOCATION:
SUMMARY:Computer Science/Discrete Mathematics Seminar I
URL:https://www.ias.edu/node/108111
END:VEVENT
BEGIN:VEVENT
UID:9d32952a-3fdd-47ac-a360-62cbcce6138c
DTSTAMP:20200401T030202Z
CREATED:20200211T163248Z
DESCRIPTION:Speaker: Silvia Ghinassi\, Member\, School of Mathematics\n\n
DTSTART:20200512T180000Z
DTEND:20200512T193000Z
LAST-MODIFIED:20200331T174413Z
LOCATION:
SUMMARY:Special Members' Seminar in Honor of Maryam Mirzakhani
URL:https://www.ias.edu/node/109186
END:VEVENT
BEGIN:VEVENT
UID:ecd72532-a722-4a9b-a459-adcb0a78ba37
DTSTAMP:20200401T030202Z
CREATED:20191025T211502Z
DESCRIPTION:
DTSTART:20200514T160000Z
DTEND:20200514T173000Z
LAST-MODIFIED:20200331T175128Z
LOCATION:
SUMMARY:Seminar on Theoretical Machine Learning
URL:https://www.ias.edu/node/106041
END:VEVENT
BEGIN:VEVENT
UID:5b1e4e60-a1c6-42c8-aa85-54d03b8cf7b2
DTSTAMP:20200401T030202Z
CREATED:20200110T165145Z
DESCRIPTION:
DTSTART:20200921T130000Z
DTEND:20200921T130000Z
LAST-MODIFIED:20200110T165145Z
LOCATION:
SUMMARY:IAS School of Mathematics Term I Begins
URL:https://www.ias.edu/node/107721
END:VEVENT
BEGIN:VEVENT
UID:71074d21-b16e-4494-9b22-8c4e470834c5
DTSTAMP:20200401T030202Z
CREATED:20200110T165522Z
DESCRIPTION:
DTSTART:20201218T230000Z
DTEND:20201218T230000Z
LAST-MODIFIED:20200110T165522Z
LOCATION:
SUMMARY:IAS School of Mathematics Term I Ends
URL:https://www.ias.edu/node/107731
END:VEVENT
BEGIN:VEVENT
UID:703aaa9c-222f-4341-8d7a-81fdf3eb6c10
DTSTAMP:20200401T030202Z
CREATED:20200110T165737Z
DESCRIPTION:
DTSTART:20210111T140000Z
DTEND:20210111T140000Z
LAST-MODIFIED:20200110T165737Z
LOCATION:
SUMMARY:IAS School of Mathematics Term II Begins
URL:https://www.ias.edu/node/107736
END:VEVENT
END:VCALENDAR