IAS Physics Group Meeting

The Principles of Deep Learning Theory

This group meeting will be presented in-person in Wolfensohn Hall at the IAS.

For contact-tracing purposes all off-campus attendees must register for this seminar: 
REGISTRATION FORM

All in-person attendees must be fully vaccinated, and masks are required in all indoor spaces. Additionally, all off-campus attendees are required to upload proof of vaccination via the IAS CrowdPass App.

Abstract: Deep learning is an exciting approach to modern artificial intelligence based on artificial neural networks. The goal of this talk is to provide a blueprint — using tools from physics — for theoretically analyzing deep neural networks of practical relevance. This task will encompass both understanding the statistics of initialized deep networks and determining the training dynamics of such an ensemble when learning from data.

In terms of their "microscopic" definition, deep neural networks are a flexible set of functions built out of many basic computational blocks called neurons, with many neurons in parallel organized into sequential layers. Borrowing from the effective theory framework, we will develop a perturbative 1/n expansion around the limit of an infinite number of neurons per layer and systematically integrate out the parameters of the network. We will explain how the network simplifies at large width and how the propagation of signals from layer to layer can be understood in terms of a Wilsonian renormalization group flow. This will make manifest that deep networks have a tuning problem, analogous to criticality, that needs to be solved in order to make them useful. Ultimately we will find a "macroscopic" description for wide and deep networks in terms of weakly-interacting statistical models, with the strength of the interactions between the neurons growing with the depth-to-width aspect ratio of the network. Time permitting, we will explain how the interactions induce representation learning.
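The large-width simplification described above can be seen numerically in a toy model (this sketch is illustrative and not taken from the talk or book). For a random two-layer tanh network with the standard 1/n scaling of the second-layer weights, the output distribution becomes Gaussian as the width n grows; its excess kurtosis, one simple measure of the non-Gaussian "interactions," falls off like 1/n. All names and parameter choices here (scalar input, unit weight variance, sample count) are assumptions made for the illustration:

```python
import numpy as np

def preactivation_samples(n_width, n_samples=200_000, seed=0):
    """Sample the output of a random two-layer tanh network on a
    scalar input x = 1, with second-layer weight variance 1/n so the
    output stays O(1) as the width n grows."""
    rng = np.random.default_rng(seed)
    # First layer: n_width independent unit-variance Gaussian weights.
    z1 = rng.normal(0.0, 1.0, size=(n_samples, n_width))  # x = 1
    s1 = np.tanh(z1)
    # Second layer: weights with variance 1/n, independent of s1.
    w2 = rng.normal(0.0, 1.0 / np.sqrt(n_width),
                    size=(n_samples, n_width))
    return np.sum(w2 * s1, axis=1)

def excess_kurtosis(z):
    """E[z^4]/E[z^2]^2 - 3; zero for an exact Gaussian."""
    z = z - z.mean()
    return np.mean(z**4) / np.mean(z**2) ** 2 - 3.0

for n in (4, 16, 64, 256):
    k = excess_kurtosis(preactivation_samples(n))
    print(f"width {n:4d}: excess kurtosis {k:+.4f}")
```

Running this, the excess kurtosis shrinks toward zero as the width increases, consistent with the 1/n expansion around the infinite-width Gaussian limit; the depth dependence of the interactions discussed in the abstract requires stacking more layers and is beyond this one-layer sketch.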

https://arxiv.org/abs/2104.00008 (a short essay)
https://arxiv.org/abs/2106.10165 (a long book, of which the talk will provide an overview)


This talk is based on the book "The Principles of Deep Learning Theory," co-authored with Sho Yaida and drawing on research in collaboration with Boris Hanin. It will be published next year by Cambridge University Press.

Date & Time

October 20, 2021 | 1:45pm – 3:00pm

Location

Wolfensohn Hall (behind Bloomberg Hall)

Speakers

Dan Roberts

Speaker Affiliation

MIT & Salesforce
