Theoretical Machine Learning Seminar

Some Statistical Results on Deep Learning: Interpolation, Optimality and Sparsity

This talk discusses three aspects of deep learning from a statistical perspective: interpolation, optimality, and sparsity. The first part attempts to interpret the double descent phenomenon by precisely characterizing a U-shaped curve within the “over-fitting regime”; the second focuses on the statistical optimality of neural network classification in a student-teacher framework. The talk concludes with a proposal for sparsity-induced training of neural networks with statistical guarantees.
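For readers unfamiliar with double descent: as a model's capacity grows past the interpolation threshold (roughly, number of parameters ≈ number of training points), test error typically spikes and then descends again, rather than increasing monotonically. The minimal sketch below is not taken from the talk; it illustrates the phenomenon with minimum-norm least squares on random ReLU features in a simple student-teacher setup. The teacher model, noise level, and the feature counts in `n_features_list` are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

def double_descent_demo(n_train=100, n_test=2000, d=20,
                        n_features_list=(5, 20, 50, 90, 100, 110, 200, 500, 1000)):
    # Teacher: y = x @ beta + noise (hypothetical linear teacher for illustration)
    beta = rng.standard_normal(d)
    X_tr = rng.standard_normal((n_train, d))
    X_te = rng.standard_normal((n_test, d))
    y_tr = X_tr @ beta + 0.5 * rng.standard_normal(n_train)
    y_te = X_te @ beta

    for p in n_features_list:
        # Student: random ReLU features phi(x) = max(0, x @ W), W fixed at random
        W = rng.standard_normal((d, p)) / np.sqrt(d)
        Phi_tr = np.maximum(X_tr @ W, 0.0)
        Phi_te = np.maximum(X_te @ W, 0.0)
        # Minimum-norm least-squares fit; once p >= n_train it interpolates the training data
        theta = np.linalg.pinv(Phi_tr) @ y_tr
        test_mse = np.mean((Phi_te @ theta - y_te) ** 2)
        print(f"p = {p:5d}   test MSE = {test_mse:.3f}")

double_descent_demo()
```

Run as-is, the test error typically rises toward the interpolation threshold (p ≈ 100 here) and falls again for larger p, which is the “second descent” the U-shaped-curve analysis in the talk refers to.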

Date & Time

November 13, 2019 | 12:00pm – 1:30pm

Location

Dilworth Room

Affiliation

Purdue University; Member, School of Mathematics