Video Lectures

Optimal Weak to Strong Learning

Kasper Green Larsen

The classic AdaBoost algorithm makes it possible to convert a weak learner, that is, an algorithm that produces a hypothesis slightly better than random guessing, into a strong learner that achieves arbitrarily high accuracy when given enough training data. We...
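The weak-to-strong conversion the abstract refers to can be illustrated with a minimal AdaBoost sketch. This is not the lecture's own construction, just the textbook algorithm: the dataset, thresholds, and helper names below are hypothetical choices for illustration. Decision stumps serve as the weak learners, and the usual reweighting drives the combined training error below the guaranteed bound.

```python
import math

# Hypothetical 1-D toy dataset (not from the lecture): points with +/-1 labels.
X = [0.5, 1.5, 2.5, 3.5, 4.5, 5.5]
y = [+1, +1, -1, -1, +1, +1]

def stump(t, p):
    """Weak hypothesis: predict p left of threshold t, -p to the right."""
    return lambda x: p if x < t else -p

# Pool of weak hypotheses: every threshold/polarity combination.
CANDIDATES = [stump(t, p) for t in (1, 2, 3, 4, 5) for p in (+1, -1)]

def weighted_error(h, w):
    """Total weight of the points that hypothesis h misclassifies."""
    return sum(wi for wi, xi, yi in zip(w, X, y) if h(xi) != yi)

def adaboost(rounds):
    n = len(X)
    w = [1.0 / n] * n          # start with uniform weights
    ensemble = []              # (alpha, hypothesis) pairs
    bound = 1.0                # product of 2*sqrt(eps*(1-eps)) per round
    for _ in range(rounds):
        # Pick the weak hypothesis with the lowest weighted error.
        h = min(CANDIDATES, key=lambda h: weighted_error(h, w))
        eps = max(weighted_error(h, w), 1e-12)   # avoid log(1/0)
        alpha = 0.5 * math.log((1 - eps) / eps)
        ensemble.append((alpha, h))
        bound *= 2 * math.sqrt(eps * (1 - eps))
        # Up-weight misclassified points, down-weight the rest, renormalize.
        w = [wi * math.exp(-alpha * yi * h(xi)) for wi, xi, yi in zip(w, X, y)]
        total = sum(w)
        w = [wi / total for wi in w]
    strong = lambda x: 1 if sum(a * h(x) for a, h in ensemble) >= 0 else -1
    return strong, bound

H, bound = adaboost(rounds=20)
err = sum(H(xi) != yi for xi, yi in zip(X, y)) / len(X)
# AdaBoost guarantees err <= bound, and the bound shrinks geometrically
# as long as each round's weak hypothesis beats random guessing.
print(f"training error = {err:.3f}, bound = {bound:.3f}")
```

No single stump classifies this interval-shaped dataset, which is exactly why the weighted combination is needed; the printed bound certifies how far the ensemble's training error can be from zero.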

Diophantine approximation deals with quantitative and qualitative aspects of approximating numbers by rationals. A major breakthrough by Kleinbock and Margulis in 1998 was to study Diophantine approximations for manifolds using homogeneous dynamics...
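A concrete instance of the quantitative side mentioned above (standard background, not stated in the abstract) is Dirichlet's theorem: for every real $\alpha$ and every integer $Q \ge 1$ there exist integers $p, q$ with $1 \le q \le Q$ such that

\[
  \left| \alpha - \frac{p}{q} \right| < \frac{1}{qQ} \le \frac{1}{q^2},
\]

so every real number admits infinitely many rational approximations of quality $1/q^2$; the field studies how far this baseline can be improved for specific numbers, and, in the Kleinbock–Margulis setting, for points constrained to lie on a manifold.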