Computer Science/Discrete Mathematics Seminar II

On the Communication Complexity of Classification Problems

We will discuss a model of distributed learning in the spirit of Yao's communication complexity model. We consider a two-party setting in which each player receives a list of labelled examples and the players communicate in order to jointly perform a learning task. For example, consider the Convex Set Disjointness problem: Alice and Bob each receive a set of points in Euclidean space, and they need to decide whether there exists a hyperplane that separates the two sets.
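As a concrete illustration of the separation question underlying Convex Set Disjointness, the following sketch tests (weak) linear separability of two planar point sets. It relies on the standard observation that if a separating line exists, it can be translated and rotated until it passes through two points of the combined set, so testing all such candidate lines suffices; the function name `separable` and the restriction to two dimensions are illustrative choices, not part of the talk's model.

```python
from itertools import combinations

def separable(A, B):
    """Decide whether 2-D point sets A and B admit a weakly separating line.

    Candidate lines pass through two points of A + B: any separating line
    can be shifted/rotated to touch two points without losing separation.
    """
    pts = A + B
    if not A or not B or len(pts) < 2:
        return True  # trivially separable
    for (px, py), (qx, qy) in combinations(pts, 2):
        if (px, py) == (qx, qy):
            continue  # degenerate pair defines no line
        # Signed side of a point relative to the line through p and q.
        def side(x, y):
            return (qx - px) * (y - py) - (qy - py) * (x - px)
        sa = [side(x, y) for x, y in A]
        sb = [side(x, y) for x, y in B]
        # Separating line found if A and B lie on opposite closed sides.
        if max(sa) <= 0 <= min(sb) or max(sb) <= 0 <= min(sa):
            return True
    return False
```

For instance, `separable([(0, 0), (1, 0)], [(0, 2), (1, 2)])` returns `True`, while two crossing diagonals such as `separable([(0, 0), (2, 2)], [(0, 2), (2, 0)])` return `False`. In the communication setting of the talk, Alice holds one set and Bob the other, so neither party can run such a check locally; the cost is measured in bits exchanged.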

We study several fundamental questions in this model: for example, which classes are learnable, and what is their sample complexity? This framework allows us to prove, in the context of distributed learning, unconditional separations between various learning settings, such as realizable versus agnostic learning and proper versus improper learning. The proofs combine standard ideas from communication complexity with arguments from learning theory and geometric constructions in Euclidean space. As a corollary, we also obtain lower bounds that match the performance of algorithms from previous work on distributed classification.

Date & Time

February 27, 2018 | 10:30am – 12:30pm

Location

Simonyi Hall 101

Speakers

Roi Livni

Affiliation

Princeton University