Computer Science/Discrete Mathematics Seminar II

Interactive Channel Capacity

In a profoundly influential 1948 paper, Claude Shannon defined the entropy function \(H\), and showed that the capacity of the binary symmetric channel with noise rate (bit-flip rate) \(\varepsilon\) is \(1-H(\varepsilon)\). This means that one can reliably communicate \(n\) bits by sending roughly \(n / (1-H(\varepsilon))\) bits over this channel. The extensive study of interactive communication protocols over the last few decades gives rise to the related question of finding the capacity of a noisy channel when it is used interactively. We define interactive channel capacity as the minimal ratio between the communication required to compute a function over a noiseless channel and the communication required to compute the same function over the \(\varepsilon\)-noisy channel. We show that the interactive channel capacity is roughly \(1-\Theta\left( \sqrt{H(\varepsilon)} \right)\). Our result gives the first separation between interactive and non-interactive channel capacity. Joint work with Ran Raz.
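To illustrate the gap between the two capacities numerically, here is a small sketch (function names are my own) computing the binary entropy \(H(\varepsilon)\), the one-way capacity \(1-H(\varepsilon)\), and the quantity \(\sqrt{H(\varepsilon)}\) that governs the interactive loss; since the abstract only states \(\Theta(\sqrt{H(\varepsilon)})\), the hidden constant is unknown and the comparison below is purely qualitative.

```python
import math

def binary_entropy(eps):
    """Shannon's binary entropy function H(eps), in bits."""
    if eps == 0.0 or eps == 1.0:
        return 0.0
    return -eps * math.log2(eps) - (1 - eps) * math.log2(1 - eps)

def shannon_capacity(eps):
    """One-way capacity 1 - H(eps) of the binary symmetric channel."""
    return 1 - binary_entropy(eps)

# For small eps, sqrt(H(eps)) is much larger than H(eps), so the
# interactive loss Theta(sqrt(H(eps))) dominates the one-way loss H(eps).
for eps in (0.001, 0.01, 0.05):
    h = binary_entropy(eps)
    print(f"eps={eps}: H(eps)={h:.4f}, sqrt(H(eps))={math.sqrt(h):.4f}")
```

The loop makes the separation concrete: as \(\varepsilon \to 0\), both losses vanish, but \(\sqrt{H(\varepsilon)}\) vanishes much more slowly than \(H(\varepsilon)\).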

Date & Time

November 19, 2013 | 10:30am – 12:30pm

Location

S-101

Affiliation

Member, School of Mathematics