Hardness of Easy Problems and Fine-Grained Complexity

In recent years, a new “fine-grained” theory of computational hardness has been developed, based on “fine-grained reductions” that focus on the exact running times of problems.


We follow the approach of NP-hardness, but in a more delicate manner: we identify key problems that have a t(n)-time algorithm but are conjectured to admit no t(n)^(1-eps)-time algorithm for any eps > 0. We then reduce these key problems, in a fine-grained way, to many important problems, thereby obtaining tight conditional lower bounds for them.
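
To make this template concrete, here is a minimal illustrative sketch (not taken from the seminar material) of one canonical key problem, Orthogonal Vectors: given two sets A and B of n Boolean vectors in d dimensions, decide whether some a in A and b in B are orthogonal. The brute-force algorithm below runs in O(n^2 * d) time, and it is conjectured (and implied by SETH, discussed below) that no n^(2-eps)-time algorithm exists for any eps > 0.

# Illustrative sketch (assumption: not part of the abstract itself):
# brute-force Orthogonal Vectors in O(n^2 * d) time.
def has_orthogonal_pair(A, B):
    """Return True if some a in A and b in B satisfy a . b = 0."""
    for a in A:
        for b in B:
            if all(x * y == 0 for x, y in zip(a, b)):
                return True
    return False

# Example usage:
# has_orthogonal_pair([(1, 0, 1), (0, 1, 1)], [(0, 1, 0), (1, 1, 0)])  -> True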


This approach has led to the discovery of many meaningful relationships between problems, and to equivalence classes. One key conjecture is the Strong Exponential Time Hypothesis (SETH), a quantitative strengthening of the P != NP conjecture.
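
For reference, a standard formulation of SETH (stated here for completeness; not verbatim from the abstract):

\[
\text{SETH: } \forall \varepsilon > 0 \ \exists k \ge 3 \text{ such that } k\text{-SAT on } n \text{ variables cannot be solved in time } O\!\left(2^{(1-\varepsilon)n}\right).
\]

In particular, refuting any SETH-based lower bound would yield an algorithm for k-SAT that beats exhaustive search for every clause width k.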


Research on SETH-based lower bounds has flourished in recent years, showing, for example, that the classical algorithms are likely optimal for problems such as Diameter, Subtree Isomorphism, Edit Distance, and Longest Common Subsequence.
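
As one illustration (a standard textbook algorithm, included here only as a sketch), the classical quadratic-time dynamic program for Edit Distance is shown below; the SETH-based lower bounds referred to above show that an O(n^(2-eps))-time algorithm for this problem, for any eps > 0, would refute SETH.

# Illustrative sketch (standard algorithm, not from the abstract):
# the classical O(n*m) dynamic program for Edit Distance.
def edit_distance(s, t):
    n, m = len(s), len(t)
    # dp[i][j] = edit distance between s[:i] and t[:j]
    dp = [[0] * (m + 1) for _ in range(n + 1)]
    for i in range(n + 1):
        dp[i][0] = i
    for j in range(m + 1):
        dp[0][j] = j
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            cost = 0 if s[i - 1] == t[j - 1] else 1
            dp[i][j] = min(dp[i - 1][j] + 1,         # deletion
                           dp[i][j - 1] + 1,         # insertion
                           dp[i - 1][j - 1] + cost)  # substitution / match
    return dp[n][m]

# Example usage:
# edit_distance("kitten", "sitting")  -> 3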


In the first half of the seminar we will introduce and survey the classic conjectures and the main tools behind these fine-grained reductions.


In the second half we will cover a recent technique (joint work with Abboud, Bringmann and Khoury) that translates many of these conditional lower bounds so that they hold also for approximation algorithms. For example, we show that no distance oracle with O(1) approximation, m^(1+o(1)) preprocessing time, and m^(o(1)) query time is likely to exist.
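
To fix terminology, a distance oracle preprocesses a graph with m edges into a data structure that can then answer (possibly approximate) distance queries quickly. The sketch below is illustrative only (the class name and interface are hypothetical, and it is not the construction ruled out above): the naive exact oracle runs BFS from every vertex, paying Theta(n(n+m)) preprocessing time for O(1)-time queries. The result above says one cannot expect to push preprocessing down to m^(1+o(1)) and query time down to m^(o(1)) simultaneously, even allowing any constant-factor approximation.

# Illustrative sketch (hypothetical interface, not from the abstract):
# a naive *exact* distance oracle for an unweighted, undirected graph.
from collections import deque

class NaiveDistanceOracle:
    def __init__(self, n, edges):
        adj = [[] for _ in range(n)]
        for u, v in edges:
            adj[u].append(v)
            adj[v].append(u)
        # BFS from every source: Theta(n*(n+m)) preprocessing time, Theta(n^2) space.
        self.dist = []
        for s in range(n):
            d = [None] * n
            d[s] = 0
            q = deque([s])
            while q:
                u = q.popleft()
                for w in adj[u]:
                    if d[w] is None:
                        d[w] = d[u] + 1
                        q.append(w)
            self.dist.append(d)

    def query(self, u, v):
        # O(1)-time exact answer; None means u and v are disconnected.
        return self.dist[u][v]

# Example usage:
# oracle = NaiveDistanceOracle(4, [(0, 1), (1, 2), (2, 3)])
# oracle.query(0, 3)  -> 3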

Affiliation: Member, School of Mathematics