Computer Science & Discrete Mathematics (CSDM) Seminar

A weekly seminar on topics in theoretical computer science and discrete mathematics

Time: Mondays 11:00 AM-12:00 PM and Tuesdays 10:30 AM-12:30 PM | Place: Simonyi 101


Upcoming Talk

On SNARGs for NP and Nullstellensatz Proofs

Speaker: Alex Lombardi, Princeton University
When: Monday, March 9, 2026 | 11:00 AM EDT
Where: West Lecture Hall and Remote Access

Abstract

We revisit the question of whether it is possible to build succinct non-interactive arguments (SNARGs) for all of NP under standard cryptographic assumptions. In particular, we give a candidate non-adaptive SNARG for NP and prove its soundness under

  • the learning with errors assumption (or other standard assumptions such as bilinear maps), and 
  • a mathematical conjecture about multivariate polynomials over the reals. In more detail, our conjecture is an upper bound on the minimum total coefficient size of Nullstellensatz proofs (Potechin-Zhang, ICALP 2024) of membership in a concrete polynomial ideal.
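For background, the following is the standard notion of a Nullstellensatz proof of ideal membership; the concrete ideal in the conjecture and the exact normalization of the Potechin-Zhang size measure are not reproduced here.

```latex
% A Nullstellensatz proof that a polynomial h lies in the ideal generated by
% f_1, \dots, f_m \in \mathbb{R}[x_1, \dots, x_n] is a tuple (g_1, \dots, g_m) with
\[
  \sum_{i=1}^{m} g_i(x)\, f_i(x) \;=\; h(x).
\]
% The total coefficient size of the proof is the sum of the absolute values of
% all coefficients of g_1, \dots, g_m; the conjecture upper-bounds the minimum
% total coefficient size over all such proofs for a specific ideal.
```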

In this talk, we will discuss the construction, security analysis, and our current understanding of the conjecture. 

Based on joint work with Lali Devadas, Sam Hopkins, Yael Kalai, Pravesh Kothari, and Surya Mathialagan.


Upcoming Schedule

Tuesday, Mar 10, 2026 | 10:30am
Hanlin Ren, Institute for Advanced Study
Reverse Mathematics of Complexity Lower Bounds, Part I
Abstract

Why is it so hard to prove P != NP, or even to prove super-linear circuit lower bounds? While we often blame a lack of combinatorial ingenuity, the bottleneck might be more fundamental: the logical strength of our mathematical tools.

This series of talks explores the "Reverse Mathematics" of complexity theory. Instead of asking what lower bounds we can prove, we ask: What axioms are *strictly necessary* to prove certain lower bounds? By re-examining established lower bounds through the lens of feasible mathematics, we find that many complexity lower bounds are actually basic combinatorial principles in disguise.

We will focus on two intriguing phenomena:

  1. Lower bounds as basic principles in disguise: We revisit Maass's classical lower bound (STOC'84) that one-tape Turing machines require quadratic time to recognize Palindromes. We show that this lower bound is logically equivalent to the weak pigeonhole principle over a base theory of feasible reasoning ("PV").
  2. The minimum theory for hardness: Proving lower bounds for the Resolution proof system requires a specific threshold of logical strength ("T^1_2 + dwPHP(PV)", or, "rwPHP(PLS)"). Remarkably, this requirement is intrinsic to the Resolution system itself, independent of which specific hard tautology we are analyzing.
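For item 1, one standard formulation of the weak pigeonhole principle is sketched below; the bounded-arithmetic versions referenced above (e.g. dwPHP over PV) restrict attention to efficiently computable functions, so this schematic form is illustrative only.

```latex
% Weak pigeonhole principle: no map from a set of size 2n into a set of size n
% is injective, i.e. some two inputs must collide.
\[
  \forall f : [2n] \to [n] \quad \exists\, i < j \le 2n \ \text{ such that } \ f(i) = f(j).
\]
```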

In this series of two talks, I will introduce the framework of reverse mathematics for complexity lower bounds and present results illustrating the two phenomena above. No prior knowledge about bounded arithmetic is assumed.

These talks are based on the following papers:

  • Oliver Korten. Derandomization from Time-Space Tradeoffs. CCC'22
  • Lijie Chen, Jiatu Li, and Igor Oliveira. Reverse Mathematics of Complexity Lower Bounds. FOCS'24
  • Jiawei Li, Yuhao Li, and Hanlin Ren. Finding Bugs in Short Proofs: The Metamathematics of Resolution Lower Bounds. STOC'26

 

Where: Dilworth Room
Monday, Mar 16, 2026 | 11:00am
Nikhil Shagrithaya, University of Michigan
Probabilistic Guarantees to Explicit Constructions: Local Properties of Linear Codes
Abstract

We present a general framework for derandomizing random linear codes with respect to a broad class of properties, known as local properties, which encompass several standard notions such as minimum distance, list-decoding, list-recovery, and perfect hashing. Our approach extends the classical Alon-Edmonds-Luby (AEL) construction through a modified formalism of local coordinate-wise linear (LCL) properties, introduced by Levi, Mosheiff, and Shagrithaya (FOCS 2025).

The main result shows that we can obtain explicit constructions of codes achieving LCL properties at the same level of optimality as random linear codes, at the cost of an increased alphabet size. This gives the first explicit constructions of list-recoverable codes having output list sizes that match those of random linear codes, among other results.
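As orientation for the list-recovery notion mentioned above, here is one standard (zero-error) formulation; the parameters are generic and not those of the results discussed.

```latex
% A code C \subseteq \Sigma^n is (\ell, L)-list-recoverable if, for every choice
% of input lists S_1, \dots, S_n \subseteq \Sigma with each |S_i| \le \ell,
\[
  \bigl|\{\, c \in C \;:\; c_i \in S_i \ \text{for all } i \in [n] \,\}\bigr| \;\le\; L .
\]
% List-decoding corresponds to the case \ell = 1 in the variant that allows
% errors, where c_i \in S_i is required only on a constant fraction of coordinates.
```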

In this talk we will give the background needed to formulate the main result, as well as a sketch of its proof.

Based on joint work with Fernando Granha Jeronimo.

Where: Simonyi Hall 101 and Remote Access
Tuesday, Mar 17, 2026 | 10:30am
Hanlin Ren, Institute for Advanced Study
Reverse Mathematics of Complexity Lower Bounds, Part II
Abstract

Why is it so hard to prove P != NP, or even to prove super-linear circuit lower bounds? While we often blame a lack of combinatorial ingenuity, the bottleneck might be more fundamental: the logical strength of our mathematical tools.

This series of talks explores the "Reverse Mathematics" of complexity theory. Instead of asking what lower bounds we can prove, we ask: What axioms are *strictly necessary* to prove certain lower bounds? By re-examining established lower bounds through the lens of feasible mathematics, we find that many complexity lower bounds are actually basic combinatorial principles in disguise.

We will focus on two intriguing phenomena:

  1. Lower bounds as basic principles in disguise: We revisit Maass's classical lower bound (STOC'84) that one-tape Turing machines require quadratic time to recognize Palindromes. We show that this lower bound is logically equivalent to the weak pigeonhole principle over a base theory of feasible reasoning ("PV").
  2. The minimum theory for hardness: Proving lower bounds for the Resolution proof system requires a specific threshold of logical strength ("T^1_2 + dwPHP(PV)", or, "rwPHP(PLS)"). Remarkably, this requirement is intrinsic to the Resolution system itself, independent of which specific hard tautology we are analyzing.

In this series of two talks, I will introduce the framework of reverse mathematics for complexity lower bounds and present results illustrating the two phenomena above. No prior knowledge about bounded arithmetic is assumed.

These talks are based on the following papers:

  • Oliver Korten. Derandomization from Time-Space Tradeoffs. CCC'22
  • Lijie Chen, Jiatu Li, and Igor Oliveira. Reverse Mathematics of Complexity Lower Bounds. FOCS'24
  • Jiawei Li, Yuhao Li, and Hanlin Ren. Finding Bugs in Short Proofs: The Metamathematics of Resolution Lower Bounds. STOC'26
Where: Simonyi 101 and Remote Access

Past Seminars Archive
