Differential Privacy Symposium

About the Symposium

Differential privacy disentangles learning about a dataset as a whole from learning about an individual data contributor. As differential privacy now enters practice on a global scale, the demand for advanced techniques and for knowledge of basic skills is pressing. This symposium will provide an in-depth look at the current context for privacy-preserving statistical data analysis and an agenda for future research. This event is organized by Cynthia Dwork of Microsoft Research, with support from the Alfred P. Sloan Foundation.


Schedule

All talks took place in Wolfensohn Hall.

8:30 am                Registration and Continental Breakfast, Fuld Hall Common Room

9:30 am                Welcome: Robbert Dijkgraaf, IAS Director and Leon Levy Professor

9:35 am                Introduction: The Definition of Differential Privacy - Cynthia Dwork, Microsoft Research

9:45 am                Composition: The Key to Differential Privacy’s Success - Guy Rothblum, Weizmann Institute of Science

10:45 am              Coffee Break, Lobby of Wolfensohn Hall

11:15 am              Differentially Private Algorithms: Some Primitives and Paradigms - Kunal Talwar, Google Brain

12:15 pm              Lunch, Simons Hall

1:45 pm                Dusting for Fingerprints in Private Data - Jonathan Ullman, Northeastern University

2:45 pm                Afternoon Tea

3:15 pm                Differential Privacy in Context: Conceptual and Ethical Considerations - Helen Nissenbaum, Cornell Tech and New York University

4:15 pm                Rigorous Data Dredging: Theory and Tools for Adaptive Data Analysis - Aaron Roth, The University of Pennsylvania

5:30 pm                Reception, Fuld Hall Common Room

Talk Descriptions

Composition: The Key to Differential Privacy’s Success by Guy Rothblum, Weizmann Institute of Science

Differential privacy provides a rigorous and robust privacy guarantee to individuals in the context of statistical data analysis. It has sparked a revolution in the study of privacy-preserving data analysis, impacting fields from computer science to statistics to legal scholarship. A key factor underlying this success is robustness under composition: when multiple differentially private algorithms are run on the same individual's data, privacy degrades smoothly and gradually.

It is hard to overstate the importance of robustness under composition. In reality, individuals' data are involved in multiple datasets and analyses. Privacy that does not compose offers only questionable protection. No less important, composition makes Differential Privacy programmable: differentially private algorithms for small or common tasks can be used as subroutines in larger, more complex algorithms, and inherit the subroutines' privacy guarantees (up to some degradation).
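
For reference, the definition and the basic composition guarantees can be stated as follows (standard results, sketched here for context; the talk may present sharper variants). A randomized mechanism $M$ is $\varepsilon$-differentially private if

\[
\Pr[M(D) \in S] \le e^{\varepsilon} \, \Pr[M(D') \in S]
\]

for all events $S$ and all datasets $D, D'$ differing in one individual's record. Basic composition says that running mechanisms with parameters $\varepsilon_1, \dots, \varepsilon_k$ on the same data is $(\sum_{i=1}^{k} \varepsilon_i)$-differentially private, and the advanced composition theorem (Dwork, Rothblum, and Vadhan, 2010) shows that $k$ adaptively chosen $\varepsilon$-DP mechanisms are, for any $\delta > 0$,

\[
\Bigl( \sqrt{2k \ln(1/\delta)} \, \varepsilon + k \varepsilon (e^{\varepsilon} - 1), \; \delta \Bigr)\text{-differentially private.}
\]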

Composition has been key to differential privacy's success, and understanding how privacy degrades under composition is a core issue in the study of privacy-preserving data analysis. The differential privacy community has worked toward an ever more complete understanding of composition. Recent works have made significant advances in this study, touching on issues in complexity theory and probability theory and shedding new light on the behavior of differentially private algorithms.

This talk will survey the study of composition in differentially private data analysis, from basic definitions and guarantees all the way to the most recent developments and open questions at this exciting frontier.


Differentially Private Algorithms: Some Primitives and Paradigms by Kunal Talwar, Google Brain

How does one design a differentially private algorithm? Much like the analogous question without the privacy requirement, this question has no single answer. Nevertheless, there are some general-purpose approaches that have been broadly successful in the design of DP algorithms. Kunal Talwar will start by describing some algorithmic primitives that allow us to get private answers to simple questions. Armed with these primitives and the composition properties of Differential Privacy, we can design algorithms for many complex tasks. Making such an algorithm resource-efficient usually takes some care and creativity, whether the resource is running time or privacy cost. He will then describe some of the general paradigms that have allowed high utility at a small privacy cost for several tasks. Talwar will illustrate how these apply to some common data analysis tasks.
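
The abstract does not single out particular primitives, but a canonical example of a private answer to a simple question is the Laplace mechanism applied to a counting query. Below is a minimal Python sketch under that assumption; the function and variable names are illustrative, not taken from the talk.

import numpy as np

def laplace_mechanism(true_answer, sensitivity, epsilon, rng=None):
    # Adding Laplace noise with scale sensitivity/epsilon makes the released
    # answer epsilon-differentially private. For a counting query, one
    # person's record changes the count by at most 1, so sensitivity is 1.
    rng = rng or np.random.default_rng()
    return true_answer + rng.laplace(loc=0.0, scale=sensitivity / epsilon)

# Example: privately release how many records satisfy a predicate.
data = np.array([0, 1, 1, 0, 1, 1, 1, 0])   # one bit per individual
private_count = laplace_mechanism(data.sum(), sensitivity=1, epsilon=0.5)

By the composition guarantees above, an analyst who asks k such questions at privacy cost epsilon each spends a total budget of at most k * epsilon, or the tighter advanced-composition bound.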


Dusting for Fingerprints in Private Data by Jon Ullman, Northeastern University

We describe a family of attacks that recover sensitive information about individuals using only simple summary statistics computed on a dataset.  Notably, our attacks succeed under minimal assumptions on the distribution of the data, even if the attacker has very limited information about this distribution, and even if the summary statistics are significantly distorted.  Our attacks build on and generalize the method of fingerprinting codes for proving lower bounds in differential privacy, and also extend the practical attacks on genomic datasets demonstrated by Homer et al.  Surprisingly, the amount of noise that our attacks can tolerate is nearly matched by the amount of noise required to achieve differential privacy, meaning that the robust privacy guarantees of differential privacy come at almost no cost in our model.
Based on joint work with Cynthia Dwork, Adam Smith, Thomas Steinke, and Salil Vadhan.
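
To convey the shape of such an attack, here is a highly simplified membership (tracing) test in the spirit of this line of work; the synthetic data, inner-product statistic, and parameters below are illustrative simplifications, not the authors' exact construction.

import numpy as np

rng = np.random.default_rng(0)
d, n = 10_000, 50                                   # many attributes, few individuals
p = rng.uniform(0.1, 0.9, size=d)                   # population attribute frequencies
data = (rng.random((n, d)) < p).astype(float)       # the private dataset
stats = data.mean(axis=0) + rng.laplace(0, 0.01, size=d)  # released, noisy column means

member = data[0]                                    # a person in the dataset
nonmember = (rng.random(d) < p).astype(float)       # an independent person
reference = (rng.random(d) < p).astype(float)       # attacker's reference individual

def score(y):
    # Inner-product test: a member's attributes correlate with the dataset's
    # noisy means, an independent person's do not. The frequencies p are known
    # here; a real attacker would estimate them from a small reference sample.
    return np.dot(y - reference, stats - p)

print(score(member))     # typically large and positive: evidence of membership
print(score(nonmember))  # typically close to zero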


Differential Privacy in Context: Conceptual and Ethical Considerations by Helen Nissenbaum, Cornell Tech and New York University

Critiques of differential privacy reveal important questions about how effective any purely technological approach can be in protecting and promoting privacy, as a societal value. Taking an ethical perspective, Nissenbaum will argue that while the absence of clearly articulated contextual models severely handicaps these critiques, this absence may also obscure the great potential that differential privacy holds for preserving privacy as a fundamental societal value.


Rigorous Data Dredging: Theory and Tools for Adaptive Data Analysis by Aaron Roth, University of Pennsylvania

In this talk, Aaron Roth will introduce and summarize a recent line of work connecting differentially private algorithm design to the seemingly unrelated problem of post-selection inference and hypothesis testing. Specifically, subject to the constraint that all data analyses are conducted via differentially private algorithms (a distributional stability condition), it is possible to construct valid confidence intervals around arbitrary linear functionals (and other "low sensitivity" test statistics) and to correct p-values for arbitrary hypothesis tests, even though they have been chosen as a function of the data set. Despite recent progress, surprisingly little is known about this problem, and there are a number of intriguing open directions.
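
As a toy illustration of this idea (the actual transfer theorems and confidence-interval constructions are considerably more refined), one can picture the analyst routing every query through a differentially private gatekeeper; the class below is a hypothetical sketch, not code from the talk.

import numpy as np

class PrivateQueryAnswerer:
    # Answers adaptively chosen statistical queries q: row -> [0, 1] with
    # Laplace noise. Each answer is eps-differentially private, so the whole
    # interaction is private by composition, and the transfer theorems of this
    # line of work imply the answers stay close to the population values even
    # when later queries depend on earlier answers.
    def __init__(self, data, eps, rng=None):
        self.data, self.eps = data, eps
        self.rng = rng or np.random.default_rng()

    def answer(self, q):
        empirical = np.mean([q(row) for row in self.data])
        # A mean of a [0, 1]-valued query over n rows has sensitivity 1/n.
        return empirical + self.rng.laplace(0.0, 1.0 / (len(self.data) * self.eps))

rng = np.random.default_rng(1)
oracle = PrivateQueryAnswerer(rng.random((1000, 3)), eps=0.1)
a1 = oracle.answer(lambda row: row[0])              # analyst inspects a1 ...
a2 = oracle.answer(lambda row: float(row[1] > a1))  # ... then adapts the next query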

Organizer and Speaker Bios

Cynthia Dwork
Cynthia Dwork, Distinguished Scientist at Microsoft Research, is renowned for placing privacy-preserving data analysis on a mathematically rigorous foundation.  A cornerstone of this work is differential privacy, a strong privacy guarantee frequently permitting highly accurate data analysis. In recent years she has expanded her efforts in formalizing social concepts to the problem of achieving fairness in classification systems.  Dwork has also made seminal contributions in cryptography and distributed computing, and is a recipient of the Edsger W. Dijkstra Prize, recognizing some of her earliest work establishing the pillars on which every fault-tolerant system has been built for decades.  She is a member of the National Academy of Sciences and the National Academy of Engineering, and a Fellow of the American Academy of Arts and Sciences and the American Philosophical Society. In January 2017, Dwork will join Harvard as the Gordon McKay Professor of Computer Science at the Harvard Paulson School of Engineering, the Radcliffe Alumnae Professor at the Radcliffe Institute for Advanced Study, and Professor by Affiliation at Harvard Law School.

Guy Rothblum
Guy Rothblum is a faculty member in the Faculty of Mathematics and Computer Science at the Weizmann Institute of Science. He has wide interests in theoretical computer science, with a focus on cryptography, privacy-preserving data analysis, and complexity theory. His research aims to promote a foundational understanding of computing under security, privacy, and reliability concerns. Dr. Rothblum completed his Ph.D. at MIT, where his advisor was Shafi Goldwasser, and his M.Sc. at the Weizmann Institute of Science, where his advisor was Moni Naor.

Kunal Talwar
Kunal Talwar has been a Research Scientist at Google since 2015. He graduated from UC Berkeley in 2004 and worked at Microsoft Research in Silicon Valley until 2014. He has made major contributions in Metric Embeddings, Algorithms, and Differential Privacy. In Differential Privacy, he co-developed the Exponential Mechanism, the first private mechanism for non-numerical queries, and established connections between Game Theory and Differential Privacy. His work on the Geometry of Differential Privacy led to instance-optimal private algorithms for a large class of queries. It also uncovered a surprising connection between Discrepancy Theory and Functional Analysis, leading to the first non-trivial approximation algorithms for Hereditary Discrepancy and to progress on the Tusnády problem.

Jon Ullman
Jon Ullman is an assistant professor in the College of Computer and Information Sciences at Northeastern University.  His work studies privacy using tools and perspectives from cryptography, machine learning, algorithms, and game theory. Prior to joining Northeastern, he completed his PhD at Harvard University, and was in the inaugural class of junior fellows in the Simons Society of Fellows.

Helen Nissenbaum
Helen Nissenbaum is a professor at Cornell Tech, on leave from New York University. Her books include Obfuscation: A User's Guide for Privacy and Protest, with Finn Brunton (MIT Press, 2015) and Privacy in Context: Technology, Policy, and the Integrity of Social Life (Stanford, 2010). Grants from DARPA, NSF, and DHHS have supported her work on ethical and political dimensions of digital technologies.  Recipient of the 2014 Barwise Prize of the American Philosophical Association, Nissenbaum has contributed to privacy-enhancing software, including TrackMeNot and AdNauseam. She holds a Ph.D. in philosophy from Stanford University.

Aaron Roth
Aaron Roth is an associate professor at the University of Pennsylvania, interested in differential privacy, game theory, learning theory, and algorithmic fairness. Together with Cynthia Dwork, he is the author of "The Algorithmic Foundations of Differential Privacy". He is the winner of an NSF CAREER Award, an Alfred P. Sloan Research Fellowship, and the Presidential Early Career Award for Scientists and Engineers (PECASE).

Videos of Talks

Videos of talks from the Four Facets of Differential Privacy symposium are posted on the Institute’s YouTube channel: