Computational Complexity & Differential Privacy
Salil Vadhan, Harvard University
Joint works with Cynthia Dwork, Kunal Talwar, Andrew McGregor, Ilya Mironov, Moni Naor, Omkant Pandey, Toni Pitassi, Omer Reingold, Guy Rothblum, Jon Ullman

Data Privacy: The Problem
Given a dataset with sensitive information, such as:
– Health records
– Census data
– Search engine logs
How can we:
– Compute and release useful functions of the dataset (utility)
– Without compromising info about individuals (privacy)?

Data Privacy: The Challenge
Traditional approach (e.g. in HIPAA): “anonymize” by removing “personally identifying information (PII)”.
Many supposedly anonymized datasets have been subject to reidentification:
– Gov. Weld’s medical record re-identified by linkage with voter records [Swe97]
– Netflix Challenge database re-identified by linkage with IMDb [NS08]
– AOL search users re-identified by the contents of their queries [BZ06]
– …

Differential Privacy [DN03, DN04, BDMN05, DMNS06]
A strong, new notion of privacy that:
– Is robust to auxiliary information possessed by an adversary
– Degrades gracefully under repetition/composition
– Allows for many useful computations

Differential Privacy: Definition
Def: A randomized algorithm C : X^n → Y is (ε,δ)-differentially private if for every two databases D₁, D₂ that differ on one row, and every set T ⊆ Y,
Pr[C(D₁) ∈ T] ≤ exp(ε) · Pr[C(D₂) ∈ T] + δ.
Think of ε as a small constant (e.g. ε = .01) and δ as cryptographically small.
[Diagram: a curator C holds the database D ∈ X^n and releases C(D).]
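
To make the definition concrete, here is a minimal sketch (mine, not from the slides) of randomized response, the textbook mechanism for a single bit; the assert at the end checks that the two output distributions differ by exactly a factor of exp(ε), so the (ε,0) guarantee holds.

```python
import math
import random

def randomized_response(bit: int, epsilon: float) -> int:
    """Report the true bit with probability e^eps / (e^eps + 1), else flip it.

    Changing one person's bit changes the probability of any output by at
    most a factor of e^epsilon, so this is (epsilon, 0)-differentially private.
    """
    p_truth = math.exp(epsilon) / (math.exp(epsilon) + 1)
    return bit if random.random() < p_truth else 1 - bit

# Pr[output=1 | bit=1] / Pr[output=1 | bit=0] = p/(1-p) = e^epsilon exactly:
eps = 0.5
p = math.exp(eps) / (math.exp(eps) + 1)
assert abs(p / (1 - p) - math.exp(eps)) < 1e-9
```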

Differential Privacy: Interpretations
Def: A randomized algorithm C : X^n → Y is (ε,δ)-differentially private if for every two databases D₁, D₂ that differ on one row, and every set T ⊆ Y, Pr[C(D₁) ∈ T] ≤ exp(ε) · Pr[C(D₂) ∈ T] + δ.
– My data affects what an adversary sees by at most ε.
– Whatever an adversary learns about me, it could have learned from everyone else’s data.
– The above interpretations hold regardless of the adversary’s auxiliary information.
– Composes gracefully (k repetitions ⇒ kε-differentially private).
– But no protection for information that is not localized to a few rows.

Differential Privacy: Example
D = (x₁,…,xₙ) ∈ X^n. Goal: given π : X → {0,1}, estimate the counting query π(D) := (Σᵢ π(xᵢ))/n within error α.
Example: X = {0,1}^d, π = a conjunction on ≤ k variables; the counting query is then a k-way marginal, e.g. “What fraction of people in D smoke and have cancer?” [Illustrated with a table with columns >35?, Smoker?, Cancer?]
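
A tiny sketch (the column names are illustrative, not from the slides) of what a k-way marginal counting query computes:

```python
def marginal_query(columns):
    """Predicate pi for the conjunction of the given 0/1 columns."""
    return lambda row: int(all(row[c] for c in columns))

def counting_query(data, pi):
    """pi(D) := (1/n) * sum_i pi(x_i), the fraction of rows satisfying pi."""
    return sum(pi(x) for x in data) / len(data)

rows = [{"over35": 1, "smoker": 1, "cancer": 1},
        {"over35": 0, "smoker": 1, "cancer": 0},
        {"over35": 1, "smoker": 0, "cancer": 0}]
frac = counting_query(rows, marginal_query(["smoker", "cancer"]))  # = 1/3
```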

Differential Privacy: Example
D = (x₁,…,xₙ) ∈ X^n. Goal: estimate the (fractional) counting query π(D) := (Σᵢ π(xᵢ))/n within error α.
Solution: C(D) = π(D) + Lap(1/(εn)), where Lap(σ) has density ∝ exp(−|x|/σ).
⇒ error ≤ α provided n ≳ 1/(εα).
For k queries, it suffices to have n ≳ k/(εα), or even n ≳ k^{1/2}/(εα).
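
A minimal sketch of this Laplace mechanism (assuming a toy list-of-dicts dataset and NumPy; the column names are illustrative):

```python
import numpy as np

def laplace_counting_query(data, predicate, epsilon):
    """epsilon-DP estimate of the fraction of rows satisfying `predicate`.

    The true answer pi(D) = (1/n) * sum_i pi(x_i) changes by at most 1/n
    when one row changes, so Laplace noise of scale 1/(epsilon*n) suffices.
    """
    n = len(data)
    true_answer = sum(predicate(x) for x in data) / n
    return true_answer + np.random.laplace(loc=0.0, scale=1.0 / (epsilon * n))

# Hypothetical 2-way marginal: fraction with smoker = 1 and cancer = 1.
data = [{"smoker": 1, "cancer": 1}, {"smoker": 0, "cancer": 0},
        {"smoker": 1, "cancer": 0}, {"smoker": 1, "cancer": 1}]
estimate = laplace_counting_query(data, lambda x: x["smoker"] and x["cancer"], 0.1)
```

With n rows the added noise has standard deviation √2/(εn), which is how the n ≳ 1/(εα) bound above arises.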

Other Differentially Private Algorithms
– histograms [DMNS06]
– contingency tables [BCDKMT07, GHRU11]
– machine learning [BDMN05, KLNRS08], logistic regression [CM08]
– clustering [BDMN05]
– social network analysis [HLMJ09]
– approximation algorithms [GLMRT10]
– …

Computational Complexity
When do computational resource constraints change what is possible?
Examples:
– Computational learning theory [Valiant ‘84]: small VC dimension ⇏ learnable with efficient algorithms (bad news).
– Cryptography [Diffie & Hellman ‘76]: don’t need long shared secrets against a computationally bounded adversary (good news).

This Talk: Computational Complexity in Differential Privacy
Q: Do computational resource constraints change what is possible?
Computationally bounded curator:
– Makes differential privacy harder.
– Differentially private & accurate synthetic data is infeasible to construct.
– Open: release other types of summaries/models?
Computationally bounded adversary:
– Makes differential privacy easier.
– Provable gain in accuracy for 2-party protocols (e.g. for estimating Hamming distance).

A More Ambitious Goal: Noninteractive Data Release
[Diagram: original database D → curator C → sanitization C(D).]
Goal: from C(D), one can answer many questions about D, e.g. all counting queries associated with a large family of predicates P = {π : X → {0,1}}.

Noninteractive Data Release: Desiderata
– (ε,δ)-differential privacy: for every D₁, D₂ that differ in one row and every set T, Pr[C(D₁) ∈ T] ≤ exp(ε) · Pr[C(D₂) ∈ T] + δ, with δ negligible.
– Utility: C(D) allows answering many questions about D.
– Computational efficiency: C is polynomial-time computable.

Utility: Counting Queries
D = (x₁,…,xₙ) ∈ X^n, P = {π : X → {0,1}}. For any π ∈ P, we want to estimate (from C(D)) the counting query π(D) := (Σᵢ π(xᵢ))/n within accuracy error α.
Example: X = {0,1}^d, P = {conjunctions on ≤ k variables}; counting queries are then k-way marginals, e.g. “What fraction of people in D smoke and have cancer?” [Table columns: >35?, Smoker?, Cancer?]

Form of Output
Ideal: C(D) is a synthetic dataset [table columns: >35?, Smoker?, Cancer?] with
– ∀π ∈ P, |π(C(D)) − π(D)| ≤ α
– Values consistent
– Can use existing software
Alternatives?
– Explicit list of |P| answers (e.g. a contingency table)
– Median of several synthetic datasets [RR10]
– A program M s.t. ∀π ∈ P, |M(π) − π(D)| ≤ α

Positive Results

reference | min. database size (general P) | min. database size (k-way marginals) | synthetic? | complexity (general P) | complexity (k-way marginals)
[DN03, DN04, BDMN05] | O(|P|^{1/2}/ε) | O(d^{k/2}/ε) | N | poly(n, |P|) | poly(n, d^k)
[BDCKMT07] | — | Õ((2d)^k/ε) | Y | — | poly(n, 2^d)
[BLR08] | O(d·log|P|/(εα³)) | Õ(dk/(εα³)) | Y | qpoly(n, |P|, 2^d) | qpoly(n, 2^d)
[DNRRV09, DRV10, HR10] | O(d·log²|P|/(εα²)) | Õ(dk²/(εα²)) | Y | poly(n, |P|, 2^d) | —

Here D = (x₁,…,xₙ) ∈ ({0,1}^d)^n, P = {π : {0,1}^d → {0,1}}, π(D) := (1/n)·Σᵢ π(xᵢ), α = accuracy error, ε = privacy.

Summary: one can construct synthetic databases that are accurate on huge families of counting queries, but the complexity may be exponential in the dimension of the data and of the query set P.
Question: is the complexity inherent?
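
For a feel of where the inefficiency comes from, here is a toy sketch (my own, under heavy simplifying assumptions: single-bit rows, a handful of queries) of the exponential-mechanism idea behind [BLR08]: score every candidate small synthetic database by its worst-case query error and sample one with probability ∝ exp(−εn·score/2). The candidate set is exponential in the row dimension in general, which is exactly the bottleneck the next slides address.

```python
import itertools
import math
import random

def blr_style_synthetic_db(data, queries, epsilon, m=3):
    """Toy exponential mechanism over all synthetic DBs of m single-bit rows.

    score(S) = worst error of S over the query family; sampling S with
    weight exp(-epsilon * n * score / 2) is differentially private, but
    enumerating the candidates is exponential in the row dimension.
    """
    n = len(data)
    truth = [sum(q(x) for x in data) / n for q in queries]
    candidates = list(itertools.product([0, 1], repeat=m))
    def score(S):
        return max(abs(sum(q(x) for x in S) / m - t)
                   for q, t in zip(queries, truth))
    weights = [math.exp(-epsilon * n * score(S) / 2) for S in candidates]
    return random.choices(candidates, weights=weights, k=1)[0]

# Example with one-bit rows and two queries: identity and negation.
db = [1, 1, 0, 1, 1, 1, 0, 1]
synthetic = blr_style_synthetic_db(db, [lambda x: x, lambda x: 1 - x], epsilon=1.0)
```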

Our Result: Intractability of Synthetic Data
Informally: producing accurate & differentially private synthetic data is as hard as breaking cryptography (e.g. factoring large integers), and is inherently exponential in the dimensionality of the data (and in the dimensionality of the queries).

Our Result: Intractability of Synthetic Data
Thm [UV11, building on DNRRV10]: Under standard crypto assumptions (the existence of one-way functions), there is no n = poly(d) and curator that:
– Produces synthetic databases,
– Is differentially private,
– Runs in time poly(n,d), and
– Achieves error α = .01 for 2-way marginals.
Proof overview:
1. Use digital signatures to show hardness for a complicated (but efficiently computable) family of counting queries [DNRRV10].
2. Use PCPs to reduce complicated queries to simple ones [UV11].

Tool 1: Digital Signature Schemes
A digital signature scheme consists of 3 poly-time algorithms (Gen, Sign, Ver):
– On security parameter d, Gen(d) = (SK,PK) ∈ {0,1}^d × {0,1}^d.
– On m ∈ {0,1}^d, one can compute σ = Sign_SK(m) ∈ {0,1}^d s.t. Ver_PK(m,σ) = 1.
– Given many (m,σ) pairs, it is infeasible to generate a new (m′,σ′) satisfying Ver_PK.
Thm [NY89, Rom90]: Digital signature schemes exist iff one-way functions exist.
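
A sketch of the (Gen, Sign, Ver) interface, instantiated purely for illustration (the theorem needs only one-way functions) with Ed25519 from the `cryptography` package:

```python
from cryptography.exceptions import InvalidSignature
from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey

# Gen: sample a key pair (SK, PK).
sk = Ed25519PrivateKey.generate()
pk = sk.public_key()

# Sign: sigma = Sign_SK(m).
m = b"some d-bit message"
sigma = sk.sign(m)

# Ver: Ver_PK(m, sigma) = 1 iff the signature checks out.
def ver(pk, m, sigma):
    try:
        pk.verify(sigma, m)
        return True
    except InvalidSignature:
        return False

assert ver(pk, m, sigma)
assert not ver(pk, b"a different message", sigma)
```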

Hard-to-Sanitize Databases I [DNRRV10]
Generate random (PK,SK) ← Gen(d) and random messages m₁, m₂, …, mₙ ∈ {0,1}^d. Let D be the database whose rows are (mᵢ, Sign_SK(mᵢ)), so that Ver_PK evaluates to 1 on every row of D.
Suppose the curator outputs C(D) with rows (m′₁,σ₁), …, (m′ₖ,σₖ). If C has error ≤ α w.r.t. the query Ver_PK, then Ver_PK(C(D)) ≥ 1 − α > 0, so ∃i with Ver_PK(m′ᵢ,σᵢ) = 1.
– Case 1: m′ᵢ ∉ D ⇒ forgery!
– Case 2: m′ᵢ ∈ D ⇒ reidentification!
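
Continuing the sketch above (same hypothetical sk, pk, and ver), the construction and case analysis look roughly like this:

```python
import os

# Rows of the hard database are (message, signature) pairs, so the
# counting query "fraction of rows accepted by Ver_PK" equals 1 on D.
n = 100
messages = [os.urandom(16) for _ in range(n)]
D = [(mi, sk.sign(mi)) for mi in messages]

def case_analysis(C_of_D):
    """Any accurate sanitization must contain a verifying row, which is
    either a copied row (breaking differential privacy) or a fresh
    verifying pair (breaking the signature scheme)."""
    for (m_i, sigma_i) in C_of_D:
        if ver(pk, m_i, sigma_i):
            if m_i in messages:
                return "Case 2: row copied from D -- reidentification!"
            return "Case 1: fresh verifying pair -- forgery!"
    return "C(D) has large error on the Ver_PK query"
```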

Tool 2: Probabilistically Checkable Proofs
The PCP Theorem: there exist efficient algorithms (Red, Enc, Dec) s.t.:
– Red maps a circuit V of size poly(d) to a set of 3-clauses Φ_V = {x₁ ∨ x₅ ∨ ¬x₇, ¬x₁ ∨ x₅ ∨ x_{d′}, …} on d′ = poly(d) variables.
– Enc maps any w s.t. V(w) = 1 to a z ∈ {0,1}^{d′} satisfying all of Φ_V.
– Dec maps any z′ ∈ {0,1}^{d′} satisfying a .99 fraction of Φ_V to a w′ s.t. V(w′) = 1.

Hard-to-Sanitize Databases II [UV11]
Let Φ_PK = Red(Ver_PK). Generate signed pairs (mᵢ, Sign_SK(mᵢ)) as before, and let zᵢ = Enc(mᵢ, Sign_SK(mᵢ)); then every clause in Φ_PK is satisfied by all of the zᵢ. Let D be the database with rows z₁, …, zₙ.
Suppose the curator outputs C(D) with rows z′₁, …, z′ₖ. If C has error ≤ .01 w.r.t. 3-way marginals, then each clause in Φ_PK is satisfied by a ≥ .99 fraction of the z′ᵢ, so (by averaging) ∃i s.t. z′ᵢ satisfies a ≥ 1 − α fraction of the clauses. Then Dec(z′ᵢ) yields a valid pair (m′ᵢ,σᵢ).
– Case 1: m′ᵢ ∉ D ⇒ forgery!
– Case 2: m′ᵢ ∈ D ⇒ reidentification!

Conclusions
Producing private, synthetic databases that preserve simple statistics requires computation exponential in the dimension of the data. How to bypass?
– Average-case accuracy: heuristics that don’t give good accuracy on all databases, only on those from some class of models.
– Non-synthetic data:
  – Hardness extends to some generalizations of synthetic data (e.g. medians of synthetic data).
  – Thm [DNRRV09]: for unnatural but efficiently computable P, ∃ efficient curators “iff” ∃ efficient “traitor-tracing” schemes.
  – But for natural P (e.g. P = {all marginals}), wide open!

Computational Differential Privacy
Differential privacy protects even against adversaries with unlimited computational power. Can we gain by restricting to adversaries with bounded (but still huge) computational power?
– Better accuracy/utility?
– There has been enormous success in cryptography from considering computationally bounded adversaries.

Computational Differential Privacy: State of Affairs
– Implicit in works on distributed implementations of differential privacy [DKMMN06, BKO08].
– Formal definitions in [MPRV09], related to “dense model theorems” in additive combinatorics & pseudorandomness.
– Provable gains in accuracy for 2-party protocols:
  – Computational differential privacy achieves error O(1/ε) [MPRV09].
  – (Information-theoretic) 2-party differential privacy requires error Θ̃(n^{1/2}) [MMPRTV10].
  – Info-theoretic 2-party DP is related to communication complexity and randomness extraction [MMPRTV10].
– Little gain for the client/server setting [GKK11].

2-Party Privacy
2-party (& multiparty) privacy: each party has a sensitive dataset (D_A = (x₁,…,xₙ) and D_B = (y₁,…,y_m)), and they want to do a joint computation of f(D_A,D_B) by exchanging messages m₁, m₂, …, mₖ, ending with outputs Z_A ≈ f(D_A,D_B) and Z_B ≈ f(D_A,D_B).
A’s view should be a (computationally) differentially private function of D_B (even if A deviates from the protocol), and vice versa.
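
To see the kind of task where the gap appears, here is a toy sketch (my own, not the slides' protocol) of the information-theoretic side for Hamming distance: each party publishes an ε-DP randomized-response copy of its string, and the distance is debiased from the noisy copies. The residual sampling error scales like √n, matching the Θ̃(n^{1/2}) lower bound above, whereas a computationally DP protocol built on secure computation can achieve error O(1/ε).

```python
import math
import random

def rr_release(bits, epsilon):
    """Publish an epsilon-DP randomized-response copy of a bit string."""
    p = math.exp(epsilon) / (math.exp(epsilon) + 1)
    return [b if random.random() < p else 1 - b for b in bits]

def estimate_hamming(noisy_a, noisy_b, epsilon):
    """Debiased Hamming-distance estimate from the two noisy strings.

    Pr[disagree] = q + (1 - 2q) * dist/n with q = 2p(1-p), so we invert
    that relation; the sampling noise leaves Theta(sqrt(n))-scale error.
    """
    p = math.exp(epsilon) / (math.exp(epsilon) + 1)
    q = 2 * p * (1 - p)
    n = len(noisy_a)
    disagree = sum(x != y for x, y in zip(noisy_a, noisy_b))
    return (disagree / n - q) / (1 - 2 * q) * n

x = [random.randint(0, 1) for _ in range(10000)]
y = [random.randint(0, 1) for _ in range(10000)]
est = estimate_hamming(rr_release(x, 1.0), rr_release(y, 1.0), epsilon=1.0)
```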

Conclusions
Computational complexity is relevant to differential privacy.
– Bad news: producing synthetic data is intractable.
– Good news: better protocols against bounded adversaries.
The interaction with differential privacy is likely to benefit complexity theory too.