1 Privacy-Preserving Distributed Information Sharing Nan Zhang and Wei Zhao Texas A&M University, USA.

2 Outline Motivation Dealing with malicious adversaries Existing and new protocols Conclusion

3 Information Sharing between Autonomous Entities Problem definition Knowledge

4 Example The supplier has a product list (Secret Weapon I, Secret Weapon II, Secret Weapon III, Secret Weapon IV, Secret Weapon V, …); the consumer has a shopping list (Secret Weapon I, Secret Weapon V, Dream Machine, Cancer Medicine, Perpetual Machine, …). A contract covers their intersection (Secret Weapon I, Secret Weapon V, …), and both lists are SECRET.

5 Privacy Concern [ 2002] Privacy laws Countries with enacted or pending omnibus privacy laws HIPAA Health Insurance Portability and Accountability Act

6 Privacy-Preserving Information Sharing Sharing information across private databases without violating each party’s privacy.

7 Objectives To ensure accuracy of information sharing results To guarantee privacy of each party How do we measure accuracy and privacy?

8 Measurement of Accuracy Traditional measure of accuracy: 1 if all parties obtain correct information sharing results, 0 otherwise. We measure accuracy by the expected value of this traditional measure: l_a, the probability that all parties obtain correct information sharing results. [Figure: the unit interval split into a segment of length l_a (the protocol accomplishes sharing) and a segment of length 1 − l_a (the protocol fails).]

9 Measurement of Privacy Disclosure Traditional measure in cryptography: 0 if no privacy disclosure, 1 otherwise. Our measure in information sharing: l_p, the percentage of private information compromised. [Figure: the unit interval split into a segment of length l_p (disclosed) and a segment of length 1 − l_p (undisclosed).]
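The two continuous measures above can be sketched in a few lines of Python (a minimal illustration; the function names are ours, not the slides'):

```python
def l_a(outcomes):
    """Continuous accuracy: expected value of the traditional 0/1 measure,
    i.e. the fraction of runs in which all parties obtained correct results."""
    return sum(outcomes) / len(outcomes)

def l_p(compromised, private_items):
    """Continuous privacy disclosure: the percentage of private information
    compromised, rather than an all-or-nothing 0/1 flag."""
    return len(set(compromised) & set(private_items)) / len(set(private_items))
```

With these definitions, a protocol run that leaks one of four private records scores l_p = 0.25 instead of the traditional 1.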

10 Baseline Architecture With trusted third party Without trusted third party TTP

11 System Architecture [Figure: each party runs a local processing module over its database; the parties communicate through the Internet.]

12 External Attacks [Figure: an external attacker on the Internet targets the parties.] Defense against these attacks can rely on traditional system security measures.

13 Internal Attacks [Figure: one of the internal parties acts as the adversary.]

14 Semi-honest Adversaries A semi-honest adversary targets the private information of the other party, but properly follows the protocol and merely records intermediate computation and communication: a passive attack.

15 Protocols Against Semi-honest Adversaries Almost all existing protocols Can be efficient Unrealistic assumption: semi-honest

16 Malicious Adversaries A malicious adversary targets the private information of the other party and can do whatever it wants: it may revise its local processing module and/or alter its inputs, i.e., an active attack.

17 Protocols Against Malicious Adversaries A few protocols exist, each with its own restrictions, and they are inefficient.

18 A Dilemma The semi-honest model is UNREALISTIC; the malicious model is TOO DIFFICULT.

19 Our Goal: Defend Against Malicious Adversaries Effectively and Efficiently But how?

20 Our Approach I Generalization of privacy & accuracy measures: the continuous accuracy measure l_a and the continuous privacy measure l_p introduced earlier.

21 Our Approach II Classification of malicious adversaries Behavior Priority

22 Outline Motivation Dealing with malicious adversaries Existing and new protocols Conclusion

23 Classification of Adversaries Priority of adversary: –To obtain the private information of other parties –To accomplish information sharing

24 Adversaries that Care About Information Sharing Suppose the consumer also needs Secret Weapon IV. By inserting Secret Weapon IV into the shopping list it submits, the consumer learns from the intersection that the supplier stocks it: a PRIVACY BREACH against the supplier.

25 Adversaries that Care About Information Sharing Altering the submitted list can corrupt the results, and an adversary may be penalized if some parties cannot obtain the accurate information sharing results.

26 Priority of Adversary An adversary takes either information sharing or privacy intrusion as its first priority.

27 Measure of Adversary's Priority Priority is measured by a parameter θ, such that the goal of the adversary is to maximize u = (1 − θ) l_a + θ l_p. l_a: {0,1}, probability that all parties obtain correct information sharing results. l_p: [0,1], percentage of other parties' private information that is compromised by the adversary.

28 Classification of Malicious Adversaries by Their Priority Under u = (1 − θ) l_a + θ l_p: honest, θ = 0; weakly malicious, 0 < θ < 1/2 (information sharing as the first priority); strongly malicious, 1/2 ≤ θ ≤ 1 (privacy intrusion as the first priority).
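The utility function and the resulting classification can be sketched as follows (we write the priority parameter as theta; a minimal illustration, not the paper's code):

```python
def utility(theta, l_a, l_p):
    """Adversary's objective u = (1 - theta) * l_a + theta * l_p,
    where theta encodes how much the adversary values privacy intrusion."""
    return (1 - theta) * l_a + theta * l_p

def classify(theta):
    """Honest: theta = 0; weakly malicious: 0 < theta < 1/2;
    strongly malicious: 1/2 <= theta <= 1."""
    if theta == 0:
        return "honest"
    if theta < 0.5:
        return "weakly malicious"
    return "strongly malicious"
```

A weakly malicious adversary (theta < 1/2) gains more from a successful sharing round than from any intrusion that would make the round fail, which is exactly the lever the defenses below exploit.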

29 Adversary Space Behavior Priority Semi-honest Weakly Malicious Strongly Malicious

30 Outline Problem definition Dealing with malicious adversaries Existing and new protocols Conclusion

31 Protocol DE (Double Encryption) Existing protocol [R. Agrawal et al., 2003] for the intersection of two datasets. Basic idea: both parties encrypt both datasets with commutative encryption; doubly encrypted values that match identify A ∩ B.

32 Protocol DE Input: datasets A (Alice, 8 items) and B (Bob, 10 items). Output: A ∩ B. [Figure: each party encrypts its dataset, the other party re-encrypts it, and the doubly encrypted values, kept in the same order, are compared.]
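The commutative-encryption idea behind Protocol DE can be sketched with Pohlig–Hellman-style exponentiation (a toy illustration, not the paper's implementation; the prime, the hashing, and the key-generation rule are our assumptions):

```python
import hashlib
import secrets
from math import gcd

P = 2**127 - 1  # a Mersenne prime; a large safe prime would be preferable in practice

def keygen():
    # A commutative-cipher key: a random exponent invertible mod P - 1.
    while True:
        e = secrets.randbelow(P - 3) + 2
        if gcd(e, P - 1) == 1:
            return e

def h(item):
    # Hash items into the group (collisions and 0 are negligible for a sketch).
    return int.from_bytes(hashlib.sha256(item.encode()).digest(), "big") % P

def enc(key, x):
    # E_k(x) = x^k mod P; commutative: enc(a, enc(b, x)) == enc(b, enc(a, x)).
    return pow(x, key, P)

def de_intersection(A, B):
    """Double-encryption set intersection (semi-honest sketch)."""
    a, b = keygen(), keygen()  # Alice's and Bob's secret keys
    A = list(A)
    # Alice encrypts her set; Bob re-encrypts it, preserving the order.
    doubly_A = [enc(b, enc(a, h(x))) for x in A]
    # Bob encrypts his set; Alice re-encrypts it.
    doubly_B = {enc(a, enc(b, h(y))) for y in B}
    # Matching doubly encrypted values reveal the intersection to Alice.
    return {x for x, c in zip(A, doubly_A) if c in doubly_B}
```

Because exponentiation commutes, an element in both sets produces the same doubly encrypted value on both sides, while all other values stay hidden behind the secret exponents.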

33 Protocol TPS (Trust the Party with the Smallest Dataset) Our new protocol I. Basic idea: trust the party whose dataset is smaller (e.g., size 8 rather than size 10).

34 Assumptions The distribution of the number of data points of each party is known by all parties For the sake of simplicity, we assume that both parties have the same distribution

35 Protocol TPS Input: datasets A, B. Output: A ∩ B. [Figure: Alice, holding the smaller dataset (8 items), is trusted, and the encrypted exchange proceeds as in Protocol DE.]

36 Protocol RPL (Reject Parties with Too Large a Dataset) Our new protocol II. Basic idea: reject parties whose datasets are larger than a threshold set by the honest parties.

37 Protocol RPL Input: datasets A (8 items) and B (10 items). Output: A ∩ B. [Figure: before the encrypted exchange, each party checks the other's dataset size against its threshold: "Is 10 too large?" / "Is 8 too large?"]
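The RPL size check might look like the following sketch. The specific threshold rule, mean plus two standard deviations of the publicly known size distribution, is our illustrative choice, not the paper's:

```python
import statistics

def size_threshold(size_samples, tolerance=2.0):
    """Threshold an honest party derives from the (publicly known)
    distribution of dataset sizes; the mean + tolerance * stdev rule
    here is an illustrative assumption."""
    mu = statistics.mean(size_samples)
    sigma = statistics.stdev(size_samples)
    return mu + tolerance * sigma

def rpl_accept(claimed_size, threshold):
    """Reject a party whose claimed dataset size exceeds the threshold."""
    return claimed_size <= threshold
```

A weakly malicious adversary that inflates its dataset far beyond typical sizes is then rejected before any encrypted data is exchanged.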

38 Performance: Efficiency [Chart: communication overhead of DE (4|V_0|), TPS (3|V_0|), and RPL (2|V_0|), plotted against the lower bound to be secure against semi-honest adversaries and the lower bound to be secure against weakly malicious adversaries.]

39 Performance: Defense Against Weakly Malicious Adversaries [Charts for DE, TPS, and RPL as functions of |V|: privacy disclosure l_p(s_A, s_D0) (%), the percentage of data compromised by the adversary, and accuracy l_a(s_A, s_D0) (%), the probability that all parties obtain accurate information sharing results.]

40 Defense Against Strongly Malicious Adversaries: Performance Evaluation [Charts as functions of |V|: privacy disclosure l_p(s_A, s_D0) (%) and accuracy l_a(s_A, s_D0) (%) for DE, TPS, and RPL at several values (10, 2, …) of the system parameter, the penalty/benefit ratio of a privacy intrusion attack.]

41 Outline Problem definition Dealing with malicious adversaries Existing and new protocols Conclusion

42 Final Remarks Simple and efficient protocols exist if we –Adopt the continuous measure of privacy disclosure –Constrain the adversary goal to be weakly malicious Future work –Additional set operation protocols –Multiple correlated attacks

43 Q&A Thank you

44 Backup Slides

45 Weakly and Strongly Malicious Under u = (1 − θ) l_a + θ l_p: honest, θ = 0; weakly malicious, 0 < θ < 1/2 (information sharing as first priority); strongly malicious, 1/2 ≤ θ ≤ 1 (privacy intrusion as first priority). If successful intrusion ⇒ failed information sharing, then a weakly malicious adversary will not perform the intrusion.

46 Adversary Classification Adversaries are either semi-honest or malicious; malicious adversaries are further classified as weakly or strongly malicious.

47 Defense Against Weakly Malicious Adversaries: Methodology Goal of the adversary: maximize u = (1 − θ) l_a + θ l_p; weakly malicious means θ < 1/2. The optimal strategy s_A of a weakly malicious adversary is to alter its dataset to some V_1′ such that V_1 ⊆ V_1′. Recall: if successful intrusion ⇒ failed information sharing, then the adversary will not perform the intrusion.
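The attack this methodology targets can be sketched as follows: the adversary keeps its true input V_1 inside an enlarged V_1′, so sharing still succeeds while the extra items probe the victim's private set (function names are ours):

```python
def inflate_input(v1, probes):
    """Weakly malicious strategy: submit V1' with V1 a subset of V1',
    so the true intersection result is still obtained."""
    return set(v1) | set(probes)

def leaked(victim_set, v1_prime, v1):
    """Probe items that show up in the intersection are compromised:
    the adversary learns they are in the victim's private set."""
    return (set(v1_prime) - set(v1)) & set(victim_set)
```

This is why the defenses aim to make any such enlargement either detectable (RPL's size check) or self-defeating (the dilemma on the next slide).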

48 Basic Idea of Defense Against Weakly Malicious Adversaries Give them a dilemma: no intrusion ⇒ successful information sharing, while intrusion ⇒ failed information sharing. Recall that if successful intrusion ⇒ failed information sharing, then a weakly malicious adversary will not perform the intrusion.

49 Defense Against Strongly Malicious Adversaries We have to sacrifice some honest parties, because we cannot distinguish them from strongly malicious adversaries. [Figure: two identical-looking groups of parties (Alice, Eve, …, Justin), one strongly malicious and one honest; which is which?]

50 Privacy Disclosure w/ Weakly Malicious Adversaries When an honest party takes the strategy s_D0 of strictly following the protocol, l_p(s_A, s_D0) ≤ Pr{v ∈ V_0 | v ∈ V}/|V|.

51 Defense Against Strongly Malicious Adversaries: Methodology Nash equilibrium: a pair of attack and defensive strategies {s_A, s_D} such that whoever unilaterally moves away from its strategy pays a penalty. Thus, we can consider {s_A, s_D} as the set of strategies taken by rational parties.

52 Strategies Strongly malicious adversary: a large V_1′ means high risk, high payoff; a small V_1′ means low risk, low payoff. Honest party: a large tolerable V_1′ gives low privacy but high accuracy; a low tolerable V_1′ gives high privacy but low accuracy.

53 Communication Overhead Lower bound to be secure against semi-honest adversaries: (|V_0′| + |V_1′|) log(|V|). Lower bound to be secure against weakly malicious adversaries: 2(|V_0′| + |V_1′|) log(|V|). Protocol A: (|V_0′| + |V_1′| + min(|V_0′|, |V_1′|)) log(|V|).