
1 Privacy-Preserving Distributed Information Sharing. Nan Zhang and Wei Zhao, Texas A&M University, USA

2 Outline
– Motivation
– Dealing with malicious adversaries
– Existing and new protocols
– Conclusion

3 Information Sharing between Autonomous Entities (problem definition)

4 Example
– Supplier: product list (Secret Weapon I, Secret Weapon II, Secret Weapon III, Secret Weapon IV, Secret Weapon V, …)
– Consumer: shopping list (Secret Weapon I, Secret Weapon V, Dream Machine, Cancer Medicine, Perpetual Machine, …)
– The intersection (Secret Weapon I, Secret Weapon V, …) becomes the contract; both lists are SECRET.

5 Privacy Concern [www.privacy.org, 2002]
– Privacy laws: many countries have enacted or pending omnibus privacy laws
– Example in the US: HIPAA, the Health Insurance Portability and Accountability Act

6 Privacy-Preserving Information Sharing
Sharing information across private databases without violating each party's privacy.

7 Objectives
– To ensure the accuracy of information sharing results
– To guarantee the privacy of each party
How do we measure accuracy and privacy?

8 Measurement of Accuracy
– Traditional measure: 1 if all parties obtain correct information sharing results, 0 otherwise
– Our measure: the expected value of the traditional measure, i.e., the probability l_a that all parties obtain correct information sharing results (the sharing accomplishes with probability l_a and fails with probability 1 − l_a)

9 Measurement of Privacy Disclosure
– Traditional measure in cryptography: 0 if there is no privacy disclosure, 1 otherwise
– Our measure in information sharing: the percentage l_p of private information compromised (a fraction l_p is disclosed and 1 − l_p remains undisclosed)
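
To make the continuous measures concrete, here is a minimal Python sketch; the function names and the estimation-by-repeated-runs framing are assumptions for illustration, not from the slides.

```python
# Minimal sketch of the continuous measures (names are assumptions).

def accuracy_l_a(runs):
    """l_a: probability that all parties obtain correct information
    sharing results, estimated over repeated runs. `runs` is a list of
    booleans, one per run, True iff every party got the correct result."""
    return sum(runs) / len(runs)

def privacy_disclosure_l_p(private_items, compromised_items):
    """l_p: percentage of private information compromised, replacing
    the all-or-nothing 0/1 measure from cryptography."""
    return len(set(compromised_items) & set(private_items)) / len(private_items)

print(accuracy_l_a([True, True, False, True]))              # 0.75
print(privacy_disclosure_l_p({"a", "b", "c", "d"}, {"b"}))  # 0.25
```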

10 Baseline Architecture
– With a trusted third party (TTP)
– Without a trusted third party

11 System Architecture
[Diagram: each party runs a local processing module over its database and communicates with the other party via the INTERNET.]

12 External Attacks
Attacks from outside the parties' systems; these can be defended against using traditional system security measures.

13 Internal Attacks
An internal party acts as the adversary.

14 Semi-honest Adversaries
Goal: the private information of the other party.
– Properly follow the protocol
– Record intermediate computation and communication
– Passive attack

15 Protocols Against Semi-honest Adversaries
– Almost all existing protocols assume semi-honest adversaries
– Can be efficient
– But the semi-honest assumption is unrealistic

16 Malicious Adversaries
Goal: the private information of the other party.
– Can do whatever they want
– May revise the local processing module and/or alter inputs
– Active attack

17 Protocols Against Malicious Adversaries
– Only a few protocols exist, each with specific restrictions
– Inefficient

18 A Dilemma
– Semi-honest adversaries: an UNREALISTIC assumption
– Malicious adversaries: TOO DIFFICULT to defend against efficiently

19 Our Goal: Defend Against Malicious Adversaries Effectively and Efficiently. But how?

20 Our Approach I: Generalization of privacy and accuracy measures
– Continuous accuracy measure l_a (recall slide 8)
– Continuous privacy measure l_p (recall slide 9)

21 Our Approach II: Classification of malicious adversaries, by behavior and by priority

22 Outline
– Motivation
– Dealing with malicious adversaries
– Existing and new protocols
– Conclusion

23 Classification of Adversaries
Priority of the adversary, between two goals:
– To obtain the private information of other parties
– To accomplish information sharing

24 Adversaries that Care About Information Sharing
Supplier: product list (Secret Weapon I, Secret Weapon II, Secret Weapon III, Secret Weapon IV, Secret Weapon V, …). Consumer: shopping list (Secret Weapon I, Secret Weapon V, Dream Machine, Cancer Medicine, Perpetual Machine, …).
The consumer needs Secret Weapon IV, so it inserts Secret Weapon IV into its input; the intersection (Secret Weapon IV, Secret Weapon I, Secret Weapon V, …) reveals that the supplier carries it: a PRIVACY BREACH.

25 Adversaries that Care About Information Sharing (continued)
The altered input also corrupts the result: Secret Weapon IV now appears in both parties' intersection output. An adversary may be penalized if some parties cannot obtain the accurate information sharing results.

26 Priority of Adversary
The adversary's priority ranges between two extremes:
– Information sharing as the first priority
– Privacy intrusion as the first priority

27 Measure of Adversary's Priority
Priority is measured by a parameter θ, such that the goal of the adversary is to maximize u = (1 − θ) l_a + θ l_p, where
– l_a ∈ {0, 1}: whether all parties obtain correct information sharing results
– l_p ∈ [0, 1]: percentage of the other parties' private information that is compromised by the adversary

28 Classification of Malicious Adversaries by Their Priority
With u = (1 − θ) l_a + θ l_p:
– Honest: θ = 0
– Weakly malicious: 0 < θ < 1/2 (information sharing as the first priority)
– Strongly malicious: 1/2 ≤ θ ≤ 1 (privacy intrusion as the first priority)
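
A minimal sketch of the priority measure and classification, assuming the symbol garbled in the slides is a parameter θ (called theta below):

```python
def utility(theta, l_a, l_p):
    """Adversary's utility u = (1 - theta) * l_a + theta * l_p."""
    return (1 - theta) * l_a + theta * l_p

def classify(theta):
    """Classify an adversary by its priority parameter."""
    if theta == 0:
        return "honest"
    return "weakly malicious" if theta < 0.5 else "strongly malicious"

# A weakly malicious adversary prefers successful sharing with no
# intrusion (l_a=1, l_p=0) over a successful intrusion that breaks the
# sharing (l_a=0, l_p=1); a strongly malicious adversary does not.
assert utility(0.3, 1, 0) > utility(0.3, 0, 1)  # weakly malicious
assert utility(0.8, 1, 0) < utility(0.8, 0, 1)  # strongly malicious
```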

29 Adversary Space
The space of adversaries spans two axes: behavior (semi-honest vs. malicious) and priority (weakly malicious vs. strongly malicious).

30 Outline
– Problem definition
– Dealing with malicious adversaries
– Existing and new protocols
– Conclusion

31 Protocol DE (Double Encryption): Existing Protocol [R. Agrawal et al., 2003]
– For the intersection of two datasets
– Basic idea: an item a is first encrypted by one party and then by the other; because the encryption is commutative, the doubly encrypted values from both parties can be compared directly.

32 Protocol DE
Input: datasets A (Alice, size 8) and B (Bob, size 10). Output: A ∩ B.
Each party encrypts its own dataset and then the other party's already-encrypted dataset, returning items in the same order as received so matches can be mapped back to plaintexts; the doubly encrypted datasets are compared to obtain A ∩ B.
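
The slides give only the picture, so here is a toy Python sketch of the double-encryption idea using commutative exponentiation. The modulus, the hash-to-group step, and running both parties inside one function are simplifying assumptions; this is not a secure or faithful implementation of [R. Agrawal et al., 2003].

```python
import hashlib
import secrets

# Toy commutative encryption (Pohlig-Hellman style): E_k(x) = x^k mod P.
# Since (x^a)^b == (x^b)^a (mod P), the two encryption layers commute.
P = 2**127 - 1  # a Mersenne prime; illustrative only, NOT cryptographically sound

def hash_to_group(item: str) -> int:
    return int.from_bytes(hashlib.sha256(item.encode()).digest(), "big") % P

def enc(x: int, k: int) -> int:
    return pow(x, k, P)

def intersect_de(A: set, B: set) -> set:
    # In the real protocol each key stays with its owner and only
    # encrypted values cross the network; here we simulate both sides.
    ka = secrets.randbelow(P - 2) + 1  # Alice's key
    kb = secrets.randbelow(P - 2) + 1  # Bob's key
    # Alice sends E_A(a); Bob adds his layer, yielding E_B(E_A(a)).
    double_a = {a: enc(enc(hash_to_group(a), ka), kb) for a in A}
    # Bob sends E_B(b); Alice adds her layer, yielding E_A(E_B(b)).
    double_b = {enc(enc(hash_to_group(b), kb), ka) for b in B}
    # Doubly encrypted values match exactly when the plaintexts match.
    return {a for a, c in double_a.items() if c in double_b}

print(intersect_de({"SW I", "SW V", "Dream Machine"}, {"SW I", "SW V"}))
```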

33 Protocol TPS (Trust the Party with the Smallest Dataset): Our New Protocol I
Basic idea: given the claimed dataset sizes (e.g., 8 and 10), trust the party with the smaller dataset.

34 Assumptions
– The distribution of the number of data points of each party is known to all parties
– For the sake of simplicity, we assume that both parties have the same distribution

35 Protocol TPS
Input: datasets A (Alice, size 8) and B (Bob, size 10). Output: A ∩ B.
The parties first exchange their dataset sizes (8 and 10); the encryption and comparison steps then proceed as in Protocol DE, with the party holding the smaller dataset trusted.
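
A tiny hedged sketch of the TPS trust rule (the deck gives no pseudocode; the rationale follows from slide 47, where the weakly malicious attack only enlarges the input):

```python
def trusted_party(size_alice: int, size_bob: int) -> str:
    """TPS rule of thumb: trust the party with the smaller claimed
    dataset. Padding an input (the weakly malicious strategy, V1 ⊆ V1')
    only makes a dataset larger, so the smaller claim is less suspect."""
    return "Alice" if size_alice <= size_bob else "Bob"

print(trusted_party(8, 10))  # Alice
```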

36 Protocol RPL (Reject Parties with a Too-Large Dataset): Our New Protocol II
Basic idea: reject parties whose datasets are larger than a threshold set by the honest parties.

37 Protocol RPL
Input: datasets A (Alice, size 8) and B (Bob, size 10). Output: A ∩ B.
Before the double-encryption steps, each party checks the other's dataset size against the threshold ("Is 10 too large?", "Is 8 too large?") and rejects the other party if it is.
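
A minimal sketch of the RPL admission check; deriving the threshold from a quantile of the known size distribution is an assumption for illustration.

```python
def size_threshold(size_distribution, quantile=0.99):
    """Pick the smallest size s such that an honest dataset has size <= s
    with probability >= quantile. `size_distribution` is a list of
    (size, probability) pairs, which slide 34 assumes is known to all."""
    cumulative = 0.0
    for size, prob in sorted(size_distribution):
        cumulative += prob
        if cumulative >= quantile:
            return size
    return max(size for size, _ in size_distribution)

def admit(claimed_size, threshold):
    """Reject an oversized dataset before any encrypted data is exchanged:
    an inflated input is the signature of a padding adversary."""
    return claimed_size <= threshold

dist = [(8, 0.5), (10, 0.45), (1000, 0.05)]
t = size_threshold(dist, quantile=0.9)
print(t, admit(10, t), admit(1000, t))  # 10 True False
```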

38 Performance: Efficiency
[Chart: communication overhead of Protocols DE, TPS, and RPL, on a scale from 2|V_0| to 4|V_0|, compared against the lower bounds for security against semi-honest adversaries and against weakly malicious adversaries.]

39 Performance: Defense Against Weakly Malicious Adversaries
[Plots for Protocols DE, TPS, and RPL, with |V| ranging from 10^2 to 10^3.5: privacy disclosure l_p(s_A, s_D0) (%), the percentage of data compromised by the adversary, and accuracy l_a(s_A, s_D0) (%), the probability that all parties obtain accurate information sharing results.]

40 Performance Evaluation: Defense Against Strongly Malicious Adversaries
[Plots of privacy disclosure l_p(s_A, s_D0) (%) and accuracy l_a(s_A, s_D0) (%), with |V| ranging from 10^2 to 10^3.5, for Protocol DE, Protocol TPS, and Protocol RPL with a system parameter (the penalty/benefit ratio of a privacy intrusion attack) set to 10, 2, and approaching 1.]

41 Outline
– Problem definition
– Dealing with malicious adversaries
– Existing and new protocols
– Conclusion

42 Final Remarks
Simple and efficient protocols exist if we
– Adopt the continuous measure of privacy disclosure
– Constrain the adversary's goal to be weakly malicious
Future work
– Additional set operation protocols
– Multiple correlated attacks

43 Q&A. Thank you

44 Backup Slides

45 Weakly and Strongly Malicious
With u = (1 − θ) l_a + θ l_p: honest means θ = 0, weakly malicious 0 < θ < 1/2 (information sharing as first priority), strongly malicious 1/2 ≤ θ ≤ 1 (privacy intrusion as first priority).
If a successful intrusion implies failed information sharing, then a weakly malicious adversary will not perform the intrusion.

46 Adversary Classification
Adversaries are either semi-honest or malicious; malicious adversaries are further divided into weakly malicious and strongly malicious.

47 Defense Against Weakly Malicious Adversaries: Methodology
Goal of the adversary: maximize u = (1 − θ) l_a + θ l_p; weakly malicious means θ < 1/2.
The optimal attack strategy s_A for a weakly malicious adversary is to alter its dataset from V_1 to some V_1′ such that V_1 ⊆ V_1′.
Recall: if a successful intrusion implies failed information sharing, then the adversary will not perform the intrusion.

48 Basic Idea of Defense Against Weakly Malicious Adversaries
Give them a dilemma: either no intrusion with successful information sharing, or intrusion with failed information sharing. Recall that if a successful intrusion implies failed information sharing, a weakly malicious adversary will not perform the intrusion, so it chooses the first option.

49 Defense Against Strongly Malicious Adversaries
We have to sacrifice some honest parties, because we cannot distinguish them from strongly malicious adversaries (an honest group and a strongly malicious group, e.g. Alice, Eve, …, Justin, can look identical).

50 Privacy Disclosure with Weakly Malicious Adversaries
When an honest party takes the strategy s_D0 of strictly following the protocol,
l_p(s_A, s_D0) ≤ Pr{v ∈ V_0 | v ∈ V} / |V|

51 Defense Against Strongly Malicious Adversaries: Methodology
Nash equilibrium: a pair of attack strategy and defensive strategy {s_A, s_D} such that whoever unilaterally moves away from its strategy pays a penalty. Thus we can consider {s_A, s_D} as the set of strategies taken by rational parties.

52 Strategies
Honest party:
– Large tolerable V_1′: low privacy, high accuracy
– Small tolerable V_1′: high privacy, low accuracy
Strongly malicious party:
– Large V_1′: high risk, high payoff
– Small V_1′: low risk, low payoff
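
To make the equilibrium idea concrete, here is a hypothetical 2×2 game over the strategies above; the payoff numbers are invented for illustration and are not from the paper.

```python
# Hypothetical payoffs (invented): each cell is (adversary, honest).
ADV = ("large V1'", "small V1'")
DEF = ("large tolerance", "small tolerance")
payoff = {
    ("large V1'", "large tolerance"): (3, 0),  # high-payoff intrusion, privacy lost
    ("large V1'", "small tolerance"): (0, 1),  # intrusion rejected, accuracy lost
    ("small V1'", "large tolerance"): (1, 2),
    ("small V1'", "small tolerance"): (1, 2),
}

def is_nash(a, d):
    """Nash equilibrium: neither side gains by deviating unilaterally
    (whoever moves away from the strategy pays the penalty)."""
    adv_ok = all(payoff[(a, d)][0] >= payoff[(x, d)][0] for x in ADV)
    def_ok = all(payoff[(a, d)][1] >= payoff[(a, y)][1] for y in DEF)
    return adv_ok and def_ok

for a in ADV:
    for d in DEF:
        print(a, d, is_nash(a, d))  # only (small V1', small tolerance) is Nash
```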

53 Communication Overhead
– Lower bound to be secure against semi-honest adversaries: (|V_0′| + |V_1′|) log(|V|)
– Lower bound to be secure against weakly malicious adversaries: 2(|V_0′| + |V_1′|) log(|V|)
– Protocol A: (|V_0′| + |V_1′| + min(|V_0′|, |V_1′|)) log(|V|)
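
For concreteness, a small calculator over the reconstructed formulas above (sizes in items, overhead in bits with log base 2; both units are assumptions):

```python
from math import log2

def bound_semi_honest(v0p, v1p, universe):
    """(|V0'| + |V1'|) * log|V|: lower bound against semi-honest adversaries."""
    return (v0p + v1p) * log2(universe)

def bound_weakly_malicious(v0p, v1p, universe):
    """2 * (|V0'| + |V1'|) * log|V|: lower bound against weakly malicious ones."""
    return 2 * (v0p + v1p) * log2(universe)

def overhead_protocol_a(v0p, v1p, universe):
    """(|V0'| + |V1'| + min(|V0'|, |V1'|)) * log|V| for 'Protocol A'."""
    return (v0p + v1p + min(v0p, v1p)) * log2(universe)

print(bound_semi_honest(8, 10, 2**20))       # 360.0
print(bound_weakly_malicious(8, 10, 2**20))  # 720.0
print(overhead_protocol_a(8, 10, 2**20))     # 520.0
```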

