
1 Information Security (CS 526), Topic 21: Data Privacy

2 What is Privacy?
–Privacy is the protection of an individual's personal information.
–Privacy concerns the rights and obligations of individuals and organizations with respect to the collection, use, retention, disclosure, and disposal of personal information.
–Privacy ≠ Confidentiality

3 OECD Privacy Principles
–1. Collection Limitation Principle: There should be limits to the collection of personal data, and any such data should be obtained by lawful and fair means and, where appropriate, with the knowledge or consent of the data subject.
–2. Data Quality Principle: Personal data should be relevant to the purposes for which they are to be used, and, to the extent necessary for those purposes, should be accurate, complete and kept up-to-date.

4 OECD Privacy Principles
–3. Purpose Specification Principle: The purposes for which personal data are collected should be specified not later than at the time of data collection, and the subsequent use limited to the fulfilment of those purposes or such others as are not incompatible with those purposes and as are specified on each occasion of change of purpose.
–4. Use Limitation Principle: Personal data should not be disclosed, made available or otherwise used for purposes other than those specified in accordance with Principle 3 except: a) with the consent of the data subject; or b) by the authority of law.

5 OECD Privacy Principles
–5. Security Safeguards Principle: Personal data should be protected by reasonable security safeguards against such risks as loss or unauthorized access, destruction, use, modification or disclosure of data.
–6. Openness Principle: There should be a general policy of openness about developments, practices and policies with respect to personal data. Means should be readily available of establishing the existence and nature of personal data, and the main purposes of their use, as well as the identity and usual residence of the data controller.

6 OECD Privacy Principles
–7. Individual Participation Principle: An individual should have the right: a) to request to know whether or not the data controller has data relating to him; b) to request data relating to him, …; c) to be given reasons if a request is denied; and d) to request the data to be rectified, completed or amended.
–8. Accountability Principle: A data controller should be accountable for complying with measures which give effect to the principles stated above.

7 Areas of Privacy
–Anonymity: anonymous communication, e.g., the Tor software to defend against traffic analysis.
–Web privacy: understand and control what web sites collect and maintain about personal data.
–Mobile data privacy, e.g., location privacy.
–Privacy-preserving data usage.

8 Privacy Preserving Data Sharing
The need to share data:
–For research purposes, e.g., social, medical, technological.
–Mandated by laws and regulations, e.g., census.
–For security/business decision making, e.g., network flow data for Internet-scale alert correlation.
–For system testing before deployment.
–…
However, publishing data may result in privacy violations.

9 GIC Incident [Sweeney 2002]
Group Insurance Commission (GIC, Massachusetts):
–Collected patient data for ~135,000 state employees.
–Gave the data to researchers and sold it to industry.
–The medical record of the former state governor was identified.
The released table contained quasi-identifiers plus a sensitive attribute:
  Age  Sex  Zip code  Disease
  69   M    47906     Cancer
  65   M    47907     Cancer
  52   F    47902     Flu
  43   F    46204     Gastritis
  42   F    46208     Hepatitis
  47   F    46203     Bronchitis
Linking the quasi-identifiers with an external list of names (Bob, Carl, Daisy, Emily, Flora, Gabriel) ties names to records.
Re-identification occurs!
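The linkage attack behind this incident can be sketched in a few lines. A minimal illustration follows: join a "de-identified" medical table with a public name list on the shared quasi-identifiers. The rows reuse the example values above and are illustrative, not actual GIC or voter data.

```python
# Sketch of a linkage (re-identification) attack: join a "de-identified"
# medical table with a public list on quasi-identifiers (age, sex, zip).
# All rows are illustrative, not actual GIC or voter data.
medical = [
    {"age": 69, "sex": "M", "zip": "47906", "disease": "Cancer"},
    {"age": 65, "sex": "M", "zip": "47907", "disease": "Cancer"},
    {"age": 52, "sex": "F", "zip": "47902", "disease": "Flu"},
]
voters = [
    {"name": "Bob",   "age": 69, "sex": "M", "zip": "47906"},
    {"name": "Carl",  "age": 65, "sex": "M", "zip": "47907"},
    {"name": "Daisy", "age": 52, "sex": "F", "zip": "47902"},
]

def link(medical, voters):
    """Return (name, disease) pairs whose quasi-identifiers match uniquely."""
    qi = lambda r: (r["age"], r["sex"], r["zip"])
    index = {}
    for v in voters:
        index.setdefault(qi(v), []).append(v["name"])
    matches = []
    for m in medical:
        names = index.get(qi(m), [])
        if len(names) == 1:            # unique match => re-identification
            matches.append((names[0], m["disease"]))
    return matches

print(link(medical, voters))   # [('Bob', 'Cancer'), ('Carl', 'Cancer'), ('Daisy', 'Flu')]
```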

10 AOL Data Release [NYTimes 2006]
In August 2006, AOL released the search keywords of 650,000 users over a 3-month period.
–User IDs were replaced by random numbers.
–Three days later, AOL pulled the data from public access.
Sample queries of AOL searcher #4417749: "landscapers in Lilburn, GA", queries on the last name "Arnold", "homes sold in shadow lake subdivision Gwinnett County, GA", "num fingers", "60 single men", "dog that urinates on everything".
The New York Times linked searcher #4417749 to Thelma Arnold, a 62-year-old widow who lives in Lilburn, GA, has three dogs, and frequently searches for her friends' medical ailments.
Re-identification occurs!

11 Netflix Movie Rating Data [Narayanan and Shmatikov 2009]
Netflix released anonymized movie rating data for its Netflix challenge:
–With the date and value of each movie rating.
Knowing 6-8 approximate movie ratings and dates, one can uniquely identify a record with over 90% probability.
–Correlating the data with a set of 50 users from imdb.com yielded two records.
Netflix canceled the second phase of the challenge.
Re-identification occurs!
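A much-simplified sketch of the idea follows; this is not the actual Narayanan-Shmatikov scoring algorithm, and all records and tolerances below are made up. Each anonymized record is scored by how many auxiliary (movie, rating, date) observations it approximately matches, and the best-scoring record is returned.

```python
from datetime import date

# Simplified sketch of de-anonymization by approximate matching (not the
# actual Narayanan-Shmatikov algorithm): score each anonymized record by how
# many auxiliary (movie, rating, date) observations it approximately matches.
def matches(obs, entry, rating_tol=1, date_tol_days=14):
    movie, rating, day = obs
    if movie not in entry:
        return False
    r, d = entry[movie]
    return abs(r - rating) <= rating_tol and abs((d - day).days) <= date_tol_days

def best_candidate(aux, records):
    scores = {rid: sum(matches(o, entry) for o in aux) for rid, entry in records.items()}
    return max(scores, key=scores.get), scores

# records: anonymized_id -> {movie: (rating, date)}; all values are made up.
records = {
    "u1": {"Heat": (4, date(2005, 3, 1)), "Alien": (5, date(2005, 4, 2))},
    "u2": {"Heat": (1, date(2005, 9, 9)), "Brazil": (3, date(2005, 1, 5))},
}
aux = [("Heat", 4, date(2005, 3, 3)), ("Alien", 5, date(2005, 4, 1))]
print(best_candidate(aux, records))  # ('u1', {'u1': 2, 'u2': 0})
```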

12 Genome-Wide Association Study (GWAS) [Homer et al. 2008]
A typical study examines thousands of single-nucleotide polymorphism (SNP) locations in a given population of patients for statistical links to a disease.
From the aggregate statistics, one individual's genome, and knowledge of SNP frequencies in the background population, one can infer that individual's participation in the study.
–The frequency of every SNP gives a very noisy signal of participation; combining thousands of such signals gives a high-confidence prediction.
  SNP      Study group avg  Population avg  Target individual
  SNP 1=A  43%              42%             yes
  SNP 2=A  11%                              no
  SNP 3=A  58%              59%             no
  SNP 4=A  23%              22%             yes
  …
Membership disclosure occurs!
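The aggregation of weak per-SNP signals can be illustrated with a simplified, Homer-style test statistic. This is a sketch under my own simplifications, not the exact statistic from the paper, and the frequencies are illustrative.

```python
# Simplified sketch of a Homer-style membership test: each SNP contributes a
# weak signal (is the target's allele closer to the study-group frequency
# than to the population frequency?); thousands of SNPs are summed into a
# high-confidence decision. Frequencies below are illustrative.
def membership_score(target, study_freq, pop_freq):
    """target[j] in {0, 1}: whether the target carries allele A at SNP j."""
    score = 0.0
    for y, s, p in zip(target, study_freq, pop_freq):
        score += abs(y - p) - abs(y - s)   # positive => closer to study group
    return score

study_freq = [0.43, 0.11, 0.58, 0.23]
pop_freq   = [0.42, 0.11, 0.59, 0.22]
target     = [1, 0, 0, 1]                 # alleles the target actually carries
print(membership_score(target, study_freq, pop_freq))  # > 0 hints at membership
```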

13 Need for Data Privacy Research
Identification disclosure (GIC, AOL, Netflix):
–Leaks which individual is the subject of a record.
Attribute disclosure:
–Leaks more precise information about the attribute values of some individual.
Membership disclosure (GWAS):
–Leaks whether an individual participated in the dataset.
Research program: develop theory and techniques to anonymize data so that they can be beneficially used without privacy violations.
–How to define privacy for anonymized data?
–How to publish data that satisfies privacy while providing utility?

14 k-Anonymity [Sweeney, Samarati]
Privacy is "protection from being brought to the attention of others."
k-Anonymity: each record is indistinguishable from at least k-1 other records when only the "quasi-identifiers" are considered; these k records form an equivalence class.
To achieve k-anonymity, use:
–Generalization: replace values with less-specific ones, e.g., Gender {Male, Female} → *; Zipcode {47677, 47678, 47602} → 476**; Age {29, 27, 22} → 2*.
–Suppression: remove outliers.

15 Example of k-Anonymity and Generalization
The microdata (QID = Zipcode, Age, Gen; SA = Disease):
  Zipcode  Age  Gen  Disease
  47677    29   F    Ovarian Cancer
  47602    22   F    Ovarian Cancer
  47678    27   M    Prostate Cancer
  47905    43   M    Flu
  47909    52   F    Heart Disease
  47906    47   M    Heart Disease
The generalized, 3-anonymous table:
  Zipcode  Age      Gen  Disease
  476**    2*       *    Ovarian Cancer
  476**    2*       *    Ovarian Cancer
  476**    2*       *    Prostate Cancer
  4790*    [43,52]  *    Flu
  4790*    [43,52]  *    Heart Disease
  4790*    [43,52]  *    Heart Disease
The adversary knows Alice's QI values (47677, 29, F) but cannot tell which of the first three records is Alice's.
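A minimal sketch of checking k-anonymity: group records by their quasi-identifier values and verify that every equivalence class has at least k rows. The table literals mirror the generalized table above; the function and attribute names are mine.

```python
from collections import defaultdict

# Minimal k-anonymity check: group rows by quasi-identifier (QI) values and
# verify every equivalence class has at least k records.
def is_k_anonymous(rows, qi_attrs, k):
    groups = defaultdict(int)
    for row in rows:
        groups[tuple(row[a] for a in qi_attrs)] += 1
    return all(count >= k for count in groups.values())

generalized = [
    {"zip": "476**", "age": "2*",      "gen": "*", "disease": "Ovarian Cancer"},
    {"zip": "476**", "age": "2*",      "gen": "*", "disease": "Ovarian Cancer"},
    {"zip": "476**", "age": "2*",      "gen": "*", "disease": "Prostate Cancer"},
    {"zip": "4790*", "age": "[43,52]", "gen": "*", "disease": "Flu"},
    {"zip": "4790*", "age": "[43,52]", "gen": "*", "disease": "Heart Disease"},
    {"zip": "4790*", "age": "[43,52]", "gen": "*", "disease": "Heart Disease"},
]
print(is_k_anonymous(generalized, ["zip", "age", "gen"], k=3))  # True
```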

16 Attacks on k-Anonymity
A 3-anonymous patient table (Zipcode, Age, Disease), by equivalence class:
–(476**, 2*): Heart Disease only
–(4790*, ≥40): Flu, Heart Disease, Cancer
–(476**, 3*): Heart Disease, Cancer
k-anonymity does not provide privacy if:
–Sensitive values lack diversity (homogeneity attack): Bob (Zipcode 47678, Age 27) falls in the (476**, 2*) class, so he must have heart disease.
–The attacker has background knowledge (background knowledge attack): Carl (Zipcode 47673, Age 36) falls in the (476**, 3*) class; knowing that Carl does not have heart disease narrows his disease down to cancer.

17 l-Diversity [Machanavajjhala et al. 2006]
Principle:
–Each equi-class contains at least l well-represented sensitive values.
Instantiations:
–Distinct l-diversity: each equi-class contains at least l distinct sensitive values.
–Entropy l-diversity: entropy(equi-class) ≥ log_2(l).
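A minimal sketch of the two instantiations for a single equivalence class; the function names are mine, not from the paper, and entropy is measured in bits to match the log_2 threshold above.

```python
import math
from collections import Counter

# Distinct and entropy l-diversity for one equivalence class
# (only the sensitive values of the class are needed).
def distinct_l(sensitive_values):
    return len(set(sensitive_values))

def entropy_bits(sensitive_values):
    counts = Counter(sensitive_values)
    n = len(sensitive_values)
    return -sum((c / n) * math.log2(c / n) for c in counts.values())

def is_entropy_l_diverse(sensitive_values, l):
    return entropy_bits(sensitive_values) >= math.log2(l)

equi_class = ["Flu", "Flu", "Heart Disease", "Cancer"]
print(distinct_l(equi_class))               # 3
print(is_entropy_l_diverse(equi_class, 2))  # True: entropy = 1.5 bits >= 1
```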

18 The Skewness Attack on l-Diversity
l-diversity does not consider the overall distribution of sensitive values.
–Suppose the sensitive attribute has two values: HIV positive (1%) and HIV negative (99%).
–An equi-class with the highest possible diversity can still pose a serious privacy risk: consider an equi-class containing an equal number of positive and negative records.
–l-diversity also does not differentiate between:
  Equi-class 1: 49 positive + 1 negative
  Equi-class 2: 1 positive + 49 negative
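A quick numeric check of the skewness problem, using an entropy helper like the one in the previous sketch: the two equi-classes below get exactly the same entropy score, even though a record in the first one is 98% likely to be HIV positive.

```python
import math
from collections import Counter

# Entropy l-diversity cannot tell these two equi-classes apart, although a
# record in the first one is 98% likely to be HIV positive.
def entropy_bits(values):
    counts, n = Counter(values), len(values)
    return -sum((c / n) * math.log2(c / n) for c in counts.values())

class_1 = ["positive"] * 49 + ["negative"]        # 49 positive + 1 negative
class_2 = ["positive"] + ["negative"] * 49        # 1 positive + 49 negative
print(entropy_bits(class_1), entropy_bits(class_2))  # identical (~0.141 bits each)
```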

19 The Similarity Attack on l-Diversity
A 3-diverse patient table:
  Zipcode  Age  Salary  Disease
  476**    2*   20K     Gastric Ulcer
  476**    2*   30K     Gastritis
  476**    2*   40K     Stomach Cancer
  4790*    ≥40  50K     Gastritis
  4790*    ≥40  100K    Flu
  4790*    ≥40  70K     Bronchitis
  476**    3*   60K     Bronchitis
  476**    3*   80K     Pneumonia
  476**    3*   90K     Stomach Cancer
Bob's QI values are (Zip 47678, Age 27), so his record is in the first equi-class. Conclusions:
1. Bob's salary is in [20K, 40K], which is relatively low.
2. Bob has some stomach-related disease.
l-diversity does not consider the semantic meanings of sensitive values.

20 t-Closeness
Principle: the distribution of the sensitive attribute values in each equi-class should be close to that of the overall dataset (distance ≤ t).
–This assumes that publishing a completely generalized table is always acceptable.
–We use the Earth Mover's Distance to capture semantic relationships among sensitive attribute values.
(n,t)-closeness: the distribution of the sensitive attribute values in each equi-class should be close to that of some natural super-group consisting of at least n tuples.
N. Li, T. Li, S. Venkatasubramanian: t-Closeness: Privacy Beyond k-Anonymity and l-Diversity. ICDE 2007; journal version in TKDE 2010.
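For a numeric sensitive attribute, the ordered form of the Earth Mover's Distance reduces to a normalized sum of cumulative differences. A minimal sketch, assuming both distributions are given over the same sorted domain; the salary numbers are illustrative.

```python
# Earth Mover's Distance for an ordered sensitive attribute (the "ordered
# distance" form used for numeric attributes): p and q are distributions over
# the same sorted domain of m values, each summing to 1.
def ordered_emd(p, q):
    m = len(p)
    cumulative, total = 0.0, 0.0
    for pi, qi in zip(p, q):
        cumulative += pi - qi
        total += abs(cumulative)
    return total / (m - 1)

# Overall salary distribution vs. the distribution inside one equi-class
# (domain: 20K, 30K, ..., 100K; numbers are illustrative).
overall    = [1/9] * 9
equi_class = [1/3, 1/3, 1/3, 0, 0, 0, 0, 0, 0]   # salaries concentrated at the low end
print(ordered_emd(equi_class, overall))          # 0.375: violates t-closeness for any t < 0.375
```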

21 From Syntactical Privacy Notions to Differential Privacy
Limitations of the previous privacy notions:
–They require identifying which attributes are quasi-identifiers or sensitive, which is not always possible.
–They are difficult to pin down because of background knowledge.
–They are syntactic in nature (properties of the anonymized dataset) and not exhaustive in the inferences they prevent.
Differential Privacy [Dwork et al. 2006]:
–Privacy is not violated if one's information is not included.
–The output does not overly depend on any single tuple.

22 Differential Privacy [Dwork et al. 2006]
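The body of this slide did not survive the transcript; for reference, the standard definition from Dwork et al. 2006 is the following.

```latex
% Standard \varepsilon-differential privacy (Dwork et al. 2006); this is the
% textbook statement, supplied because the slide body did not survive.
A randomized mechanism $\mathcal{A}$ satisfies $\varepsilon$-differential privacy
if for every pair of neighboring datasets $D, D'$ (differing in one individual's
record) and for every set $S$ of possible outputs,
\[
  \Pr[\mathcal{A}(D) \in S] \;\le\; e^{\varepsilon} \cdot \Pr[\mathcal{A}(D') \in S].
\]
```

A common instantiation adds Laplace noise to a numeric query, with noise scale proportional to the query's sensitivity divided by ε.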

23 The Desire to Have a Semantic Interpretation of DP
Why do we need a semantic interpretation?
–"Definition [of DP] equates privacy with the inability to distinguish two close databases. Indistinguishability is a convenient notion to work with; however, it does not directly say what an adversary may do and learn." [Dwork et al. 2006: "Calibrating Noise …"]
What makes a semantic interpretation?
–A Bayesian approach: an adversary has a prior belief; after interacting with A(D), the adversary has a posterior belief. We want to place some limitation on the posterior belief.

24 Impossibility of Bounding Arbitrary Prior-to-Posterior Belief Change
Dalenius [1977] proposed this as a privacy notion: "Access to a statistical database should not enable one to learn anything about an individual that could not be learned without access."
–Similar to the notion of semantic security for encryption.
–Not achievable if one wants utility.
The Terry's-height example:
–The adversary knows "Terry is two inches shorter than the average Lithuanian woman."
–The published data reveal the average height of Lithuanian women.
–Seeing the published information enables the adversary to learn Terry's height.

25 Different Manifestations of the Impossibility Result
Dwork & Naor: "absolute disclosure prevention (while preserving utility at the same time) is impossible because of the arbitrary auxiliary information the adversary may have."
Kifer and Machanavajjhala: "achieving both utility and privacy is impossible without making assumptions about the data."
Li et al. (membership privacy framework): "without restricting the adversary's prior belief about the dataset distribution, achieving privacy requires publishing essentially the same information for two arbitrary datasets."

26 What to Do Now for a Semantic Interpretation of DP?
Approach 1: provide a posterior-to-posterior bound.
–Identify "ideal worlds" in which individuals' privacy is preserved: the i'th ideal world removes the i'th individual's data.
–Bound the difference between the "ideal worlds" and the "real world."
Approach 2: understand under which conditions a prior-to-posterior bound can be ensured.

27 An Alternative Formulation

28 DP's Similar-Decision-Regardless-of-Prior Guarantee
Regardless of external knowledge, an adversary with access to the sanitized database makes similar decisions whether or not one individual's data is included in the original database.

29 The Personal Data Principle
Data privacy means giving an individual control over his or her personal data. An individual's privacy is not violated if no personal data about the individual is used.
Privacy does not mean that no information about the individual is learned, or that no harm is done to the individual; enforcing the latter is infeasible and unreasonable.

30 An Attempt at Providing a Prior-to-Posterior Bound in [Dwork et al. 2006]
A mechanism is said to be (k, ε)-simulatable if for every informed adversary who already knows all except k entries in the dataset D, every output, and every predicate f, the change in the adversary's belief about f is multiplicatively bounded by e^ε.
Thm: ε-DP is equivalent to being (1, ε)-simulatable.
Does this mean ε-DP provides a prior-to-posterior bound for an arbitrary adversary?
–Wouldn't that conflict with the impossibility results?

31 Counter-Arguments
From [Kifer and Machanavajjhala, 2011]: "Additional popularized claims have been made about the privacy guarantees of differential privacy. These include:
–It makes no assumptions about how data are generated.
–It protects an individual's information (even) if an attacker knows about all other individuals in the data.
–It is robust to arbitrary background knowledge."

32 An Example Adapted from [Kifer and Machanavajjhala, 2011]
Bob or one of his 9 immediate family members may have contracted a highly contagious disease, in which case the entire family would have been infected.
An adversary asks the query "how many people at Bob's family address have this disease?"
What can be learned from an answer produced while satisfying ε-DP?
–Answer: the adversary's belief about Bob's disease status may change by a factor close to e^10.
Anything wrong here?
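To see where a factor close to e^10 can come from, here is a short worked bound. The ε = 1 value is my assumption for illustration, since the slide does not fix it.

```latex
% Group-privacy bound; \varepsilon = 1 is assumed here for illustration.
If $D$ and $D'$ differ in $k$ records, chaining the $\varepsilon$-DP guarantee
through $k$ intermediate neighboring datasets gives
\[
  \Pr[\mathcal{A}(D) \in S] \;\le\; e^{k\varepsilon} \cdot \Pr[\mathcal{A}(D') \in S],
\]
so when the disease status of all $k = 10$ household members is perfectly
correlated with Bob's and $\varepsilon = 1$, the adversary's likelihood ratio for
``Bob is infected'' versus ``Bob is not infected'' can be as large as
$e^{10} \approx 22{,}026$.
```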

33 First, the Technical Aspects
1. An adversary's belief about Bob's disease status may change by a factor of e^10 due to data correlation. This is an example showing that DP cannot bound the prior-to-posterior belief change against arbitrary external knowledge.
2. DP's guarantee of a posterior-to-posterior bound remains valid.
3. The analysis in [Dwork et al. 2006] is potentially misleading, because it could lead one to think that DP offers more protection than it actually does.
–The notion of an informed adversary, while appearing strong, is in fact very limiting.

34 Still, Does ε-DP Provide the Intended Level of Privacy Protection?
It is complicated… Consider the following variants.
–Case (a). Bob lives in a dorm building with 9 other unrelated individuals. Either they all have the disease or none does. One can query how many individuals at this address have the disease.
–Case (b). The original example: Bob and 9 family members.
–Case (c). Bob and 9 minors for whom Bob is the legal guardian.
–Case (d). DP is defined over records, where each record corresponds to a single visit; Bob may have 10 visits.

35 Our Answer
–Case (a). Bob and 9 other unrelated individuals: DP does what it is supposed to do based on the Personal Data Principle.
–Case (b). The original example, Bob and 9 family members: difficult to say; it is on the borderline and there is not enough information.
–Case (c). Bob and 9 minors: using DP this way is inappropriate, because Bob controls the 9 other records as well.
–Case (d). DP defined over records, one record per visit, with Bob having 10 visits: using DP this way is inappropriate.
Takeaways:
1. ε-DP does not prevent inference of personal info when there is correlation.
2. Whether this is acceptable or not depends on whether the definition of neighboring datasets simulates opting out correctly.

36 When Is ε-DP Good Enough?
Applying ε-DP in a particular setting provides a sufficient privacy guarantee when the following three conditions hold:
–(1) The Personal Data Principle can be applied.
–(2) All data one individual controls are included in the difference between two neighboring datasets.
With (1) and (2), even if some information about an individual is learned because of correlation, one can defend DP.
–(3) An appropriate ε value is used.

37 How to Choose ε
From the inventors of DP: "The choice of ε is essentially a social question. We tend to think of ε as, say, 0.01, 0.1, or in some cases, ln 2 or ln 3."
Our position:
–ε between 0.1 and 1 is often acceptable.
–ε close to 5 might be applicable in rare cases, but needs careful analysis.
–ε above 10 means very little.
Why?

38 Consult This Table of Changes in Belief: p is the prior; the numbers in the table are the posteriors.
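The table itself did not carry over into this transcript; the sketch below computes the kind of numbers it presumably contained, using the standard bound that an ε-DP output can shift the adversary's odds by at most a factor of e^ε. The particular prior values chosen here are mine.

```python
import math

# Given a prior belief p that some fact about an individual holds, an eps-DP
# output can multiply the odds by at most e^eps, so the posterior is bounded
# by e^eps * p / (e^eps * p + 1 - p).
def max_posterior(p, eps):
    return (math.exp(eps) * p) / (math.exp(eps) * p + (1 - p))

priors   = [0.01, 0.1, 0.5]
epsilons = [0.1, 1, 5, 10]
for p in priors:
    row = ", ".join(f"eps={e}: {max_posterior(p, e):.3f}" for e in epsilons)
    print(f"prior {p}: {row}")
# e.g. prior 0.5: eps=1 -> 0.731, eps=5 -> 0.993, eps=10 -> ~1.000
```

These numbers back the position on the previous slide: ε around 0.1 or 1 moves a belief only modestly, ε = 5 can push a 50% prior to 0.993, and ε = 10 lets the posterior go essentially to 1.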

39 On Defining Neighbors Incorrectly
Edge-DP for graph data is inappropriate:
–Typically one individual controls a node and its relationships.
–"Attacks" on graph anonymization are typically in the form of node identification.
–Even if the goal is only to protect edge info, edge-DP still fails, because of correlation between edges.
Packet-level privacy for networking data is inappropriate.
Cell-level privacy for matrix data is usually inappropriate.
Google's RAPPOR system is not good enough:
–Data collection treats the answer to each question separately, and the same ε is applied to each question.

40 Applying a Model Learned with DP Arbitrarily
There are two steps in Big Data:
–Learning a model from the data of individuals in A.
–Applying the model to individuals in B: using some (typically less sensitive) personal info of each individual, one can learn (typically more sensitive) personal info.
The sets A and B may overlap.
The notion of DP deals with only the first step. Even if a model is learned while satisfying DP, applying it may still raise privacy concerns, because doing so uses each individual's personal info.

41 The Target Pregnancy Prediction Example
Target assigns every customer a Guest ID number and stores a history of everything they've bought, along with any demographic information Target has collected from them or bought from other sources.
Looking at historical buying data for all the women who had signed up for Target baby registries in the past, Target's algorithm was able to identify about 25 products that, when analyzed together, allowed Target to assign each shopper a "pregnancy prediction" score.
Target could also estimate a shopper's due date to within a small window, so it could send coupons timed to very specific stages of her pregnancy.

42 Potential Challenge to the Personal Data Principle
Can a group of individuals, none of whom has specifically authorized usage of their personal information, together sue on privacy grounds because aggregate information about them is leaked?
–If so, satisfying DP is not sufficient.
–Would the size of the group matter? Probably.

43 Coming Attractions: Role-Based Access Control

