
1 Quantification of Integrity
Michael Clarkson and Fred B. Schneider
Cornell University
RADICAL, May 10, 2010

2 Goal
Information-theoretic quantification of programs' impact on the integrity of information [Denning 1982]
(and its relationship to database privacy)

3 What is Integrity?
Databases: constraints that relations must satisfy; provenance of data; utility of anonymized data
Common Criteria: protection of assets from unauthorized modification
Biba (1977): guarantee that a subsystem will perform as it was intended; isolation necessary for protection from subversion; dual to confidentiality
...no universal definition

4 Our Notions of Integrity

    Starting Point         Corruption Measure
    Taint analysis         Contamination
    Program correctness    Suppression

Corruption: damage to integrity
Contamination: bad information present in output
Suppression: good information lost from output
...distinct, but they interact

5 Contamination
Goal: model taint analysis
Untrusted input contaminates trusted output
[Diagram: a program with trusted input/output channels (User) and untrusted channels (Attacker)]

6 Contamination
u contaminates o (can't u be filtered from o?)

    o := (t, u)

7 Quantification of Contamination
Use information theory: information is surprise
X, Y, Z: distributions
I(X, Y): mutual information between X and Y (in bits)
I(X, Y | Z): conditional mutual information
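
As a concrete illustration (not in the original slides), both quantities can be computed directly from a finite joint probability table via standard entropy identities; all function names below are ours, a minimal sketch rather than the authors' tooling:

    from collections import Counter
    from math import log2

    def entropy(dist):
        # Shannon entropy in bits of a {value: probability} dict
        return -sum(p * log2(p) for p in dist.values() if p > 0)

    def marginal(joint, idx):
        # marginal over the tuple positions in idx of a {tuple: probability} joint
        m = Counter()
        for outcome, p in joint.items():
            m[tuple(outcome[i] for i in idx)] += p
        return m

    def mutual_info(joint):
        # I(X,Y) = H(X) + H(Y) - H(X,Y), joint over (x, y) pairs
        return entropy(marginal(joint, [0])) + entropy(marginal(joint, [1])) - entropy(joint)

    def cond_mutual_info(joint):
        # I(X,Y|Z) = H(X,Z) + H(Y,Z) - H(X,Y,Z) - H(Z), joint over (x, y, z) triples
        return (entropy(marginal(joint, [0, 2])) + entropy(marginal(joint, [1, 2]))
                - entropy(joint) - entropy(marginal(joint, [2])))

    # a fair coin copied to the output shares exactly 1 bit with it
    print(mutual_info({(0, 0): 0.5, (1, 1): 0.5}))  # 1.0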

8 Quantification of Contamination

    Contamination = I(U_in, T_out | T_in)

[Diagram: trusted input T_in (User) and untrusted input U_in (Attacker) enter the program; trusted output T_out goes to the User]
[Newsome et al. 2009]; dual of [Clark et al. 2005, 2007]

9 Example of Contamination

    o := (t, u)

    Contamination = I(U, O | T) = k bits, if U is uniform on [0, 2^k - 1]
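
The claimed k bits can be checked by brute-force enumeration; the setup below (k = 3 and a small uniform trusted input, both our own choices) is a sketch, not part of the talk:

    from collections import Counter
    from math import log2

    def H(dist):
        return -sum(p * log2(p) for p in dist.values() if p > 0)

    k = 3
    joint = Counter()             # joint distribution over (u, o, t) triples
    for t in range(4):            # any independent trusted input works
        for u in range(2 ** k):  # U uniform on [0, 2^k - 1]
            o = (t, u)            # the program: o := (t, u)
            joint[(u, o, t)] += 1 / (4 * 2 ** k)

    def marg(idx):
        m = Counter()
        for outcome, p in joint.items():
            m[tuple(outcome[i] for i in idx)] += p
        return m

    # I(U,O|T) = H(U,T) + H(O,T) - H(U,O,T) - H(T)
    print(H(marg([0, 2])) + H(marg([1, 2])) - H(joint) - H(marg([2])))  # 3.0

Given T, the output o = (t, u) reveals u exactly, so the conditional mutual information is all of H(U) = k bits.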

10 Our Notions of Integrity

    Starting Point         Corruption Measure
    Taint analysis         Contamination
    Program correctness    Suppression

Corruption: damage to integrity
Contamination: bad information present in output
Suppression: good information lost from output

11 Program Suppression
Goal: model program (in)correctness
Information about the correct output is suppressed from the real output
[Diagram: a Specification producing the correct output and an Implementation producing the real output, between trusted Sender and Receiver, with untrusted Attacker channels on the Implementation]

12 Example of Program Suppression
a[0..m-1]: trusted

    Spec:    for (i=0; i<m;  i++) { s := s + a[i]; }
    Impl. 1: for (i=1; i<m;  i++) { s := s + a[i]; }
    Impl. 2: for (i=0; i<=m; i++) { s := s + a[i]; }

Impl. 1: suppression (a[0] missing); no contamination
Impl. 2: suppression (a[m] added); contamination

13 Suppression vs. Contamination
[Diagram: the echo program "output := input" with attacker-controlled values (*), contrasting where contamination arises against where suppression arises]

14 Quantification of Program Suppression

    Program transmission = I(Spec, Impl)

[Diagram: Specification and Implementation run on the same trusted input T_in from the Sender; the Implementation also takes untrusted input U_in from the Attacker; both outputs reach the Receiver]

15 Quantification of Program Suppression
H(X): entropy (uncertainty) of X
H(X | Y): conditional entropy of X given Y

    Program transmission = I(Spec, Impl) = H(Spec) - H(Spec | Impl)

H(Spec): total information to learn about Spec
I(Spec, Impl): information actually learned about Spec by observing Impl
H(Spec | Impl): information NOT learned about Spec by observing Impl

16 Quantification of Program Suppression
H(X): entropy (uncertainty) of X
H(X | Y): conditional entropy of X given Y

    Program transmission = I(Spec, Impl) = H(Spec) - H(Spec | Impl)
    Program suppression  = H(Spec | Impl)
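
As an illustration not taken from the slides, both measures follow mechanically from a joint distribution over (spec output, impl output) pairs; the helper names here are ours:

    from collections import Counter
    from math import log2

    def H(dist):
        return -sum(p * log2(p) for p in dist.values() if p > 0)

    def suppression(joint):
        # H(Spec|Impl) = H(Spec,Impl) - H(Impl), joint over (spec, impl) pairs
        impl = Counter()
        for (s, i), p in joint.items():
            impl[i] += p
        return H(joint) - H(impl)

    def transmission(joint):
        # I(Spec,Impl) = H(Spec) - H(Spec|Impl)
        spec = Counter()
        for (s, i), p in joint.items():
            spec[s] += p
        return H(spec) - suppression(joint)

    # an implementation that copies the spec transmits everything, suppresses nothing
    exact = {(0, 0): 0.5, (1, 1): 0.5}
    print(transmission(exact), suppression(exact))  # 1.0 0.0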

17 Example of Program Suppression
A = distribution of individual array elements

    Spec:    for (i=0; i<m;  i++) { s := s + a[i]; }
    Impl. 1: for (i=1; i<m;  i++) { s := s + a[i]; }   Suppression = H(A)
    Impl. 2: for (i=0; i<=m; i++) { s := s + a[i]; }   Suppression ≤ H(A)
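
These two claims can be reproduced by exhaustive enumeration; the concrete setup below (m = 2, 1-bit array elements, a[m] modeled as uniform garbage) is our own choice of parameters, a sketch rather than the authors' calculation:

    from collections import Counter
    from math import log2
    from itertools import product

    def H(d):
        return -sum(p * log2(p) for p in d.values() if p > 0)

    def supp(joint):  # H(Spec|Impl) = H(Spec,Impl) - H(Impl)
        impl = Counter()
        for (s, i), p in joint.items():
            impl[i] += p
        return H(joint) - H(impl)

    m, vals = 2, [0, 1]
    j1, j2 = Counter(), Counter()
    for a in product(vals, repeat=m):
        for g in vals:  # g models the garbage value read as a[m] in Impl. 2
            p = 1.0 / (len(vals) ** (m + 1))
            spec = sum(a)        # Spec:    sums a[0..m-1]
            impl1 = sum(a[1:])   # Impl. 1: skips a[0]
            impl2 = sum(a) + g   # Impl. 2: also adds the garbage a[m]
            j1[(spec, impl1)] += p
            j2[(spec, impl2)] += p

    print(supp(j1))  # 1.0 bit  = H(A): a[0] is entirely lost
    print(supp(j2))  # ~0.69 bits ≤ H(A): garbage only partially obscures the sum

Impl. 1 loses exactly one element's worth of information (a[0] is independent of its output), while Impl. 2's added garbage can be partially cancelled by observing the output, hence the inequality.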

18 Suppression and Confidentiality
Declassifier: program that reveals (leaks) some information and suppresses the rest
Leakage: [Denning 1982; Millen 1987; Gray 1991; Lowe 2002; Clark et al. 2005, 2007; Clarkson et al. 2005; McCamant & Ernst 2008; Backes et al. 2009]
Thm. Leakage + Suppression is a constant
...what isn't leaked is suppressed
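
A note not on the slide: the theorem presumably rests on the standard identity for a secret input S and declassifier output O,

    Leakage + Suppression = I(S, O) + H(S | O) = H(S),

and H(S) is fixed by the input distribution alone, independent of the program.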

19 Database Privacy
A statistical database anonymizes query results:
...sacrifices utility for privacy's sake
...suppresses to avoid leakage
...sacrifices integrity for confidentiality's sake
[Diagram: User sends a query to the Database; an Anonymizer turns the response into an anonymized response]

20 k-anonymity
DB: every individual must be anonymous within a set of size k [Sweeney 2002]
Programs: every output corresponds to at least k inputs
But what about background knowledge?
...no bound on leakage
...no bound on suppression
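
The "programs" reading above has a direct operational check; this brute-force sketch and its names are ours, modeling a program as a Python function over a finite input domain:

    from collections import Counter

    def is_k_anonymous(program, inputs, k):
        # every observable output must be produced by at least k inputs
        preimage_sizes = Counter(program(x) for x in inputs)
        return all(n >= k for n in preimage_sizes.values())

    # bucketing an age into decades gives each output 10 preimages in 0..99
    print(is_k_anonymous(lambda age: age // 10, range(100), 10))  # True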

21 L-diversity
DB: every individual's sensitive information should appear to have L (roughly) equally likely values [Machanavajjhala et al. 2007]
Entropy L-diversity: H(anon. block) ≥ log L [Øhrn and Ohno-Machado 1999; Machanavajjhala et al. 2007]
Programs: H(T_in | t_out) ≥ log L (if T_in is uniform)
...implies suppression ≥ log L
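
Under the uniform-input assumption the condition reduces to a preimage-size check, since the posterior on inputs given an output is uniform over that output's preimage; this sketch and its names are ours:

    from collections import Counter
    from math import log2

    def is_entropy_L_diverse(program, inputs, L):
        # sizes of each output's preimage under the program
        sizes = Counter(program(t) for t in inputs)
        # with T_in uniform, H(T_in | t_out) = log2(preimage size)
        return all(log2(n) >= log2(L) for n in sizes.values())

    print(is_entropy_L_diverse(lambda age: age // 10, range(100), 8))  # True: log2(10) ≥ 3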

22 Summary
Measures of information corruption:
Contamination (generalizes taint analysis)
Suppression (generalizes program correctness)
Application: database privacy (model anonymizers; relate utility and privacy)

23 More Integrity Measures
Channel suppression: same as the channel model from information theory, but with an attacker
Attacker- and program-controlled suppression
Belief-based measures [Clarkson et al. 2005]: generalize the information-theoretic measures
Granularity:
average over all executions
single executions
sequences of executions

