1
Quantification of Integrity. Michael Clarkson and Fred B. Schneider, Cornell University. MIT Systems Security and Cryptography Discussion Group, October 15, 2010.
2
Goal: Information-theoretic quantification of a program's impact on the integrity of information. [Denning 1982]
3
What is Integrity?
Common Criteria: protection of assets from unauthorized modification.
Biba (1977): guarantee that a subsystem will perform as it was intended.
Isolation necessary for protection from subversion.
Dual to confidentiality.
Databases: constraints that relations must satisfy.
Provenance of data.
Utility of anonymized data.
…no universal definition.
4
Our Notions of Integrity
Starting point          Corruption measure
Taint analysis          Contamination
Program correctness     Suppression
Corruption: damage to integrity.
5
Our Notions of Integrity
Starting point          Corruption measure
Taint analysis          Contamination
Program correctness     Suppression
Corruption: damage to integrity.
Contamination: bad information present in output.
Suppression: good information lost from output.
…distinct, but they interact.
6
Contamination. Goal: model taint analysis. (Diagram: a program with trusted input/output connected to the User and untrusted input/output connected to the Attacker.)
7
Contamination. Goal: model taint analysis; untrusted input contaminates trusted output. (Diagram: program with trusted User channels and untrusted Attacker channels.)
8
Contamination. In the program o := (t, u), the untrusted input u contaminates the output o.
9
Contamination. In the program o := (t, u), u contaminates o. (Can't u be filtered from o?)
10
Quantification of Contamination
Use information theory: information is surprise.
X, Y, Z: distributions.
I(X, Y): mutual information between X and Y (in bits).
I(X, Y | Z): conditional mutual information.
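To make these measures concrete, here is a small Python sketch (mine, not from the talk) that computes I(X, Y) and I(X, Y | Z) directly from an explicit joint distribution, represented as a dictionary mapping outcome tuples to probabilities; the later sketches reuse these helpers.

    from collections import defaultdict
    from math import log2

    def marginal(joint, idxs):
        """Marginal distribution over the coordinates listed in idxs."""
        m = defaultdict(float)
        for outcome, p in joint.items():
            m[tuple(outcome[i] for i in idxs)] += p
        return m

    def mutual_information(joint):
        """I(X,Y) in bits, for a joint distribution over pairs (x, y)."""
        px, py = marginal(joint, [0]), marginal(joint, [1])
        return sum(p * log2(p / (px[(x,)] * py[(y,)]))
                   for (x, y), p in joint.items() if p > 0)

    def conditional_mutual_information(joint):
        """I(X,Y | Z) in bits, for a joint distribution over triples (x, y, z)."""
        pz, pxz, pyz = marginal(joint, [2]), marginal(joint, [0, 2]), marginal(joint, [1, 2])
        return sum(p * log2(pz[(z,)] * p / (pxz[(x, z)] * pyz[(y, z)]))
                   for (x, y, z), p in joint.items() if p > 0)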
11
Quantification of Contamination. (Diagram: program with trusted input/output to the User and untrusted input/output to the Attacker.)
12
Quantification of Contamination. (Diagram as before, with the channels labeled: untrusted input U_in, trusted input T_in, trusted output T_out.)
13
Quantification of Contamination. Contamination = I(U_in, T_out | T_in). [Newsome et al. 2009]; dual of [Clark et al. 2005, 2007]. (Diagram: program with untrusted input U_in, trusted input T_in, and trusted output T_out.)
14
Example of Contamination. For o := (t, u): Contamination = I(U, O | T) = k bits, if U is uniform on [0, 2^k - 1].
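A quick empirical check of this example, reusing the helpers from the sketch above (the small domain chosen for t is an arbitrary assumption of mine):

    k = 3
    t_vals, u_vals = range(4), range(2 ** k)      # U uniform on [0, 2^k - 1]
    joint = {}
    for t in t_vals:
        for u in u_vals:
            o = (t, u)                            # the program o := (t, u)
            joint[(u, o, t)] = 1.0 / (len(t_vals) * len(u_vals))
    print(conditional_mutual_information(joint))  # 3.0, i.e. k bits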
15
Our Notions of Integrity
Starting point          Corruption measure
Taint analysis          Contamination
Program correctness     Suppression
Corruption: damage to integrity.
Contamination: bad information present in output.
Suppression: good information lost from output.
16
Program Suppression. Goal: model program (in)correctness. (Diagram: a Specification maps the Sender's input to the correct output for the Receiver. The specification must be deterministic.)
17
Program Suppression. Goal: model program (in)correctness. (Diagram: the Specification maps the Sender's trusted input to the correct output; the Implementation additionally takes untrusted input from the Attacker and produces the real output.)
18
Program Suppression. Goal: model program (in)correctness. The implementation might suppress information about the correct output from the real output. (Diagram: Specification produces the correct output; Implementation, with untrusted attacker input, produces the real output.)
19
Example of Program Suppression
Spec.:  for (i=0; i<m; i++) { s := s + a[i]; }
a[0..m-1]: trusted
20
Example of Program Suppression
Spec.:    for (i=0; i<m; i++) { s := s + a[i]; }
Impl. 1:  for (i=1; i<m; i++) { s := s + a[i]; }
Impl. 1: suppression (a[0] missing), no contamination.
a[0..m-1]: trusted
21
Example of Program Suppression
Spec.:    for (i=0; i<m; i++) { s := s + a[i]; }
Impl. 1:  for (i=1; i<m; i++) { s := s + a[i]; }
Impl. 2:  for (i=0; i<=m; i++) { s := s + a[i]; }
Impl. 1: suppression (a[0] missing), no contamination.
Impl. 2: suppression and contamination (the untrusted a[m] added).
a[0..m-1]: trusted; a[m]: untrusted
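For reference, the specification and the two faulty implementations can be written as small Python functions (a sketch; I assume the array a holds the m trusted elements followed by one untrusted element a[m]):

    def spec(a, m):
        s = 0
        for i in range(0, m):        # sums exactly the trusted a[0..m-1]
            s = s + a[i]
        return s

    def impl1(a, m):
        s = 0
        for i in range(1, m):        # bug: skips a[0] -> suppression, no contamination
            s = s + a[i]
        return s

    def impl2(a, m):
        s = 0
        for i in range(0, m + 1):    # bug: also adds the untrusted a[m]
            s = s + a[i]             #      -> suppression and contamination
        return s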
22
Suppression vs. Contamination. (Diagram: the echo program output := input, showing Attacker-caused contamination, bad symbols added to the output, and suppression, good symbols lost from the output.)
23
Suppression vs. Contamination
n := rnd(); o := t xor n     t suppressed by noise; no contamination
o := t xor u                 t suppressed by u; o contaminated by u
o := (t, u)                  o contaminated by u; no suppression
o := t                       (the echo program: neither suppression nor contamination)
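These claims can be checked mechanically with the mutual-information helpers sketched earlier (my illustration, assuming t, u, and the random n are independent and uniform over k-bit values; transmission of t is measured as I(T, O) and contamination as I(U, O | T)):

    from collections import defaultdict

    k = 2
    vals = range(2 ** k)
    programs = {
        'o := t':               lambda t, u, n: t,
        'n := rnd(); o := t^n': lambda t, u, n: t ^ n,
        'o := t ^ u':           lambda t, u, n: t ^ u,
        'o := (t, u)':          lambda t, u, n: (t, u),
    }
    p = 1.0 / len(vals) ** 3
    for name, prog in programs.items():
        joint_uot, joint_to = defaultdict(float), defaultdict(float)
        for t in vals:
            for u in vals:
                for n in vals:
                    o = prog(t, u, n)
                    joint_uot[(u, o, t)] += p      # for contamination I(U,O | T)
                    joint_to[(t, o)] += p          # for transmission  I(T,O)
        print(name, mutual_information(joint_to), conditional_mutual_information(joint_uot))

With these measures, o := t transmits all k bits of t with no contamination, the two xor programs transmit nothing about t, and the last two programs each show k bits of contamination.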
24
Quantification of Program Suppression. (Diagram: the Specification, from Sender to Receiver, and the Implementation, with trusted Sender/Receiver channels and untrusted Attacker channels.)
25
Quantification of Program Suppression. (Diagram as before; the Specification's input is labeled In and its output Spec.)
26
Quantification of Program Suppression. (Diagram: the Specification maps input In to output Spec; the Implementation maps trusted input T_in and untrusted input U_in to output Impl.)
27
Quantification of Program Suppression. Program transmission = I(Spec, Impl). (Diagram: the Specification maps In to Spec; the Implementation maps T_in and U_in to Impl.)
28
Quantification of Program Suppression
H(X): entropy (uncertainty) of X.
H(X | Y): conditional entropy of X given Y.
Program transmission = I(Spec, Impl) = H(Spec) - H(Spec | Impl).
29
Quantification of Program Suppression
H(X): entropy (uncertainty) of X.
H(X | Y): conditional entropy of X given Y.
Program transmission = I(Spec, Impl) = H(Spec) - H(Spec | Impl).
H(Spec): the total info to learn about Spec.
30
Quantification of Program Suppression
H(X): entropy (uncertainty) of X.
H(X | Y): conditional entropy of X given Y.
Program transmission = I(Spec, Impl) = H(Spec) - H(Spec | Impl).
H(Spec): the total info to learn about Spec.
I(Spec, Impl): the info actually learned about Spec by observing Impl.
31
Quantification of Program Suppression
H(X): entropy (uncertainty) of X.
H(X | Y): conditional entropy of X given Y.
Program transmission = I(Spec, Impl) = H(Spec) - H(Spec | Impl).
H(Spec): the total info to learn about Spec.
I(Spec, Impl): the info actually learned about Spec by observing Impl.
H(Spec | Impl): the info NOT learned about Spec by observing Impl.
32
Quantification of Program Suppression
H(X): entropy (uncertainty) of X.
H(X | Y): conditional entropy of X given Y.
Program transmission = I(Spec, Impl) = H(Spec) - H(Spec | Impl).
Program suppression = H(Spec | Impl).
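In the same style as the earlier sketch, entropy and conditional entropy, and with them program transmission and program suppression, can be computed from a joint distribution over (Spec, Impl) output pairs (again my illustration of the definitions, not code from the talk):

    from collections import defaultdict
    from math import log2

    def entropy(dist):
        """H(X) in bits, for a distribution mapping outcomes to probabilities."""
        return -sum(p * log2(p) for p in dist.values() if p > 0)

    def conditional_entropy(joint):
        """H(X | Y) in bits, for a joint distribution over pairs (x, y)."""
        py = defaultdict(float)
        for (x, y), p in joint.items():
            py[y] += p
        return -sum(p * log2(p / py[y]) for (x, y), p in joint.items() if p > 0)

    def program_transmission(joint_spec_impl):
        """I(Spec, Impl) = H(Spec) - H(Spec | Impl)."""
        pspec = defaultdict(float)
        for (s, _), p in joint_spec_impl.items():
            pspec[s] += p
        return entropy(pspec) - conditional_entropy(joint_spec_impl)

    def program_suppression(joint_spec_impl):
        """H(Spec | Impl): info about the correct output not conveyed by the real output."""
        return conditional_entropy(joint_spec_impl)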
33
Example of Program Suppression
Spec.:    for (i=0; i<m; i++) { s := s + a[i]; }
Impl. 1:  for (i=1; i<m; i++) { s := s + a[i]; }
Impl. 2:  for (i=0; i<=m; i++) { s := s + a[i]; }
Impl. 1: Suppression = H(A).
Impl. 2: Suppression ≤ H(A).
A = distribution of individual array elements.
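Plugging the array-sum example into the sketch above (assuming, for illustration only, that the m trusted elements and the untrusted a[m] are independent and uniform on a small range):

    from collections import defaultdict
    from itertools import product

    m, R = 2, 4                                   # m trusted elements, each uniform on 0..R-1
    p = 1.0 / R ** (m + 1)
    joint1, joint2 = defaultdict(float), defaultdict(float)
    for a in product(range(R), repeat=m + 1):     # a[0..m-1] trusted, a[m] untrusted
        spec_out = sum(a[0:m])
        joint1[(spec_out, sum(a[1:m]))] += p      # Impl. 1: skips a[0]
        joint2[(spec_out, sum(a[0:m + 1]))] += p  # Impl. 2: also adds the untrusted a[m]
    print(program_suppression(joint1))            # 2.0 bits = H(A): a[0]'s contribution is lost
    print(program_suppression(joint2))            # < H(A): a[m] partly obscures the correct sum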
34
Echo Specification: output := input. (Diagram: the Sender's trusted input T_in is delivered to the Receiver.)
35
Echo Specification: output := input. (Diagram: the echo Specification from Sender to Receiver, alongside an Implementation with trusted input T_in, untrusted attacker input U_in, and trusted output T_out.)
36
Echo Specification: output := input. With this specification, the model simplifies to the information-theoretic model of channels, with an attacker. (Diagram: Implementation with trusted input T_in, untrusted input U_in, and trusted output T_out.)
37
Channel Suppression
Channel transmission = I(T_in, T_out).
Channel suppression = H(T_in | T_out).
(T_out depends on U_in.)
(Diagram: a Channel from Sender to Receiver with trusted input T_in, untrusted attacker input U_in, and trusted output T_out.)
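A toy instance (my example, not the talk's): a channel where a uniform attacker bit U_in decides whether the trusted message is delivered or overwritten by a constant; the helpers from the program-suppression sketch compute the same channel quantities.

    from collections import defaultdict

    k = 2
    vals = range(2 ** k)
    p = 1.0 / (len(vals) * 2)                 # T_in uniform, U_in a fair attacker bit
    joint = defaultdict(float)                # joint over (T_in, T_out)
    for t in vals:
        for u in (0, 1):
            t_out = t if u == 0 else 0        # u = 1: attacker overwrites the message with 0
            joint[(t, t_out)] += p
    print(program_transmission(joint))        # channel transmission I(T_in, T_out)
    print(program_suppression(joint))         # channel suppression  H(T_in | T_out)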
38
Belief-based Metrics
What if the user's/receiver's distribution on unobservable inputs is wrong?
Belief-based information flow [Clarkson et al. 2005].
Belief-based generalizes information-theoretic: on single executions, the same; in expectation, the same …if the user's/receiver's distribution is correct.
39
Suppression and Confidentiality
Declassifier: a program that reveals (leaks) some information and suppresses the rest.
Leakage: [Denning 1982, Millen 1987, Gray 1991, Lowe 2002, Clark et al. 2005, 2007, Clarkson et al. 2005, McCamant & Ernst 2008, Backes et al. 2009].
Thm. Leakage + Suppression is a constant.
What isn't leaked is suppressed.
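One way to see the flavor of the theorem (a sketch under the assumption that leakage is measured as I(S, O) for a secret input S and public output O, and that suppression relative to the echo of S is H(S | O)): by the chain rule the two always sum to H(S), so whatever is not leaked is suppressed. Reusing the helpers above:

    from collections import defaultdict

    # Example declassifier: reveal only the low bit of a uniform 3-bit secret.
    joint = defaultdict(float)                 # joint over (S, O)
    for s in range(8):
        joint[(s, s & 1)] += 1.0 / 8
    leakage = program_transmission(joint)      # I(S,O)   = 1 bit  (the revealed bit)
    suppression = program_suppression(joint)   # H(S | O) = 2 bits (what is withheld)
    print(leakage + suppression)               # 3.0 = H(S), a constant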
40
Database Privacy
A statistical database anonymizes query results: …sacrifices utility for privacy's sake.
(Diagram: the User's query goes to the Database; the response passes through an Anonymizer and comes back to the User as an anonymized response.)
41
Database Privacy
A statistical database anonymizes query results: …sacrifices utility for privacy's sake; …suppresses to avoid leakage.
(Diagram: User query to Database; the response passes through an Anonymizer and comes back as an anonymized response. The reference specification shown is anon. resp. := resp.)
42
Database Privacy
A statistical database anonymizes query results: …sacrifices utility for privacy's sake; …suppresses to avoid leakage; …sacrifices integrity for confidentiality's sake.
(Diagram: User query to Database; the response passes through an Anonymizer and comes back as an anonymized response.)
43
K-anonymity
DB: every individual must be anonymous within a set of size k [Sweeney 2002].
Iflow: every output corresponds to k inputs.
…no bound on leakage or suppression.
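On the database side, the k-anonymity condition itself is easy to check mechanically; a minimal sketch with a hypothetical table and column names (not from the talk):

    from collections import Counter

    def is_k_anonymous(rows, quasi_ids, k):
        """Every combination of quasi-identifier values occurs in at least k rows."""
        counts = Counter(tuple(row[c] for c in quasi_ids) for row in rows)
        return all(n >= k for n in counts.values())

    table = [
        {'zip': '148**', 'age': '20-29', 'disease': 'flu'},
        {'zip': '148**', 'age': '20-29', 'disease': 'none'},
        {'zip': '148**', 'age': '20-29', 'disease': 'flu'},
    ]
    print(is_k_anonymous(table, ['zip', 'age'], k=3))   # True: one block of size 3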
44
L-diversity
DB: every individual's sensitive information should appear to have L (roughly) equally likely values [Machanavajjhala et al. 2007].
DB: entropy L-diversity: H(anon. block) ≥ log L [Øhrn and Ohno-Machado 1999, Machanavajjhala et al. 2007].
Iflow: H(T_in | t_out) ≥ log L.
…implies suppression ≥ log L.
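Continuing the previous sketch, the entropy L-diversity condition checks that, within each anonymized block, the entropy of the sensitive attribute (measured in bits here) is at least log2(L):

    from collections import Counter, defaultdict
    from math import log2

    def entropy_l_diverse(rows, quasi_ids, sensitive, L):
        """Entropy L-diversity: H(sensitive values of each block) >= log2(L)."""
        blocks = defaultdict(list)
        for row in rows:
            blocks[tuple(row[c] for c in quasi_ids)].append(row[sensitive])
        for values in blocks.values():
            counts, total = Counter(values), len(values)
            h = -sum(n / total * log2(n / total) for n in counts.values())
            if h < log2(L):
                return False
        return True

    print(entropy_l_diverse(table, ['zip', 'age'], 'disease', L=2))  # False: H ~ 0.92 < 1 bit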
45
Differential Privacy
DB [Dwork et al. 2006, Dwork 2006]: no individual loses privacy by including their data in the database; the anonymized data set reveals no information about an individual beyond what other individuals already reveal.
Iflow: the output reveals almost no information about an individual input beyond what the other inputs already reveal.
…implies almost all information about the individual input is suppressed.
…quite similar to noninterference.
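For flavor, the standard way to achieve differential privacy for numeric queries is to add Laplace noise calibrated to the query's sensitivity and the privacy parameter epsilon; a minimal sketch of that well-known mechanism (not specific to this talk):

    import random

    def laplace_mechanism(true_answer, sensitivity, epsilon):
        """Return the query answer plus Laplace(sensitivity / epsilon) noise."""
        scale = sensitivity / epsilon
        # A Laplace(0, scale) sample is the difference of two Exponential(mean = scale) samples.
        noise = random.expovariate(1.0 / scale) - random.expovariate(1.0 / scale)
        return true_answer + noise

    # A counting query has sensitivity 1: one individual changes the count by at most 1.
    print(laplace_mechanism(true_answer=42, sensitivity=1.0, epsilon=0.5))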
46
Summary
Measures of information corruption:
Contamination (generalizes taint analysis; dual to leakage).
Suppression (generalizes program correctness; no dual).
Application: database privacy (model anonymizers; relate utility and privacy; security conditions).
47
More Integrity Measures
Attacker- and program-controlled suppression.
Granularity: average over all executions; single executions; sequences of executions …interaction of attacker with program.
Applications: error-correcting codes, malware.