1 Information Flow

2 Overview
Entropy and Uncertainty
Information Flow Models
Confinement Flow Model
Compiler-Based Mechanisms

3 Entropy
Uncertainty of a value, as measured in bits
Example: X is the value of a fair coin toss; X could be heads or tails, so there is 1 bit of uncertainty. Therefore the entropy of X is H(X) = 1
Formal definition: random variable X with values x1, …, xn, so Σi p(X = xi) = 1
H(X) = –Σi p(X = xi) lg p(X = xi)
Computer Security: Art and Science © Matt Bishop
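The definition above is easy to check numerically. A minimal sketch follows; the `entropy` helper is an illustrative name, not something from the slides:

```python
import math

def entropy(probs):
    """H(X) = -sum_i p(X = x_i) lg p(X = x_i), in bits."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

print(entropy([1/2, 1/2]))   # fair coin: 1.0 bit
print(entropy([1/6] * 6))    # fair 6-sided die: lg 6 ≈ 2.585 bits
```

The `if p > 0` guard uses the standard convention 0 lg 0 = 0.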

4 Heads or Tails?
H(X) = – p(X=heads) lg p(X=heads) – p(X=tails) lg p(X=tails)
= – (1/2) lg (1/2) – (1/2) lg (1/2)
= – (1/2) (–1) – (1/2) (–1) = 1
Consistent with the intuitive result

5 n-Sided Fair Die
H(X) = –Σi p(X = xi) lg p(X = xi)
As p(X = xi) = 1/n, this becomes
H(X) = –Σi (1/n) lg (1/n) = –n (1/n) (–lg n)
so H(X) = lg n
which is the number of bits needed to represent n, as expected

6 Ann, Pam, and Paul
Ann and Pam are each twice as likely to win as Paul. W represents the winner. What is its entropy?
w1 = Ann, w2 = Pam, w3 = Paul
p(W=w1) = p(W=w2) = 2/5, p(W=w3) = 1/5
So H(W) = –Σi p(W = wi) lg p(W = wi)
= – (2/5) lg (2/5) – (2/5) lg (2/5) – (1/5) lg (1/5)
= lg 5 – 4/5 ≈ 1.52
If all were equally likely to win, H(W) = lg 3 ≈ 1.58
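Both numbers on this slide can be reproduced directly from the definition; this sketch re-derives them with an illustrative `entropy` helper:

```python
import math

def entropy(probs):
    # H(X) = -sum_i p_i lg p_i, in bits
    return -sum(p * math.log2(p) for p in probs if p > 0)

print(round(entropy([2/5, 2/5, 1/5]), 2))   # biased race: 1.52
print(round(entropy([1/3, 1/3, 1/3]), 2))   # equally likely: 1.58
```

The skewed distribution has strictly less uncertainty than the uniform one, as expected.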

7 Joint Entropy
X takes values from { x1, …, xn }, with Σi p(X=xi) = 1
Y takes values from { y1, …, ym }, with Σj p(Y=yj) = 1
Joint entropy of X, Y is:
H(X, Y) = –Σj Σi p(X=xi, Y=yj) lg p(X=xi, Y=yj)

8 Example
X: roll of a fair die, Y: flip of a fair coin
p(X=1, Y=heads) = p(X=1) p(Y=heads) = 1/12, as X and Y are independent
H(X, Y) = –Σj Σi p(X=xi, Y=yj) lg p(X=xi, Y=yj)
= –2 [ 6 [ (1/12) lg (1/12) ] ] = lg 12
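A quick check of the die-and-coin computation; the `joint` dictionary of joint probabilities is an illustrative construction, built from independence:

```python
import math
from itertools import product

# Independent fair die X (1..6) and fair coin Y: p(x, y) = 1/6 * 1/2
joint = {(x, y): (1/6) * (1/2) for x, y in product(range(1, 7), 'HT')}

# H(X, Y) = -sum over all (x, y) of p lg p
H_XY = -sum(p * math.log2(p) for p in joint.values() if p > 0)
print(H_XY)   # lg 12 ≈ 3.585 bits
```

Since the variables are independent, this equals H(X) + H(Y) = lg 6 + 1.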

9 Conditional Entropy
X takes values from { x1, …, xn }, with Σi p(X=xi) = 1
Y takes values from { y1, …, ym }, with Σj p(Y=yj) = 1
Conditional entropy of X given Y=yj is:
H(X | Y=yj) = –Σi p(X=xi | Y=yj) lg p(X=xi | Y=yj)
Conditional entropy of X given Y is:
H(X | Y) = –Σj p(Y=yj) Σi p(X=xi | Y=yj) lg p(X=xi | Y=yj)

10 Example
X: roll of red die, Y: sum of red and blue rolls
Note p(X=1 | Y=2) = 1, and p(X=i | Y=2) = 0 for i ≠ 1
If the sum of the rolls is 2, both dice were 1
H(X | Y=2) = –Σi p(X=xi | Y=2) lg p(X=xi | Y=2) = 0
Note p(X=i | Y=7) = 1/6
If the sum of the rolls is 7, the red die can be any of 1, …, 6 and the blue die must be 7 minus the red roll
H(X | Y=7) = –Σi p(X=xi | Y=7) lg p(X=xi | Y=7) = –6 (1/6) lg (1/6) = lg 6
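Both conditional entropies can be brute-forced over the 36 equally likely (red, blue) outcomes; `H_X_given` is a hypothetical helper name:

```python
import math
from itertools import product

# All (red value, sum) pairs for two fair dice
outcomes = [(r, r + b) for r, b in product(range(1, 7), repeat=2)]

def H_X_given(y):
    # Entropy of the red die X restricted to outcomes with sum y
    reds = [r for r, s in outcomes if s == y]
    probs = [reds.count(x) / len(reds) for x in set(reds)]
    return sum(-p * math.log2(p) for p in probs if p > 0)

print(H_X_given(2))   # 0.0: a sum of 2 forces both dice to 1
print(H_X_given(7))   # lg 6 ≈ 2.585: the red die is unconstrained
```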

11 Overview
Entropy and Uncertainty
Information Flow Models
Confinement Flow Model
Compiler-Based Mechanisms

12 Bell-LaPadula Model
Information flows from A to B iff B dom A
Example lattice of security classes: TS{R,P}, TS{P}, TS{R}, S{R}, S{P}, S{}

13 Entropy-Based Analysis
Question: Can we learn something about the value of x by observing its effect on y? If so, then information flows from x to y.

14 Entropy-Based Analysis
Consider a command sequence that takes a system from state s to state t
xs is the value of x at state s (likewise for xt, ys, yt)
H(a | b) is the uncertainty of a given b
Def: A command sequence causes a flow of information from x to y if H(xs | yt) < H(xs | ys)
Note: If y does not exist in s, then H(xs | ys) = H(xs)

15 Example Flows
y := x gives H(xs | yt) = 0
tmp := x; y := tmp; also gives H(xs | yt) = 0

16 Another Example
if (x == 1) then y := 0 else y := 1
Suppose x is equally likely to be 0 or 1, so H(xs) = 1
But H(xs | yt) = 0
So H(xs | yt) < H(xs | ys) = H(xs)
Thus, information flows from x to y
Def: An implicit flow of information occurs when information flows from x to y without an explicit assignment of the form y := f(x)
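The implicit flow above can be demonstrated concretely: y is never assigned f(x), yet observing y pins down x exactly, so H(xs | yt) = 0. A small sketch, with an illustrative `branch` function:

```python
# The conditional from the slide: no assignment y := f(x) appears,
# but the choice of branch leaks x through y.
def branch(x):
    if x == 1:
        y = 0
    else:
        y = 1
    return y

# An observer inverts the branch and recovers x from y alone
for x in (0, 1):
    recovered = 1 if branch(x) == 0 else 0
    assert recovered == x
print("x fully recovered from y")
```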

17 Requirements for Information Flow Models
Reflexivity: information should flow freely among members of a class
Transitivity: if b reads something from c and saves it, and if a reads from b, then a can read from c
A lattice has a relation R that is reflexive, transitive, and antisymmetric

18 Information Flow Models
An information flow policy I is a triple I = (SCI, ≤I, joinI), where SCI is a set of security classes, ≤I is an ordering relation on the elements of SCI, and joinI combines two elements of SCI
Example: Bell-LaPadula has security compartments for SCI, dom for ≤I, and lub as joinI

19 Overview
Entropy and Uncertainty
Information Flow Models
Confinement Flow Model
Compiler-Based Mechanisms

20 Confinement Flow Model
Associate with each object x a security class x̄
Def: The confinement flow model is a 4-tuple (I, O, confine, →) in which
I = (SCI, ≤I, joinI) is a lattice-based information flow policy
O is a set of entities
→ ⊆ O × O is a relation with (a, b) ∈ → iff information can flow from a to b
for each a ∈ O, confine(a) is a pair (aL, aU) ∈ SCI × SCI, with aL ≤I aU
if x̄ ≤I aU then information can flow from x to a
if aL ≤I x̄ then information can flow from a to x

21 Example Confinement Model
Let a, b, c ∈ O
confine(a) = [CONFIDENTIAL, CONFIDENTIAL]
confine(b) = [SECRET, SECRET]
confine(c) = [TOPSECRET, TOPSECRET]
Then a → b, a → c, and b → c are the legal flows

22 Another Example
Let a, b, c ∈ O
confine(a) = [CONFIDENTIAL, CONFIDENTIAL]
confine(b) = [SECRET, SECRET]
confine(c) = [CONFIDENTIAL, TOPSECRET]
Then a → b, a → c, b → c, c → a, and c → b are the legal flows
Note that b → c and c → a, but information cannot flow from b to a because bL ≤I aU is false
So transitivity fails to hold
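The flow rules above reduce to the check that a → b is legal iff aL ≤I bU, which reproduces both examples. A sketch, assuming a totally ordered set of classes and illustrative names (`LEVEL`, `may_flow`):

```python
# Classes here are totally ordered; the model allows any lattice
LEVEL = {'CONFIDENTIAL': 0, 'SECRET': 1, 'TOPSECRET': 2}

# confine(x) = (x_L, x_U), as in the second example on the slide
confine = {
    'a': ('CONFIDENTIAL', 'CONFIDENTIAL'),
    'b': ('SECRET', 'SECRET'),
    'c': ('CONFIDENTIAL', 'TOPSECRET'),
}

def may_flow(src, dst):
    # Flow src -> dst is legal iff src_L <= dst_U
    (src_low, _), (_, dst_up) = confine[src], confine[dst]
    return LEVEL[src_low] <= LEVEL[dst_up]

legal = [f"{x} -> {y}" for x in 'abc' for y in 'abc'
         if x != y and may_flow(x, y)]
print(legal)   # b -> c and c -> a are legal, yet b -> a is not
```

Running this lists exactly the five legal flows, with b → a absent: transitivity fails.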

23 Overview
Entropy and Uncertainty
Information Flow Models
Confinement Flow Model
Compiler-Based Mechanisms

24 Compiler-Based Mechanisms
Assignment statements
Compound statements
Conditional statements
Iterative statements

25 Assignment Statements
y := f(x1, ..., xn)
Requirement for the information flow to be secure: lub{x1, ..., xn} ≤ y
Example: x := y + z requires lub{y, z} ≤ x
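The lub check can be sketched by modeling security classes as sets of categories ordered by subset, so lub is union; `lub` and `assignment_secure` are illustrative names under that assumption:

```python
# Security classes as category sets; c1 <= c2 iff c1 is a subset of c2
def lub(classes):
    out = set()
    for c in classes:
        out |= c        # lub in the subset lattice is union
    return out

def assignment_secure(operand_classes, target_class):
    # y := f(x1, ..., xn) is secure iff lub{x1, ..., xn} <= y
    return lub(operand_classes) <= target_class

LOW, HIGH = set(), {'secret'}
print(assignment_secure([LOW, LOW], HIGH))   # True:  low operands, high target
print(assignment_secure([HIGH], LOW))        # False: high operand, low target
```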

26 Compound Statements
begin S1; ... Sn; end;
Requirement for information flow to be secure: S1 secure AND ... AND Sn secure

27 Conditional Statements
if f(x1, ..., xn) then S1; else S2; end;
Requirement for information flow to be secure:
S1 secure AND S2 secure AND lub{x1, ..., xn} ≤ glb{y | y is the target of an assignment in S1 or S2}

28 Example Conditional Statement
if x + y < z then a := b; else d := b * c - x; end;
b ≤ a for S1
lub{b, c, x} ≤ d for S2
lub{x, y, z} ≤ glb{a, d} for the condition
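The condition check above can be sketched in the same subset lattice, where lub is union and glb is intersection; all names here (`lub`, `glb`, `conditional_secure`) are illustrative:

```python
from functools import reduce

def lub(classes):
    # Least upper bound in the subset lattice: union
    return reduce(lambda a, b: a | b, classes, set())

def glb(classes):
    # Greatest lower bound in the subset lattice: intersection
    return reduce(lambda a, b: a & b, classes)

def conditional_secure(cond_classes, branches_secure, target_classes):
    # S1 secure AND S2 secure AND lub{x1..xn} <= glb{assignment targets}
    return all(branches_secure) and lub(cond_classes) <= glb(target_classes)

x = y = z = {'secret'}          # classes of the condition operands x, y, z
a = d = {'secret', 'crypto'}    # classes of the branch targets a, d
print(conditional_secure([x, y, z], [True, True], [a, d]))   # True
```

Raising the class of any condition operand above glb{a, d} makes the check fail, blocking the implicit flow through the branch.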

29 Iterative Statements
while f(x1, ..., xn) do S;
Requirements for information flow to be secure:
Iteration terminates
S secure
lub{x1, ..., xn} ≤ glb{y | y is the target of an assignment in S}

30 Example Iteration Statement
while i < n do begin
a[i] := b[i]; /* S1 */
i := i + 1; /* S2 */
end;
Loop must terminate (which it does)
Body must be secure:
lub{i, b[i]} ≤ a[i] for S1
i ≤ i for S2 (so S2 is secure)
lub{i, b[i]} ≤ a[i] for the whole body (compound statement)
lub{i, n} ≤ glb{a[i], i} must hold for the loop condition
Requirements can be combined – homework

