Information Flow

Overview Entropy and Uncertainty Information Flow Models Confinement Flow Model Compiler-Based Mechanisms

Entropy Uncertainty of a value, as measured in bits Example: X is the value of a fair coin toss; X could be heads or tails, so there is 1 bit of uncertainty Therefore the entropy of X is H(X) = 1 Formal definition: random variable X with values x1, …, xn, so Σi p(X = xi) = 1 H(X) = –Σi p(X = xi) lg p(X = xi)
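
As a quick check of the definition, a minimal Python sketch (the helper name entropy is ours, not from the slides):

    from math import log2

    def entropy(probs):
        """H(X) = -sum of p lg p over the nonzero probabilities."""
        return -sum(p * log2(p) for p in probs if p > 0)

    print(entropy([0.5, 0.5]))   # fair coin: 1.0 bit
    print(entropy([1/6] * 6))    # fair die: lg 6 ≈ 2.585 bits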

Heads or Tails? H(X) = – p(X=heads) lg p(X=heads) – p(X=tails) lg p(X=tails) = – (1/2) lg (1/2) – (1/2) lg (1/2) = – (1/2) (–1) – (1/2) (–1) = 1 Consistent with the intuitive result

n-Sided Fair Die H(X) = –Σi p(X = xi) lg p(X = xi) As p(X = xi) = 1/n, this becomes H(X) = –Σi (1/n) lg (1/n) = –n (1/n) (–lg n) so H(X) = lg n, which is the number of bits needed to represent n, as expected

Ann, Pam, and Paul Ann and Pam are each twice as likely to win as Paul W represents the winner; what is its entropy? w1 = Ann, w2 = Pam, w3 = Paul p(W=w1) = p(W=w2) = 2/5, p(W=w3) = 1/5 So H(W) = –Σi p(W = wi) lg p(W = wi) = – (2/5) lg (2/5) – (2/5) lg (2/5) – (1/5) lg (1/5) = – (4/5) + lg 5 ≈ 1.52 If all were equally likely to win, H(W) = lg 3 ≈ 1.58
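
Plugging these probabilities into the formula confirms the value (a quick sketch, ours):

    from math import log2

    p = [2/5, 2/5, 1/5]
    print(-sum(q * log2(q) for q in p))   # ≈ 1.522 bits
    print(log2(3))                        # ≈ 1.585 bits if all three were equally likely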

Joint Entropy X takes values from { x1, …, xn }, with Σi p(X=xi) = 1 Y takes values from { y1, …, ym }, with Σj p(Y=yj) = 1 The joint entropy of X and Y is: H(X, Y) = –Σj Σi p(X=xi, Y=yj) lg p(X=xi, Y=yj)

Example X: roll of a fair die, Y: flip of a coin As X and Y are independent, p(X=1, Y=heads) = p(X=1) p(Y=heads) = 1/12 H(X, Y) = –Σj Σi p(X=xi, Y=yj) lg p(X=xi, Y=yj) = –2 [ 6 [ (1/12) lg (1/12) ] ] = lg 12
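
The same arithmetic as a one-line check (ours):

    from math import log2

    # 6 die values x 2 coin values = 12 joint outcomes, each with probability 1/12
    H_XY = -sum((1/12) * log2(1/12) for _ in range(12))
    print(H_XY, log2(12))   # both ≈ 3.585 bits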

Conditional Entropy X takes values from { x1, …, xn }, with Σi p(X=xi) = 1 Y takes values from { y1, …, ym }, with Σj p(Y=yj) = 1 Conditional entropy of X given Y=yj is: H(X | Y=yj) = –Σi p(X=xi | Y=yj) lg p(X=xi | Y=yj) Conditional entropy of X given Y is: H(X | Y) = –Σj p(Y=yj) Σi p(X=xi | Y=yj) lg p(X=xi | Y=yj)

Example X: roll of a red die, Y: sum of the red and blue rolls Note p(X=1 | Y=2) = 1, and p(X=i | Y=2) = 0 for i ≠ 1 If the sum of the rolls is 2, both dice were 1 H(X | Y=2) = –Σi p(X=xi | Y=2) lg p(X=xi | Y=2) = 0 Note p(X=i | Y=7) = 1/6 If the sum of the rolls is 7, the red die can be any of 1, …, 6 and the blue die must be 7 minus the roll of the red die H(X | Y=7) = –Σi p(X=xi | Y=7) lg p(X=xi | Y=7) = –6 (1/6) lg (1/6) = lg 6
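
A short sketch (ours) that enumerates the 36 equally likely rolls and confirms both conditional entropies:

    from math import log2

    # Enumerate the 36 equally likely (red, blue) outcomes; Y is the sum.
    outcomes = [(r, r + b) for r in range(1, 7) for b in range(1, 7)]

    def H_X_given(y):
        matching = [r for r, s in outcomes if s == y]
        p = 1 / len(matching)              # X is uniform over the matching red values
        return -sum(p * log2(p) for _ in matching)

    print(H_X_given(2))   # 0.0: a sum of 2 forces both dice to show 1
    print(H_X_given(7))   # ≈ 2.585 = lg 6: the red die is still uniform over 1..6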

Overview Entropy and Uncertainty Information Flow Models Confinement Flow Model Compiler-Based Mechanisms

Bell-LaPadula Model Information flows from A to B iff B dom A [Lattice diagram of the security classes S{}, S{P}, S{R}, TS{P}, TS{R}, and TS{R,P}, ordered by dom]

Entropy-Based Analysis Question: Can we learn something about the value of x by observing its effect on y? If so, then information flows from x to y.

Entropy-Based Analysis Consider a command sequence that takes a system from state s to state t xs is the value of x at state s (likewise for xt, ys, yt) H(a | b) is the uncertainty of a given b Def: A command sequence causes a flow of information from x to y if H(xs | yt) < H(xs | ys) Note: If y does not exist in s, then H(xs | ys) = H(xs)

Example Flows y := x: observing yt reveals xs exactly, so H(xs | yt) = 0 The same flow occurs through an intermediate variable: tmp := x; y := tmp;

Another Example if (x == 1) then y := 0 else y := 1 Suppose x is equally likely to be 0 or 1, so H(xs) = 1 But H(xs | yt) = 0, since yt determines xs So H(xs | yt) < H(xs | ys) = H(xs), and thus information flows from x to y Def: An implicit flow of information occurs when information flows from x to y without an explicit assignment of the form y := f(x)
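
A small sketch (ours) that tabulates the slide's branch and confirms the drop in uncertainty:

    from math import log2

    def branch(x):
        return 0 if x == 1 else 1   # models the slide's if/else on x

    # x is uniform over {0, 1}, so H(xs) = 1 bit
    H_xs = -2 * 0.5 * log2(0.5)
    # For each observed yt, exactly one xs remains possible, so H(xs | yt) = 0
    for y in (0, 1):
        possible = [x for x in (0, 1) if branch(x) == y]
        assert len(possible) == 1
    print(H_xs, "> 0 = H(xs | yt): information flowed implicitly")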

Requirements for Information Flow Models Reflexivity: information should flow freely among members of a class Transitivity: If b reads something from c and saves it, and if a reads from b, then a can read from c A lattice has a relation R that is reflexive and transitive (and antisymmetric)

Information Flow Models An information flow policy I is a triple I = (SCI, ≤I, joinI), where SCI is a set of security classes, ≤I is an ordering relation on the elements of SCI, and joinI combines two elements of SCI Example: Bell-LaPadula has security compartments for SCI, dom for ≤I, and lub as joinI
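
A minimal sketch of the Bell-LaPadula instance, with classes as (level, category-set) pairs; the integer level encoding is our assumption:

    # dom is the componentwise ordering; lub is the componentwise join.
    S, TS = 1, 2   # our encoding: SECRET < TOP SECRET

    def dom(a, b):
        """a dom b: a's level >= b's level and a's categories include b's."""
        return a[0] >= b[0] and a[1] >= b[1]   # frozenset >= tests superset

    def lub(a, b):
        """join: maximum level, union of category sets."""
        return (max(a[0], b[0]), a[1] | b[1])

    print(dom((TS, frozenset({'R'})), (S, frozenset({'R'}))))   # True: flow allowed
    print(lub((S, frozenset({'R'})), (S, frozenset({'P'}))))    # (1, {'R', 'P'})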

Overview Entropy and Uncertainty Information Flow Models Confinement Flow Model Compiler-Based Mechanisms

Confinement Flow Model Associate with each object x a security class x Def: The confinement flow model is a 4-tuple (I, O, confine, →) in which I = (SCI, ≤I, joinI) is a lattice-based information flow policy O is a set of entities → ⊆ O × O is a relation with (a, b) ∈ → iff information can flow from a to b For each a ∈ O, confine(a) is a pair (aL, aU) ∈ SCI × SCI, with aL ≤I aU If x ≤I aU then information can flow from x to a If aL ≤I x then information can flow from a to x

Example Confinement Model Let a, b, c ∈ O confine(a) = [CONFIDENTIAL, CONFIDENTIAL] confine(b) = [SECRET, SECRET] confine(c) = [TOPSECRET, TOPSECRET] Then a → b, a → c, and b → c are the legal flows

Another Example Let a, b, c ∈ O confine(a) = [CONFIDENTIAL, CONFIDENTIAL] confine(b) = [SECRET, SECRET] confine(c) = [CONFIDENTIAL, TOPSECRET] Then a → b, a → c, b → c, c → a, and c → b are the legal flows Note that b → c and c → a, but information cannot flow from b to a because bL ≤I aU is false So transitivity fails to hold
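
The flow rule behind both examples can be checked mechanically: information may flow from a to b exactly when aL ≤I bU. A sketch, with our own integer encoding of the levels:

    # Our encoding: CONFIDENTIAL < SECRET < TOPSECRET
    C, S, TS = 0, 1, 2
    confine = {'a': (C, C), 'b': (S, S), 'c': (C, TS)}

    def can_flow(src, dst):
        return confine[src][0] <= confine[dst][1]   # srcL <= dstU

    flows = [(x, y) for x in confine for y in confine if x != y and can_flow(x, y)]
    print(flows)  # a->b, a->c, b->c, c->a, c->b
    # b->c and c->a hold, yet b->a does not: transitivity fails.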

Overview Entropy and Uncertainty Information Flow Models Confinement Flow Model Compiler-Based Mechanisms

Compiler-Based Mechanisms Assignment statements Compound statements Conditional statements Iterative statements

Assignment Statements y := f(x1, ..., xn) Requirement for information flow to be secure: lub{x1, ..., xn} ≤ y Example: x := y + z requires lub{y, z} ≤ x
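
A sketch of the check, modeling classes as integers so that lub is max (the LOW/HIGH names are ours):

    LOW, HIGH = 0, 1

    def assignment_secure(y_class, operand_classes):
        """y := f(x1, ..., xn) is secure iff lub{x1, ..., xn} <= y."""
        return max(operand_classes) <= y_class

    # x := y + z with class(y) = LOW, class(z) = HIGH:
    print(assignment_secure(HIGH, [LOW, HIGH]))  # True
    print(assignment_secure(LOW,  [LOW, HIGH]))  # False: HIGH would flow into LOW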

Compound Statements begin S1; ... Sn; end; Requirement for information flow to be secure: S1 secure AND ... AND Sn secure

Conditional Statements if f(x1, ..., xn) then S1; else S2; end; Requirement for information flow to be secure: S1 secure AND S2 secure AND lub{x1, ..., xn} ≤ glb{y | y is the target of an assignment in S1 or S2}

Example Conditional Statement if x + y < z then a := b; else d := b * c - x; end; b ≤ a for S1 lub{b, c, x} ≤ d for S2 lub{x, y, z} ≤ glb{a, d} for the condition
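
Extending the same style of sketch with glb as min (the class assignment is ours):

    def conditional_secure(cond_classes, target_classes, s1_secure, s2_secure):
        """Secure iff S1 and S2 are secure and lub{condition vars} <= glb{targets}."""
        return s1_secure and s2_secure and max(cond_classes) <= min(target_classes)

    # if x + y < z then a := b else d := b*c - x, with x, y, z, b, c at level 0
    # and a, d at level 1: both branches are secure and 0 <= 1 holds.
    print(conditional_secure([0, 0, 0], [1, 1], True, True))  # True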

Iterative Statements while f(x1, ..., xn) do S; Requirement for information flow to be secure: Iteration terminates S secure lub{x1, ..., xn} ≤ glb{y | y is the target of an assignment in S}

Example Iteration Statement while i < n do begin a[i] := b[i]; /* S1 */ i := i + 1; /* S2 */ end; Loop must terminate (which it does) Body must be secure lub{i, b[i]} ≤ a[i] for S1 i ≤ i for S2 (so S2 is secure) lub{i, b[i]} ≤ a[i] for the whole body (compound statement) lub{i, n} ≤ glb{a[i], i} must hold Requirements can be combined – homework
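
A sketch (ours) that evaluates all three requirements for one possible class assignment; treat it as illustrative only, since combining the requirements is left as homework:

    # Integer classes, lub = max, glb = min; the class assignment is our assumption.
    cls = {'i': 0, 'n': 0, 'a': 1, 'b': 0}

    s1_secure = max(cls['i'], cls['b']) <= cls['a']      # a[i] := b[i]
    s2_secure = cls['i'] <= cls['i']                     # i := i + 1
    body_secure = s1_secure and s2_secure                # compound statement
    terminates = True                                    # i increases toward n
    loop_secure = (terminates and body_secure
                   and max(cls['i'], cls['n']) <= min(cls['a'], cls['i']))
    print(loop_secure)   # True for this class assignment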