Slide #32-1: Chapter 32: Entropy and Uncertainty
Computer Security: Art and Science, © 2002–2004 Matt Bishop (June 1, 2004)
– Conditional, joint probability
– Entropy and uncertainty
– Joint entropy
– Conditional entropy
– Perfect secrecy

Slide #32-2: Overview
– Random variables
– Joint probability
– Conditional probability
– Entropy (or uncertainty in bits)
– Joint entropy
– Conditional entropy
– Applying it to the secrecy of ciphers

Slide #32-3: Random Variable
A random variable is a variable that represents the outcome of an event.
– X represents the value from a roll of a fair die; the probability of rolling n is p(X = n) = 1/6.
– If the die is loaded so that 2 appears twice as often as the other numbers, then p(X = 2) = 2/7 and, for n ≠ 2, p(X = n) = 1/7.
Note: writing p(X) means the specific value of X does not matter.
– Example: all values of X are equiprobable.
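
A minimal sketch in Python (the slides contain no code; the names fair_die and loaded_die are illustrative) representing these two dice as explicit distributions and checking that each sums to 1:

from fractions import Fraction

fair_die   = {n: Fraction(1, 6) for n in range(1, 7)}
loaded_die = {n: (Fraction(2, 7) if n == 2 else Fraction(1, 7)) for n in range(1, 7)}

assert sum(fair_die.values()) == 1      # a probability distribution must sum to 1
assert sum(loaded_die.values()) == 1
print(loaded_die[2], loaded_die[3])     # 2/7 1/7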

Slide #32-4: Joint Probability
The joint probability of X and Y, p(X, Y), is the probability that X and Y simultaneously assume particular values.
– If X and Y are independent, p(X, Y) = p(X) p(Y).
Example: roll a die and toss a coin.
– p(X = 3, Y = heads) = p(X = 3) p(Y = heads) = 1/6 × 1/2 = 1/12
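
A minimal sketch in Python (illustrative, not from the slides): because the die and coin are independent, the joint distribution is just the product of the marginals.

from fractions import Fraction
from itertools import product

die  = {n: Fraction(1, 6) for n in range(1, 7)}
coin = {"heads": Fraction(1, 2), "tails": Fraction(1, 2)}

# Independence: p(X = x, Y = y) = p(X = x) p(Y = y)
joint = {(x, y): die[x] * coin[y] for x, y in product(die, coin)}
print(joint[(3, "heads")])   # 1/12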

Slide #32-5: Two Dependent Events
X = roll of the red die, Y = sum of the red and blue die rolls.
The marginal distribution of Y:
p(Y=2) = 1/36   p(Y=3) = 2/36   p(Y=4) = 3/36   p(Y=5) = 4/36
p(Y=6) = 5/36   p(Y=7) = 6/36   p(Y=8) = 5/36   p(Y=9) = 4/36
p(Y=10) = 3/36  p(Y=11) = 2/36  p(Y=12) = 1/36
Because X and Y are dependent, the independence formula does not apply:
– p(X=1) p(Y=11) = (1/6)(2/36) = 1/108, but the actual joint probability is p(X=1, Y=11) = p(X=1) p(Y=11 | X=1) = (1/6)(0) = 0, since a red roll of 1 makes a sum of 11 impossible.
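
A minimal sketch in Python (illustrative names, not from the slides): enumerating the 36 equally likely (red, blue) outcomes reproduces the marginal table above and confirms the dependence.

from fractions import Fraction
from itertools import product
from collections import defaultdict

# Build p(X = x, Y = y) and the marginal p(Y = y) by enumerating all 36 outcomes.
joint_xy   = defaultdict(Fraction)
marginal_y = defaultdict(Fraction)
for red, blue in product(range(1, 7), repeat=2):
    joint_xy[(red, red + blue)] += Fraction(1, 36)
    marginal_y[red + blue]      += Fraction(1, 36)

print(marginal_y[11])                    # 1/18, i.e. 2/36
print(joint_xy[(1, 11)])                 # 0: red = 1 cannot give a sum of 11
print(Fraction(1, 6) * marginal_y[11])   # 1/108, what independence would (wrongly) predict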

Slide #32-6: Conditional Probability
The conditional probability of X given Y, p(X | Y), is the probability that X takes on a particular value given that Y has a particular value.
Continuing the example:
– p(Y=7 | X=1) = 1/6
– p(Y=7 | X=3) = 1/6

Slide #32-7: Relationship
p(X, Y) = p(X | Y) p(Y) = p(X) p(Y | X)
Example:
– p(X=3, Y=8) = p(X=3 | Y=8) p(Y=8) = (1/5)(5/36) = 1/36
Note: if X and Y are independent, p(X | Y) = p(X).
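
A minimal sketch in Python (illustrative, not from the slides) checking both factorizations of p(X=3, Y=8) for the red-die / sum example, and the conditional probability from the previous slide:

from fractions import Fraction
from itertools import product
from collections import defaultdict

joint = defaultdict(Fraction)                   # p(X = red, Y = red + blue)
for red, blue in product(range(1, 7), repeat=2):
    joint[(red, red + blue)] += Fraction(1, 36)

p_x  = Fraction(1, 6)                           # p(X = x) for any x; the red die is fair
p_y8 = sum(p for (x, y), p in joint.items() if y == 8)   # 5/36

print(joint[(1, 7)] / p_x)                      # 1/6 = p(Y=7 | X=1), as on the previous slide
p_x3_given_y8 = joint[(3, 8)] / p_y8            # 1/5
p_y8_given_x3 = joint[(3, 8)] / p_x             # 1/6
print(p_x3_given_y8 * p_y8)                     # 1/36 = p(X=3 | Y=8) p(Y=8)
print(p_x * p_y8_given_x3)                      # 1/36 = p(X=3) p(Y=8 | X=3)
print(joint[(3, 8)])                            # 1/36 -- all three agree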

Slide #32-8: Entropy
Entropy is the uncertainty of a value, measured in bits.
Example: X is the value of a fair coin toss; X can be heads or tails, so there is 1 bit of uncertainty.
– Therefore the entropy of X is H(X) = 1.
Formal definition: for a random variable X with values x_1, …, x_n such that Σ_i p(X = x_i) = 1,
H(X) = –Σ_i p(X = x_i) lg p(X = x_i)
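
A minimal sketch in Python of this definition (the function name entropy is illustrative; lg is the base-2 logarithm):

from math import log2

def entropy(probs):
    """H(X) in bits for a distribution given as a list of probabilities summing to 1."""
    assert abs(sum(probs) - 1.0) < 1e-9
    return -sum(p * log2(p) for p in probs if p > 0)   # terms with p = 0 contribute nothing

print(entropy([1/2, 1/2]))   # 1.0 -- the fair coin, as worked out on the next slide
print(entropy([1/6] * 6))    # ~2.585 = lg 6 -- a fair six-sided die

The "if p > 0" guard reflects the usual convention that 0 lg 0 is taken to be 0.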

Slide #32-9: Heads or Tails?
H(X) = –p(X=heads) lg p(X=heads) – p(X=tails) lg p(X=tails)
     = –(1/2) lg (1/2) – (1/2) lg (1/2)
     = –(1/2)(–1) – (1/2)(–1) = 1
This confirms the earlier intuitive result.

Slide #32-10: n-Sided Fair Die
H(X) = –Σ_i p(X = x_i) lg p(X = x_i)
Since p(X = x_i) = 1/n, this becomes
H(X) = –Σ_i (1/n) lg (1/n) = –n (1/n)(–lg n)
so H(X) = lg n, the number of bits needed to represent the n outcomes, as expected.
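
A quick numeric check in Python (illustrative, not from the slides) that a uniform distribution over n outcomes has entropy lg n:

from math import log2

for n in (2, 6, 8, 256):
    h = -sum((1 / n) * log2(1 / n) for _ in range(n))
    print(n, round(h, 4), round(log2(n), 4))   # the last two columns agree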

Slide #32-11: Ann, Pam, and Paul
Ann and Pam are each twice as likely to win as Paul. W represents the winner; what is its entropy?
– w_1 = Ann, w_2 = Pam, w_3 = Paul
– p(W = w_1) = p(W = w_2) = 2/5, p(W = w_3) = 1/5
So H(W) = –Σ_i p(W = w_i) lg p(W = w_i)
        = –(2/5) lg (2/5) – (2/5) lg (2/5) – (1/5) lg (1/5)
        = –(4/5) + lg 5 ≈ 1.52
If all three were equally likely to win, H(W) = lg 3 ≈ 1.58.
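
A short check in Python of the arithmetic above (illustrative; note that entropy is never negative, so the value is +1.52 bits):

from math import log2

probs = [2/5, 2/5, 1/5]
print(round(-sum(p * log2(p) for p in probs), 4))   # 1.5219
print(round(-(4/5) + log2(5), 4))                   # 1.5219 -- the slide's closed form
print(round(log2(3), 4))                            # 1.585  -- the equiprobable case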

Slide #32-12: Joint Entropy
X takes values from {x_1, …, x_n}, with Σ_i p(X = x_i) = 1.
Y takes values from {y_1, …, y_m}, with Σ_j p(Y = y_j) = 1.
The joint entropy of X and Y is
H(X, Y) = –Σ_j Σ_i p(X = x_i, Y = y_j) lg p(X = x_i, Y = y_j)

Slide #32-13: Example
X: roll of a fair die; Y: flip of a fair coin.
p(X=1, Y=heads) = p(X=1) p(Y=heads) = 1/12
– since X and Y are independent.
H(X, Y) = –Σ_j Σ_i p(X = x_i, Y = y_j) lg p(X = x_i, Y = y_j)
        = –2 [ 6 [ (1/12) lg (1/12) ] ] = –12 (1/12) lg (1/12) = lg 12
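
A minimal sketch in Python (the name joint_entropy is illustrative, not from the slides) confirming H(X, Y) = lg 12 ≈ 3.585 bits for the die-and-coin example:

from math import log2
from itertools import product

def joint_entropy(joint):
    """H(X, Y) in bits, given a dict mapping (x, y) to p(X = x, Y = y)."""
    return -sum(p * log2(p) for p in joint.values() if p > 0)

joint = {(x, y): (1/6) * (1/2) for x, y in product(range(1, 7), ["heads", "tails"])}
print(joint_entropy(joint), log2(12))   # both ~3.585 bits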

Slide #32-14: Conditional Entropy
X takes values from {x_1, …, x_n}, with Σ_i p(X = x_i) = 1.
Y takes values from {y_1, …, y_m}, with Σ_j p(Y = y_j) = 1.
The conditional entropy of X given Y = y_j is
H(X | Y = y_j) = –Σ_i p(X = x_i | Y = y_j) lg p(X = x_i | Y = y_j)
The conditional entropy of X given Y is
H(X | Y) = –Σ_j p(Y = y_j) Σ_i p(X = x_i | Y = y_j) lg p(X = x_i | Y = y_j)

Slide #32-15: Example
X is the roll of the red die; Y is the sum of the red and blue rolls.
Note p(X=1 | Y=2) = 1 and p(X=i | Y=2) = 0 for i ≠ 1.
– If the sum of the rolls is 2, both dice must have come up 1.
H(X | Y=2) = –Σ_i p(X = x_i | Y=2) lg p(X = x_i | Y=2) = 0
Note p(X=i | Y=7) = 1/6 for each i.
– If the sum of the rolls is 7, the red die can be any of 1, …, 6 and the blue die must be 7 minus the red roll.
H(X | Y=7) = –Σ_i p(X = x_i | Y=7) lg p(X = x_i | Y=7) = –6 (1/6) lg (1/6) = lg 6
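
A minimal sketch in Python (illustrative names, not from the slides) computing these two quantities, and the full H(X | Y), directly from the definitions:

from math import log2
from itertools import product
from collections import defaultdict

joint = defaultdict(float)                       # p(X = red, Y = red + blue)
p_y   = defaultdict(float)                       # marginal p(Y = y)
for red, blue in product(range(1, 7), repeat=2):
    joint[(red, red + blue)] += 1 / 36
    p_y[red + blue]          += 1 / 36

def H_x_given(y):
    """H(X | Y = y) in bits, written as the equivalent sum of p * lg(1/p)."""
    return sum((p / p_y[y]) * log2(p_y[y] / p)
               for (x, yy), p in joint.items() if yy == y)

print(H_x_given(2))                              # 0.0 -- a sum of 2 forces red = 1
print(H_x_given(7), log2(6))                     # both ~2.585 = lg 6
print(sum(p_y[y] * H_x_given(y) for y in p_y))   # H(X | Y), about 1.896 bits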

Slide #32-16: Perfect Secrecy
In cryptography, perfect secrecy means that knowing the ciphertext does not decrease the uncertainty about the plaintext.
M = {m_1, …, m_n} is the set of messages.
C = {c_1, …, c_n} is the set of ciphertexts.
A cipher with c_i = E(m_i) achieves perfect secrecy if H(M | C) = H(M).
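
A minimal sketch in Python (not from the slides; a standard illustration): a one-bit one-time pad, where the ciphertext is the message XORed with a uniformly random key bit. Computing H(M | C) from the joint distribution shows it equals H(M), so this cipher meets the definition above.

from math import log2
from collections import defaultdict

p_m = {0: 0.7, 1: 0.3}         # an arbitrary, non-uniform message distribution
p_k = {0: 0.5, 1: 0.5}         # uniform key bit -- the one-time pad requirement

joint_mc = defaultdict(float)  # p(M = m, C = c) with C = M XOR K
for m, pm in p_m.items():
    for k, pk in p_k.items():
        joint_mc[(m, m ^ k)] += pm * pk

p_c = defaultdict(float)       # marginal p(C = c)
for (m, c), p in joint_mc.items():
    p_c[c] += p

H_M         = sum(p * log2(1 / p) for p in p_m.values())
H_M_given_C = sum(p * log2(p_c[c] / p) for (m, c), p in joint_mc.items())
print(round(H_M, 4), round(H_M_given_C, 4))   # both ~0.8813 bits: H(M | C) = H(M)

If the key bit were biased instead of uniform, the two printed values would differ, and observing the ciphertext would reduce the uncertainty about the message.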