Chapter 32: Entropy and Uncertainty

Chapter 32: Entropy and Uncertainty
Conditional, joint probability
Entropy and uncertainty
Joint entropy
Conditional entropy
Perfect secrecy
June 1, 2004, Computer Security: Art and Science, ©2002-2004 Matt Bishop

Overview
Random variables
Joint probability
Conditional probability
Entropy (or uncertainty in bits)
Joint entropy
Conditional entropy
Applying it to secrecy of ciphers

Random Variable
A variable that represents the outcome of an event
Example: X represents the value from the roll of a fair die; the probability of rolling n is p(X = n) = 1/6
If the die is loaded so 2 appears twice as often as the other numbers, p(X = 2) = 2/7 and, for n ≠ 2, p(X = n) = 1/7
Note: p(X) means the specific value of X doesn't matter
Example: all values of X are equiprobable
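
As an added illustration (not from the original slides), a minimal Python sketch of the loaded-die distribution; the dictionary representation and exact-fraction arithmetic are assumptions made for the example:

```python
from fractions import Fraction

# Loaded die: 2 is twice as likely as each other face.
# One weight unit per face, two units for face 2, so 7 units total.
loaded_die = {n: Fraction(2 if n == 2 else 1, 7) for n in range(1, 7)}

assert sum(loaded_die.values()) == 1       # normalization: probabilities sum to 1
assert loaded_die[2] == Fraction(2, 7)     # p(X = 2) = 2/7
assert loaded_die[5] == Fraction(1, 7)     # p(X = n) = 1/7 for n != 2
```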

Joint Probability
The joint probability of X and Y, p(X, Y), is the probability that X and Y simultaneously assume particular values
If X and Y are independent, p(X, Y) = p(X)p(Y)
Example: roll a die and toss a coin
p(X = 3, Y = heads) = p(X = 3)p(Y = heads) = 1/6 × 1/2 = 1/12
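
A short sketch (my addition, not part of the deck) that enumerates the 12 equally likely (die, coin) outcomes and confirms the product rule for these independent events:

```python
from fractions import Fraction
from itertools import product

outcomes = list(product(range(1, 7), ["heads", "tails"]))  # 12 equally likely pairs
p = Fraction(1, len(outcomes))                              # each pair has probability 1/12

p_joint = sum(p for die, coin in outcomes if die == 3 and coin == "heads")
p_x3 = sum(p for die, coin in outcomes if die == 3)
p_heads = sum(p for die, coin in outcomes if coin == "heads")

# Independence: the joint probability equals the product of the marginals.
assert p_joint == p_x3 * p_heads == Fraction(1, 12)
```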

Two Dependent Events
X = roll of the red die, Y = sum of the red and blue die rolls
Distribution of Y:
p(Y=2) = 1/36, p(Y=3) = 2/36, p(Y=4) = 3/36, p(Y=5) = 4/36, p(Y=6) = 5/36, p(Y=7) = 6/36, p(Y=8) = 5/36, p(Y=9) = 4/36, p(Y=10) = 3/36, p(Y=11) = 2/36, p(Y=12) = 1/36
Because X and Y are dependent, the independence formula does not apply: p(X=1)p(Y=11) = (1/6)(2/36) = 1/108, but the actual joint probability is p(X=1, Y=11) = 0, since a red roll of 1 makes a sum of 11 impossible
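
A small enumeration sketch (added here, not from the slides) over the 36 equally likely (red, blue) rolls, confirming both the distribution of Y and the dependence of X and Y:

```python
from fractions import Fraction
from itertools import product

rolls = list(product(range(1, 7), repeat=2))   # (red, blue), 36 equally likely pairs
p = Fraction(1, 36)

def prob(pred):
    """Probability of the event described by pred over the 36 outcomes."""
    return sum(p for red, blue in rolls if pred(red, blue))

assert prob(lambda r, b: r + b == 11) == Fraction(2, 36)   # p(Y = 11)
assert prob(lambda r, b: r == 1 and r + b == 11) == 0      # p(X = 1, Y = 11) = 0
# Dependence: the product of the marginals is not the joint probability.
assert prob(lambda r, b: r == 1) * prob(lambda r, b: r + b == 11) == Fraction(1, 108)
```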

Conditional Probability
The conditional probability of X given Y, p(X | Y), is the probability that X takes on a particular value given that Y has a particular value
Continuing the example:
p(Y=7 | X=1) = 1/6
p(Y=7 | X=3) = 1/6

Relationship
p(X, Y) = p(X | Y) p(Y) = p(X) p(Y | X)
Example: p(X=3, Y=8) = p(X=3 | Y=8) p(Y=8) = (1/5)(5/36) = 1/36
Note: if X and Y are independent, p(X | Y) = p(X)
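
A quick check of this relationship (mine, not from the slides) by enumeration for X = 3 and Y = 8:

```python
from fractions import Fraction
from itertools import product

rolls = list(product(range(1, 7), repeat=2))   # (red, blue)
p = Fraction(1, 36)

p_joint = sum(p for r, b in rolls if r == 3 and r + b == 8)   # p(X=3, Y=8)
p_y8 = sum(p for r, b in rolls if r + b == 8)                 # p(Y=8) = 5/36
p_x3_given_y8 = p_joint / p_y8                                # p(X=3 | Y=8)

assert p_x3_given_y8 == Fraction(1, 5)
assert p_x3_given_y8 * p_y8 == p_joint == Fraction(1, 36)
```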

Entropy
The uncertainty of a value, measured in bits
Example: X is the value of a fair coin toss; X could be heads or tails, so there is 1 bit of uncertainty
Therefore the entropy of X is H(X) = 1
Formal definition: for a random variable X with values x1, …, xn, where Σi p(X = xi) = 1:
H(X) = –Σi p(X = xi) lg p(X = xi)
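
A minimal Python sketch of this formula (my addition; the function name and dictionary-based distributions are assumptions for illustration):

```python
from math import log2

def entropy(dist):
    """H(X) = -sum_i p(x_i) * lg p(x_i); zero-probability values contribute nothing."""
    return -sum(p * log2(p) for p in dist.values() if p > 0)

fair_coin = {"heads": 0.5, "tails": 0.5}
fair_die = {n: 1/6 for n in range(1, 7)}

print(entropy(fair_coin))   # 1.0 bit
print(entropy(fair_die))    # lg 6 ≈ 2.585 bits
```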

Heads or Tails?
H(X) = –p(X=heads) lg p(X=heads) – p(X=tails) lg p(X=tails)
= –(1/2) lg (1/2) – (1/2) lg (1/2)
= –(1/2)(–1) – (1/2)(–1) = 1
This confirms the previous intuitive result

n-Sided Fair Die
H(X) = –Σi p(X = xi) lg p(X = xi)
As p(X = xi) = 1/n, this becomes
H(X) = –Σi (1/n) lg (1/n) = –n(1/n)(–lg n)
so H(X) = lg n
which is the number of bits needed to represent n equally likely values, as expected

Ann, Pam, and Paul
Ann and Pam are each twice as likely to win as Paul
W represents the winner. What is its entropy?
w1 = Ann, w2 = Pam, w3 = Paul
p(W=w1) = p(W=w2) = 2/5, p(W=w3) = 1/5
So H(W) = –Σi p(W = wi) lg p(W = wi)
= –(2/5) lg (2/5) – (2/5) lg (2/5) – (1/5) lg (1/5)
= –(4/5) + lg 5 ≈ 1.52
If all were equally likely to win, H(W) = lg 3 ≈ 1.58
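
These values can be checked with a few lines of Python (again my own sketch, not from the deck):

```python
from math import log2

def entropy(dist):
    return -sum(p * log2(p) for p in dist.values() if p > 0)

print(entropy({"Ann": 2/5, "Pam": 2/5, "Paul": 1/5}))   # ≈ 1.522 bits
print(entropy({"Ann": 1/3, "Pam": 1/3, "Paul": 1/3}))   # lg 3 ≈ 1.585 bits
```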

Joint Entropy
X takes values from { x1, …, xn }, with Σi p(X=xi) = 1
Y takes values from { y1, …, ym }, with Σj p(Y=yj) = 1
The joint entropy of X and Y is:
H(X, Y) = –Σj Σi p(X=xi, Y=yj) lg p(X=xi, Y=yj)
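
A sketch of the joint-entropy computation in Python (an assumed helper of mine, not from the slides), with the joint distribution represented as a dict keyed by (x, y) pairs; it also reproduces the die-and-coin example on the next slide:

```python
from math import log2

def joint_entropy(joint):
    """H(X, Y) = -sum over (x, y) of p(x, y) * lg p(x, y)."""
    return -sum(p * log2(p) for p in joint.values() if p > 0)

# Independent fair die and fair coin: 12 pairs, each with probability 1/12.
joint = {(n, c): 1/12 for n in range(1, 7) for c in ("heads", "tails")}
print(joint_entropy(joint))   # lg 12 ≈ 3.585 bits
```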

Example
X: roll of a fair die, Y: flip of a fair coin
p(X=1, Y=heads) = p(X=1) p(Y=heads) = 1/12, as X and Y are independent
H(X, Y) = –Σj Σi p(X=xi, Y=yj) lg p(X=xi, Y=yj)
= –2 [ 6 [ (1/12) lg (1/12) ] ] = lg 12

Conditional Entropy
X takes values from { x1, …, xn }, with Σi p(X=xi) = 1
Y takes values from { y1, …, ym }, with Σj p(Y=yj) = 1
The conditional entropy of X given Y=yj is:
H(X | Y=yj) = –Σi p(X=xi | Y=yj) lg p(X=xi | Y=yj)
The conditional entropy of X given Y is:
H(X | Y) = –Σj p(Y=yj) Σi p(X=xi | Y=yj) lg p(X=xi | Y=yj)
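
A Python sketch of H(X | Y) built from a joint distribution (the helper name and representation are my assumptions, not the book's):

```python
from math import log2
from collections import defaultdict

def conditional_entropy(joint):
    """H(X | Y) = sum_j p(y_j) * H(X | Y = y_j), from a dict keyed by (x, y) pairs."""
    p_y = defaultdict(float)
    for (x, y), p in joint.items():
        p_y[y] += p
    h = 0.0
    for (x, y), p in joint.items():
        if p > 0:
            p_x_given_y = p / p_y[y]
            h -= p * log2(p_x_given_y)   # since p(x, y) = p(y) * p(x | y)
    return h

# For the independent die-and-coin example, H(X | Y) = H(X) = lg 6.
joint = {(n, c): 1/12 for n in range(1, 7) for c in ("heads", "tails")}
print(conditional_entropy(joint))   # ≈ 2.585 bits
```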

Example
X is the roll of the red die, Y is the sum of the red and blue rolls
Note p(X=1 | Y=2) = 1 and p(X=i | Y=2) = 0 for i ≠ 1: if the sum of the rolls is 2, both dice were 1
H(X | Y=2) = –Σi p(X=xi | Y=2) lg p(X=xi | Y=2) = 0
Note p(X=i | Y=7) = 1/6 for each i: if the sum of the rolls is 7, the red die can be any of 1, …, 6 and the blue die must be 7 minus the roll of the red die
H(X | Y=7) = –Σi p(X=xi | Y=7) lg p(X=xi | Y=7) = –6 (1/6) lg (1/6) = lg 6
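
A short check of these two cases (my addition) by enumerating the 36 rolls:

```python
from math import log2
from itertools import product

rolls = list(product(range(1, 7), repeat=2))   # (red, blue), each with probability 1/36

def h_x_given_y_equals(y):
    """H(X | Y = y) for X = red die, Y = red + blue."""
    matching = [r for r, b in rolls if r + b == y]           # red values consistent with Y = y
    probs = [matching.count(r) / len(matching) for r in set(matching)]
    return sum(-p * log2(p) for p in probs)

print(h_x_given_y_equals(2))   # 0.0: the red die must be 1
print(h_x_given_y_equals(7))   # lg 6 ≈ 2.585: every red value is equally likely
```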

Perfect Secrecy
In cryptography, perfect secrecy means that knowing the ciphertext does not decrease the uncertainty of the plaintext
M = { m1, …, mn } is the set of messages
C = { c1, …, cn } is the set of ciphertexts
A cipher with ci = E(mi) achieves perfect secrecy if H(M | C) = H(M)
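
As an illustration of the definition (not from the original slides), a one-character one-time pad over a 26-symbol alphabet: with a uniformly random key chosen independently of the message, the ciphertext is independent of the message, so H(M | C) = H(M). The message distribution below is an arbitrary assumption made for the example:

```python
from math import log2
from collections import defaultdict

n = 26
# Assumed (non-uniform) message distribution; the key is uniform and independent.
p_m = {m: (2 if m < 5 else 1) / (5 * 2 + (n - 5)) for m in range(n)}
p_k = {k: 1 / n for k in range(n)}

# Joint distribution over (message, ciphertext) with c = (m + k) mod n.
joint = defaultdict(float)
for m, pm in p_m.items():
    for k, pk in p_k.items():
        joint[(m, (m + k) % n)] += pm * pk

def entropy(dist):
    return -sum(p * log2(p) for p in dist.values() if p > 0)

p_c = defaultdict(float)
for (m, c), p in joint.items():
    p_c[c] += p

# H(M | C) = H(M, C) - H(C); perfect secrecy means it equals H(M).
h_m = entropy(p_m)
h_m_given_c = entropy(joint) - entropy(p_c)
print(round(h_m, 6), round(h_m_given_c, 6))   # equal: knowing C tells nothing about M
```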