Entropy and Uncertainty

Entropy and Uncertainty (Appendix C, Computer Security: Art and Science, 2nd Edition)

Outline
- Random variables
- Joint probability
- Conditional probability
- Entropy (or uncertainty in bits)
- Joint entropy
- Conditional entropy
- Applying it to the secrecy of ciphers

Random Variable
A random variable is a variable that represents the outcome of an event.
Example: X represents the value from the roll of a fair die; the probability of rolling n is p(X=n) = 1/6.
If the die is loaded so that 2 appears twice as often as any other number, then p(X=2) = 2/7 and, for n ≠ 2, p(X=n) = 1/7.
Note: writing p(X) means the specific value of X doesn't matter. Example: all values of X are equiprobable.
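The loaded-die probabilities can be checked with a few lines of code. This is a minimal Python sketch; the dict-of-probabilities representation and variable names are assumptions made here for illustration, not part of the slides.

```python
from fractions import Fraction

# A distribution as a dict {face: probability}; this representation is an
# assumption made for this sketch, not something from the slides.
# Loaded die: 2 is twice as likely as any other face, so the face weights
# are 1, 2, 1, 1, 1, 1 and the total weight is 7.
weights = {n: (2 if n == 2 else 1) for n in range(1, 7)}
total = sum(weights.values())
loaded_die = {n: Fraction(w, total) for n, w in weights.items()}

assert loaded_die[2] == Fraction(2, 7)
assert all(loaded_die[n] == Fraction(1, 7) for n in loaded_die if n != 2)
assert sum(loaded_die.values()) == 1   # any probability distribution sums to 1
```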

Joint Probability
The joint probability of X and Y, p(X, Y), is the probability that X and Y simultaneously take on particular values.
If X and Y are independent, p(X, Y) = p(X)p(Y).
Example: roll a die and toss a coin. p(X=3, Y=heads) = p(X=3)p(Y=heads) = (1/6)(1/2) = 1/12.

Two Dependent Events
X = roll of the red die, Y = sum of the red and blue die rolls. The distribution of Y is:
p(Y=2) = 1/36, p(Y=3) = 2/36, p(Y=4) = 3/36, p(Y=5) = 4/36, p(Y=6) = 5/36, p(Y=7) = 6/36, p(Y=8) = 5/36, p(Y=9) = 4/36, p(Y=10) = 3/36, p(Y=11) = 2/36, p(Y=12) = 1/36
Here X and Y are dependent. If they were independent, the formula would give p(X=1, Y=11) = p(X=1)p(Y=11) = (1/6)(2/36) = 1/108; in fact p(X=1, Y=11) = 0, because a red 1 cannot give a sum of 11.
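The distribution of Y and the dependence of X and Y can be verified by enumerating the 36 equally likely (red, blue) outcomes. The sketch below is illustrative only; the enumeration and variable names are assumptions, not part of the slides.

```python
from fractions import Fraction
from collections import Counter

# Enumerate all 36 equally likely (red, blue) outcomes and tabulate p(Y=s)
# for the sum Y, reproducing the table above.
outcomes = [(r, b) for r in range(1, 7) for b in range(1, 7)]
counts = Counter(r + b for r, b in outcomes)
p_Y = {s: Fraction(c, 36) for s, c in counts.items()}

assert p_Y[11] == Fraction(2, 36)
assert p_Y[7] == Fraction(6, 36)

# Dependence: p(X=1, Y=11) is 0 (a red 1 cannot give a sum of 11), which is
# not the product p(X=1)p(Y=11) = 1/108.
p_joint = Fraction(sum(1 for r, b in outcomes if r == 1 and r + b == 11), 36)
assert p_joint == 0
assert p_joint != Fraction(1, 6) * p_Y[11]
```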

Conditional Probability
The conditional probability of X given Y, p(X | Y), is the probability that X takes on a particular value given that Y has a particular value.
Continuing the example: p(Y=7 | X=1) = 1/6 and p(Y=7 | X=3) = 1/6.

Relationship
p(X, Y) = p(X | Y) p(Y) = p(X) p(Y | X)
Example: p(X=3, Y=8) = p(X=3 | Y=8) p(Y=8) = (1/5)(5/36) = 1/36
Note: if X and Y are independent, p(X | Y) = p(X).
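The relationship can be confirmed on the dice example by counting outcomes. The sketch below assumes a small helper p() that counts outcomes satisfying a predicate; it is not from the slides.

```python
from fractions import Fraction

# X is the red die, Y is the sum of the red and blue dice.
outcomes = [(r, b) for r in range(1, 7) for b in range(1, 7)]

def p(event):
    """Probability of an event, given as a predicate over (red, blue)."""
    return Fraction(sum(1 for o in outcomes if event(o)), len(outcomes))

p_joint = p(lambda o: o[0] == 3 and o[0] + o[1] == 8)   # p(X=3, Y=8)
p_Y8    = p(lambda o: o[0] + o[1] == 8)                 # p(Y=8)
p_cond  = p_joint / p_Y8                                # p(X=3 | Y=8)

assert p_cond == Fraction(1, 5)
assert p_joint == p_cond * p_Y8 == Fraction(1, 36)      # p(X, Y) = p(X | Y) p(Y)
```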

Entropy
The entropy of a random variable is its uncertainty, measured in bits.
Example: X is the value of a fair coin toss; X can be heads or tails, so there is 1 bit of uncertainty. Therefore the entropy of X is H(X) = 1.
Formal definition: for a random variable X with values x1, …, xn such that Σi p(X=xi) = 1, the entropy is:
H(X) = –Σi p(X=xi) lg p(X=xi)
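The definition translates directly into a short function. The entropy() helper below is a name assumed for this sketch; it skips zero-probability terms, following the usual convention that 0 lg 0 = 0.

```python
from math import log2

def entropy(dist):
    """Entropy in bits of a distribution given as {value: probability}."""
    return -sum(p * log2(p) for p in dist.values() if p > 0)

# Fair coin: 1 bit of uncertainty, matching the intuitive result above.
assert entropy({"heads": 0.5, "tails": 0.5}) == 1.0
```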

Heads or Tails?
H(X) = – p(X=heads) lg p(X=heads) – p(X=tails) lg p(X=tails)
     = – (1/2) lg (1/2) – (1/2) lg (1/2)
     = – (1/2)(–1) – (1/2)(–1) = 1
This confirms the previous intuitive result.

n-Sided Fair Die
H(X) = –Σi p(X=xi) lg p(X=xi)
As p(X=xi) = 1/n for every i, this becomes
H(X) = –Σi (1/n) lg (1/n) = –n (1/n)(–lg n)
so H(X) = lg n, which is the number of bits needed to distinguish n equally likely values, as expected.

Ann, Pam, and Paul
Ann and Pam are each twice as likely to win as Paul. W represents the winner; what is its entropy?
w1 = Ann, w2 = Pam, w3 = Paul
p(W=w1) = p(W=w2) = 2/5, p(W=w3) = 1/5
So H(W) = –Σi p(W=wi) lg p(W=wi)
        = – (2/5) lg (2/5) – (2/5) lg (2/5) – (1/5) lg (1/5)
        = – (4/5) + lg 5 ≈ 1.52
If all three were equally likely to win, H(W) = lg 3 ≈ 1.58.
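Both this example and the fair-die case on the previous slide can be checked numerically. The sketch below restates the entropy() helper from the earlier sketch (an assumed name, not from the slides) so the block is self-contained.

```python
from math import log2, isclose

def entropy(dist):
    return -sum(p * log2(p) for p in dist.values() if p > 0)

# n-sided fair die: H(X) = lg n.
n = 6
fair_die = {i: 1 / n for i in range(1, n + 1)}
assert isclose(entropy(fair_die), log2(n))

# Ann and Pam are each twice as likely to win as Paul.
winner = {"Ann": 2 / 5, "Pam": 2 / 5, "Paul": 1 / 5}
assert isclose(entropy(winner), -4 / 5 + log2(5))                        # ≈ 1.52 bits
assert isclose(entropy({"Ann": 1/3, "Pam": 1/3, "Paul": 1/3}), log2(3))  # ≈ 1.58 bits
```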

Joint Entropy
X takes values from { x1, …, xn }, with Σi p(X=xi) = 1.
Y takes values from { y1, …, ym }, with Σj p(Y=yj) = 1.
The joint entropy of X and Y is:
H(X, Y) = –Σj Σi p(X=xi, Y=yj) lg p(X=xi, Y=yj)

Example
X: roll of a fair die, Y: flip of a fair coin.
As X and Y are independent: p(X=1, Y=heads) = p(X=1) p(Y=heads) = 1/12, and
H(X, Y) = –Σj Σi p(X=xi, Y=yj) lg p(X=xi, Y=yj)
        = –2 [ 6 [ (1/12) lg (1/12) ] ] = lg 12
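The joint-entropy formula can be evaluated directly over all 12 (die, coin) pairs. The sketch below is an illustrative computation under the same independence assumption as the slide.

```python
from math import log2, isclose

# Joint distribution of an independent fair die X and fair coin Y:
# every (x, y) pair has probability (1/6)(1/2) = 1/12.
joint = {(x, y): (1 / 6) * (1 / 2)
         for x in range(1, 7) for y in ("heads", "tails")}

# H(X, Y) = -sum over all pairs of p lg p
H_XY = -sum(p * log2(p) for p in joint.values() if p > 0)
assert isclose(H_XY, log2(12))   # lg 12 ≈ 3.58 bits
```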

Conditional Entropy
X takes values from { x1, …, xn }, with Σi p(X=xi) = 1.
Y takes values from { y1, …, ym }, with Σj p(Y=yj) = 1.
The conditional entropy of X given Y=yj is:
H(X | Y=yj) = –Σi p(X=xi | Y=yj) lg p(X=xi | Y=yj)
The conditional entropy of X given Y is:
H(X | Y) = –Σj p(Y=yj) Σi p(X=xi | Y=yj) lg p(X=xi | Y=yj)

Example
X is the roll of the red die, Y the sum of the red and blue rolls.
Note that p(X=1 | Y=2) = 1 and p(X=i | Y=2) = 0 for i ≠ 1: if the sum of the rolls is 2, both dice were 1.
Thus H(X | Y=2) = –Σi p(X=xi | Y=2) lg p(X=xi | Y=2) = 0

Computer Security: Art and Science, 2nd Edition Example (con’t) Note p(X=i, Y=7) = 1/6 If the sum of the rolls is 7, the red die can be any of 1, …, 6 and the blue die must be 7–roll of red die H(X|Y=7) = –i p(X=xi|Y=7) lg p(X=xi|Y=7) = –6 (1/6) lg (1/6) = lg 6 Version 1.0 Computer Security: Art and Science, 2nd Edition

Perfect Secrecy
In cryptography, perfect secrecy means that knowing the ciphertext does not decrease the uncertainty of the plaintext.
M = { m1, …, mn } is the set of messages.
C = { c1, …, cn } is the set of ciphertexts.
A cipher with ci = E(mi) achieves perfect secrecy if H(M | C) = H(M).
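As a toy illustration (not from the slides), a one-bit one-time pad satisfies this condition: with a uniformly random key, the ciphertext distribution is independent of the message. The sketch below computes H(M | C) from the joint distribution of message and ciphertext and checks that it equals H(M).

```python
from math import log2, isclose
from fractions import Fraction
from itertools import product

def entropy(dist):
    return -sum(float(p) * log2(float(p)) for p in dist.values() if p > 0)

# One-bit one-time pad: message m and key k are uniform and independent,
# and the ciphertext is c = m XOR k.
p_M = {0: Fraction(1, 2), 1: Fraction(1, 2)}
p_K = {0: Fraction(1, 2), 1: Fraction(1, 2)}

# Joint distribution of (message, ciphertext).
p_MC = {}
for m, k in product(p_M, p_K):
    c = m ^ k
    p_MC[(m, c)] = p_MC.get((m, c), 0) + p_M[m] * p_K[k]

# Marginal distribution of the ciphertext.
p_C = {}
for (m, c), p in p_MC.items():
    p_C[c] = p_C.get(c, 0) + p

# H(M | C) = sum_c p(C=c) H(M | C=c)
H_M_given_C = sum(p_C[c] * entropy({m: p_MC[(m, c)] / p_C[c]
                                    for m in p_M if (m, c) in p_MC})
                  for c in p_C)

assert isclose(H_M_given_C, entropy(p_M))   # 1 bit: the ciphertext reveals nothing
```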