Entropy CSCI284/162 Spring 2009 GWU

Measurement of Uncertainty

- Flip a fair coin. Is it reasonable to say that the outcome has one bit of uncertainty?
- Flip the coin n times: how much uncertainty is in the outcome?
- Is it reasonable to say that an event with probability 2^(-n) has uncertainty n? That is, that uncertainty is -log(probability)?
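As a quick numeric check of -log(probability) as a measure of uncertainty, here is a minimal sketch in Python (the helper name surprise is ours, not from the slides):

    import math

    def surprise(p):
        """Uncertainty (self-information) of an event with probability p, in bits."""
        return -math.log2(p)

    print(surprise(0.5))       # one fair coin flip: 1.0 bit
    print(surprise(2 ** -10))  # one specific outcome of 10 fair flips: 10.0 bits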

What if…

- The coin is biased so that it always shows heads? How much uncertainty is there?
- The coin is biased so it shows heads with probability p and tails with probability 1-p?
- If uncertainty is -log(probability), take its average value: -p log p - (1-p) log(1-p)
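This average is the binary entropy function; a short sketch (the name binary_entropy is ours) showing that it vanishes for the always-heads coin and peaks at one bit for the fair coin:

    import math

    def binary_entropy(p):
        """Average uncertainty of a coin showing heads with probability p, in bits."""
        if p in (0.0, 1.0):
            return 0.0  # no uncertainty; by convention 0 * log(0) = 0
        return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

    print(binary_entropy(1.0))  # always heads: 0.0 bits
    print(binary_entropy(0.5))  # fair coin: 1.0 bit
    print(binary_entropy(0.9))  # biased coin: about 0.469 bits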

Shannon Entropy

If a random variable X takes on values x_i with probability p_i, the entropy of X is defined as:

H(X) = -Σ_i p_i log p_i

Loosely speaking, it is the average number of bits required to represent the variable.
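The definition generalizes the coin examples above to any discrete distribution; a minimal sketch (the function name entropy is ours), using base-2 logarithms so the result is in bits:

    import math

    def entropy(probs):
        """Shannon entropy H(X) = -sum_i p_i * log2(p_i), in bits."""
        # Terms with p_i = 0 contribute nothing, by the 0 * log(0) = 0 convention.
        return -sum(p * math.log2(p) for p in probs if p > 0)

    print(entropy([0.5, 0.5]))         # fair coin: 1.0 bit
    print(entropy([1 / 6] * 6))        # fair six-sided die: about 2.585 bits
    print(entropy([0.9, 0.05, 0.05]))  # skewed distribution: about 0.569 bits

Note that for 2^n equally likely outcomes (each p_i = 2^(-n)), this gives H(X) = n, matching the earlier slide's intuition for n fair coin flips.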