MATH 256 Probability and Random Processes. Yrd. Doç. Dr. Didem Kivanc Tureli. Lecture 3, 14/10/2011. OKAN UNIVERSITY.


MATH 256 Probability and Random Processes. Yrd. Doç. Dr. Didem Kivanc Tureli. Lecture 3, 14/10/2011. OKAN UNIVERSITY FACULTY OF ENGINEERING AND ARCHITECTURE. 04 Random Variables, Fall 2011.

What is a random variable?
A random variable X is not a "variable" in the algebraic sense. A random variable X is a function:
From the set of outcomes of a random event (the sample space S of an experiment)
To the set of real numbers ℝ.
Realizations of a random variable are called random variates.
(Diagram: the set of outcomes of a coin toss, {heads, tails}, mapped by the random variable into ℝ.)

Example
Experiment: throw 3 coins.
Sample space: S = {(H,H,H), (H,H,T), (H,T,H), (T,H,H), (H,T,T), (T,H,T), (T,T,H), (T,T,T)}
Y is a random variable giving the number of heads that landed:
Y(H,H,H) = 3; Y(H,H,T) = Y(H,T,H) = Y(T,H,H) = 2; Y(H,T,T) = Y(T,H,T) = Y(T,T,H) = 1; Y(T,T,T) = 0.
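The pmf of Y can be checked with a short enumeration (a Python sketch, not part of the original slides):

```python
from itertools import product

# Enumerate the 8 equally likely outcomes of throwing 3 coins
sample_space = list(product("HT", repeat=3))

# Y maps each outcome to its number of heads; tally the pmf of Y
pmf_Y = {}
for outcome in sample_space:
    y = outcome.count("H")
    pmf_Y[y] = pmf_Y.get(y, 0) + 1 / len(sample_space)
```

This gives P{Y = 0} = P{Y = 3} = 1/8 and P{Y = 1} = P{Y = 2} = 3/8.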

Example
Three balls are to be randomly selected without replacement from an urn containing 20 balls numbered 1 through 20. If we bet that at least one of the balls drawn has a number as large as or larger than 17, what is the probability that we win the bet?
Let X be the largest of the three numbers drawn; we win exactly when X ≥ 17, so
P{win} = P{X ≥ 17} = 1 − P{X ≤ 16} = 1 − C(16,3)/C(20,3) = 29/57 ≈ 0.508.
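A brute-force check of the bet (a Python sketch, not from the slides): enumerate all C(20,3) = 1140 equally likely draws and count those whose maximum is at least 17.

```python
from itertools import combinations

# All equally likely ways to draw 3 of the 20 numbered balls
draws = list(combinations(range(1, 21), 3))

# We win the bet when the largest number drawn, X, is at least 17
wins = sum(1 for d in draws if max(d) >= 17)
p_win = wins / len(draws)   # 1 - C(16,3)/C(20,3) = 29/57, about 0.508
```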

Example
Independent trials, each consisting of the flip of a coin that comes up heads with probability p, are performed until either a head occurs or a total of n flips is made. If we let X denote the number of times the coin is flipped, then X is a random variable taking on one of the values 1, 2, 3, …, n with respective probabilities:
P{X = k} = (1 − p)^(k−1) p,  k = 1, 2, …, n − 1
P{X = n} = (1 − p)^(n−1).
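These probabilities must sum to 1, which is easy to verify numerically (a sketch; the values p = 0.3 and n = 5 are arbitrary choices):

```python
def flip_pmf(p, n):
    # P{X = k} = (1-p)^(k-1) * p for k = 1, ..., n-1 (first head on flip k)
    pmf = {k: (1 - p) ** (k - 1) * p for k in range(1, n)}
    # P{X = n} = (1-p)^(n-1): the first n-1 flips were all tails
    pmf[n] = (1 - p) ** (n - 1)
    return pmf

pmf = flip_pmf(0.3, 5)
```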

Example
Three balls are randomly chosen from an urn containing 3 white, 3 red, and 5 black balls. Suppose that we win $1 for each white ball selected and lose $1 for each red ball selected. If we let X denote our total winnings from the experiment, then X is a random variable taking on the possible values 0, ±1, ±2, ±3 with respective probabilities given below.
Suppose every ball has a number. Then your balls are: W1, W2, W3, R1, R2, R3, B1, B2, B3, B4, B5, or for convenience numbered from 1 to 11. So there are C(11,3) = 165 equally likely ways to choose three balls from this set.

The list of possible values for X is {−3, −2, −1, 0, 1, 2, 3}.
To get −3, we must choose RRR.
To get −2, we must choose two R and one B.
To get −1, we must choose two R and one W, or one R and two B.
To get 0, we must choose one R, one W and one B, or BBB.
To get +1, we must choose two W and one R, or one W and two B.
To get +2, we must choose two W and one B.
To get +3, we must choose WWW.
So:
P{X = 0} = (C(3,1)C(3,1)C(5,1) + C(5,3))/165 = 55/165
P{X = 1} = P{X = −1} = (C(3,2)C(3,1) + C(3,1)C(5,2))/165 = 39/165
P{X = 2} = P{X = −2} = C(3,2)C(5,1)/165 = 15/165
P{X = 3} = P{X = −3} = C(3,3)/165 = 1/165.
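These counts can be confirmed by enumerating all 165 draws (a Python sketch, not part of the slides):

```python
from itertools import combinations

balls = ["W"] * 3 + ["R"] * 3 + ["B"] * 5   # 3 white, 3 red, 5 black
# combinations works by position, so this yields all C(11,3) = 165 draws
draws = list(combinations(balls, 3))

# X = (# white) - (# red) for each draw; tally the number of draws per value
counts = {}
for d in draws:
    x = d.count("W") - d.count("R")
    counts[x] = counts.get(x, 0) + 1
```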

The cumulative distribution function
For a random variable X, the function F defined by
F(x) = P{X ≤ x},  −∞ < x < ∞
is called the cumulative distribution function, or simply the distribution function, of X. Thus, the distribution function specifies, for every real value x, the probability that the random variable is less than or equal to x.
F(x) is a nondecreasing function of x; that is, if a < b then F(a) ≤ F(b).

For the previous example, the pmf and the distribution function can be plotted as step graphs. (Figures omitted.)

Probability mass function
The probability mass function p(a) = P{X = a} is defined for a discrete random variable X.
Suppose that p(x_i) > 0 for i = 1, 2, … and p(x) = 0 for all other values of x. Then, since X must be one of the values x_i,
Σ_i p(x_i) = 1.

Example of probability mass function. (Graph omitted.)

Example
The probability mass function of a random variable X is given by p(i) = c λ^i / i!, i = 0, 1, 2, …, where λ is some positive value. Find (a) P{X = 0} and (b) P{X > 2}.
Since the probabilities must sum to 1, c Σ_i λ^i/i! = c e^λ = 1, so c = e^(−λ). Hence
(a) P{X = 0} = e^(−λ)
(b) P{X > 2} = 1 − P{X = 0} − P{X = 1} − P{X = 2} = 1 − e^(−λ)(1 + λ + λ²/2).
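Numerically, assuming the pmf has the standard form p(i) = c λ^i / i!, the answers can be checked for an arbitrary λ (a sketch; λ = 2 is just an example value):

```python
import math

lam = 2.0                                  # an arbitrary positive lambda, for illustration
c = math.exp(-lam)                         # normalization: c * e^lambda = 1

def p(i):
    return c * lam ** i / math.factorial(i)

p0 = p(0)                                  # (a) P{X = 0} = e^{-lambda}
p_gt_2 = 1 - (p(0) + p(1) + p(2))          # (b) P{X > 2} = 1 - e^{-lambda}(1 + lambda + lambda^2/2)
```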

The cumulative distribution function
The cumulative distribution function F can be expressed in terms of p(a) by
F(a) = Σ_{x ≤ a} p(x).
If X is a discrete random variable whose possible values are x_1, x_2, x_3, …, where x_1 < x_2 < x_3 < …, then the distribution function F of X is a step function.

Example
For example, suppose the probability mass function (pmf) of X is given; then the distribution function F of X is the corresponding step function. (Formulas omitted.)
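A step-function F can be built directly from any pmf. Here is a sketch with made-up values (the probabilities below are hypothetical, not from the slide):

```python
# Hypothetical pmf for a discrete X (values chosen only for illustration)
pmf = {1: 0.25, 2: 0.5, 3: 0.125, 4: 0.125}

def F(x):
    # F(x) = sum of p(xi) over all xi <= x: a nondecreasing step function
    return sum(p for xi, p in pmf.items() if xi <= x)
```

F jumps by p(x_i) at each x_i and is flat in between.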

Expectation of a random variable
If X is a discrete random variable having a probability mass function p(x), then the expectation, or expected value, of X, denoted by E[X], is defined by
E[X] = Σ_x x p(x).
In other words:
Take every possible value for X.
Multiply it by the probability of getting that value.
Add the results.

Examples of expectation
For example, suppose you have a fair coin. You flip the coin, and define a random variable X such that
If the coin lands heads, X = 1
If the coin lands tails, X = 2
Then the probability mass function of X is given by p(1) = p(2) = 1/2, or we can write p(x) = 1/2 for x = 1, 2. Then
E[X] = 1·(1/2) + 2·(1/2) = 3/2.

Examples of expectation
Next, suppose you throw a fair die, and define a random variable Y such that
If the die lands on a number less than or equal to 5, then Y = 0
If the die lands on a number greater than 5, then Y = 1
Then the probability mass function of Y is given by p(0) = 5/6, p(1) = 1/6, so
E[Y] = 0·(5/6) + 1·(1/6) = 1/6.

Frequency interpretation of probabilities
The law of large numbers (which we will see in Chapter 8) says that if we have an experiment (e.g. tossing a coin) and we perform it an infinite number of times, then the proportion of time that any event E occurs will be P(E). [Recall that an event is a subset of the sample space, i.e. a set of outcomes of the experiment.]
So, for instance, suppose X is a random variable which will be equal to x_1 with probability p(x_1), x_2 with probability p(x_2), …, x_n with probability p(x_n). By the frequency interpretation, if we keep playing this game, then the proportion of time that we win x_i will be p(x_i).

Frequency interpretation of probabilities
Or we can say that when we play the game N times, where N is a very big number, we will win x_i about N p(x_i) times. Then the average winnings per game will be
(Σ_i x_i N p(x_i)) / N = Σ_i x_i p(x_i) = E[X].
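The frequency interpretation can be simulated (a sketch with a hypothetical game; the values and probabilities below are made up):

```python
import random

random.seed(1)                              # fixed seed, for reproducibility
values = [1, 2, 5]                          # hypothetical winnings x_i
probs = [0.5, 0.3, 0.2]                     # hypothetical probabilities p(x_i)
expected = sum(v * p for v, p in zip(values, probs))   # E[X] = 2.1

# Play the game N times and compare the average winnings to E[X]
N = 100_000
total = sum(random.choices(values, weights=probs)[0] for _ in range(N))
average = total / N                          # should be close to E[X]
```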

Example 3a
Question: Find E[X], where X is the outcome when we roll a fair die.
Solution: Since p(i) = 1/6 for i = 1, 2, …, 6,
E[X] = 1(1/6) + 2(1/6) + 3(1/6) + 4(1/6) + 5(1/6) + 6(1/6) = 7/2.

Example 3b
Question: We say that I is an indicator variable for an event A if
I = 1 if A occurs, and I = 0 if A does not occur.
What is E[I]?
Solution: E[I] = 1·P(A) + 0·(1 − P(A)) = P(A).
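The indicator calculation in a few lines (a sketch; the probability of A is a made-up number):

```python
p_A = 0.35                      # hypothetical P(A), for illustration only

# I takes the value 1 with probability P(A) and 0 with probability 1 - P(A)
pmf_I = {1: p_A, 0: 1 - p_A}
E_I = sum(i * p for i, p in pmf_I.items())   # E[I] = P(A)
```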

Example 3d
A school class of 120 students is driven in 3 buses to a symphonic performance. There are 36 students in one of the buses, 40 in another, and 44 in the third. When the buses arrive, one of the 120 students is randomly chosen. Let X denote the number of students on the bus of the randomly chosen student, and find E[X].
Solution: Since the chosen student rides a bus of size n with probability n/120,
E[X] = 36(36/120) + 40(40/120) + 44(44/120) = 4832/120 ≈ 40.27.

Example 3d (continued)
Same problem as before, but assume that a bus (rather than a student) is chosen at random, and find E[X].
Solution: Each bus is chosen with probability 1/3, so
E[X] = (36 + 40 + 44)/3 = 40.
Note that this is smaller than before: choosing a student at random makes the larger buses more likely to be selected.
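Both versions of Example 3d in a few lines (a sketch of the computation described above):

```python
sizes = [36, 40, 44]
total = sum(sizes)                               # 120 students in all

# Student chosen at random: a bus of size n is selected with probability n/120
E_student = sum(n * (n / total) for n in sizes)  # 4832/120, about 40.27

# Bus chosen at random: each bus is selected with probability 1/3
E_bus = sum(n / 3 for n in sizes)                # 40
```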

Expectation of a function of a random variable
To find E[g(X)], that is, the expectation of g(X), use a two-step process:
Find the pmf of g(X).
Find E[g(X)] from that pmf.

Example
Let X denote a random variable that takes on any of the values −1, 0, and 1 with respective probabilities p(−1), p(0), and p(1). Compute E[X²].
Solution: Let Y = X². Then the probability mass function of Y is given by
P{Y = 1} = p(−1) + p(1), P{Y = 0} = p(0),
and E[X²] = E[Y] = 1·P{Y = 1} + 0·P{Y = 0}.
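The two-step method as code (a sketch; the pmf values for X below are hypothetical):

```python
# Hypothetical pmf for X on {-1, 0, 1}; the probabilities are made up
pmf_X = {-1: 0.2, 0: 0.5, 1: 0.3}

# Step 1: find the pmf of Y = X^2 by collecting probability onto each value of Y
pmf_Y = {}
for x, p in pmf_X.items():
    y = x ** 2
    pmf_Y[y] = pmf_Y.get(y, 0) + p

# Step 2: compute E[X^2] = E[Y] from the pmf of Y
E_X2 = sum(y * p for y, p in pmf_Y.items())
```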

Statistics vs. Probability
You may have noticed that the concept of "expectation" seems a lot like the concept of "average". So why do we use this fancy new word "expectation"? Why not just call it "average"?
We find the average of a list of numbers; the numbers are already known.
We find the expectation of a random variable; we may have only one such random variable, and we may only toss the coin or die once.

Statistics vs. Probability
For instance, let us define a random variable X using the result of a coin toss: let X = 1 if the coin lands heads, X = 0 if the coin lands tails. If we perform this experiment K times, we will get a list of values for X. We can find the average value of X by adding all the values for X and dividing by K.
Is this coin fair? We don't know, but we can find out.

Statistics vs. Probability
What we did on the previous slide was statistics: we analyzed the data to draw some conclusions about the process or mechanism (i.e. the coin) that generated the data. Probability is how we draw conclusions about the future. So suppose I did the experiments on the previous slide yesterday. Today I come into class and toss the coin exactly once. Then I can use yesterday's statistics to find out what I can expect the result of today's coin toss to be.

Statistics vs. Probability
Okay, so I got E[X] = 0.5. What does this mean? X can never equal 0.5. Expectation makes more sense with continuous random variables, e.g. when you measure a voltage on a voltmeter. With the coin toss you can think of it this way: suppose someone wants you to guess X, but you will pay a lot of money if you are wrong, and the amount you pay is proportional to how wrong you are. If you guess g, and the result was actually a, then you have to pay
(g − a)².
What should you guess? You must minimize
E[(g − X)²].
If you guess g = E[X], then this penalty is minimized.
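A numeric check that g = E[X] minimizes the expected squared penalty, using the coin-toss variable with X = 1 for heads and X = 0 for tails (a sketch, not part of the slides):

```python
pmf = {0: 0.5, 1: 0.5}          # fair coin: X = 1 for heads, X = 0 for tails
E_X = sum(x * p for x, p in pmf.items())            # E[X] = 0.5

def expected_penalty(g):
    # E[(g - X)^2] for a guess g
    return sum((g - x) ** 2 * p for x, p in pmf.items())

# Scan candidate guesses on a fine grid; the best one is E[X] itself
guesses = [k / 100 for k in range(0, 101)]
best = min(guesses, key=expected_penalty)
```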

Statistics: how to find the pmf of a random voltage from measurements
Suppose you are going to measure a voltage. You know that the voltage is really about 5 V, but you have an old voltmeter that doesn't measure very well. The voltmeter is digital and has one decimal place, so you can only read the voltages 0.0, 0.1, …, 4.7, 4.8, 4.9, 5.0, 5.1, …, 9.9. You start measuring the voltage and get the following measurements: 4.7, 5.0, 4.9, 5.0, 5.3, 4.9, 4.8, 5.2, … From these measurements you can construct a probability mass function graph as follows.

Pmf drawn from results of experiment
Measurements: 4.7, 5.0, 4.9, 5.0, 5.3, 4.9, 4.8, 5.2, 5.0, 4.5, 4.8, 5.1, 5.0, 5.1, 4.9, 5.3, 5.1, 5.2, 5.1, …
(Histogram of the relative frequencies; figure omitted.)
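The relative-frequency pmf can be computed directly from the listed readings (a sketch; only the 19 readings listed on the slide are used):

```python
from collections import Counter

# The voltmeter readings listed on the slide
measurements = [4.7, 5.0, 4.9, 5.0, 5.3, 4.9, 4.8, 5.2, 5.0, 4.5,
                4.8, 5.1, 5.0, 5.1, 4.9, 5.3, 5.1, 5.2, 5.1]

# Empirical pmf: relative frequency of each observed reading
counts = Counter(measurements)
pmf = {v: c / len(measurements) for v, c in counts.items()}
```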

Pmf derived mathematically
Based on the frequency interpretation, we can define the pmf as
p(x) = (number of measurements equal to x) / (total number of measurements), as the number of measurements grows large.
Now I can predict the future based on this pmf. Probability does not bother with data; statistics is all about data.

Statistics vs. Probability
Are these the correct probabilities? I don't know. Even if we ran the experiment millions of times, we would still be wrong: probably a little wrong, maybe even very wrong. It is always possible to throw 1000 heads in a row even with a fair coin, although it is very unlikely that this will happen. In any case, when studying probability we are not concerned with whether the pmf is correct for this experiment, because we do not care about experiments or data. Statisticians, or the people who designed this experiment, must take care to design it well, so they can give us a good statistical model. All we know is the statistical model (that is, the pmf), and we derive, mathematically, predictions about the future based on this pmf.