1 Probability: Introduction
Definitions, Laws of Probability, Random Variables, Distributions
2 Statistical and Inductive Probability
Statistical: relative frequency of occurrence after many trials.
Inductive: degree of belief in a certain event.
We will be concerned with the statistical view only.
[Figure: proportion of heads vs. number of flips of a coin, converging to 0.5 — the law of large numbers.]
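The converging plot on this slide can be reproduced with a short simulation (a sketch in Python, which the slides themselves do not use; the function name is mine):

```python
import random

def head_proportion(n_flips, seed=0):
    """Flip a fair coin n_flips times and return the proportion of heads."""
    rng = random.Random(seed)
    heads = sum(rng.random() < 0.5 for _ in range(n_flips))
    return heads / n_flips

# By the law of large numbers, the proportion settles near 0.5
# as the number of flips grows.
print(head_proportion(100))
print(head_proportion(100_000))
```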
3 The Sample Space
The space of all possible outcomes of a given process or situation is called the sample space S.
Example: cars crossing a checkpoint, classified by color and size:
S = {red & small, blue & small, red & large, blue & large}
4 An Event
An event is a subset of the sample space.
Example: event A: red cars crossing the checkpoint, irrespective of size:
A = {red & small, red & large}
5 Probability: Introduction
Definitions, Laws of Probability, Random Variables, Distributions
6 The Laws of Probability
The probability of the sample space S is 1: P(S) = 1.
The probability of any event A satisfies 0 <= P(A) <= 1.
Law of Addition: if A and B are mutually exclusive events, then the probability that either one of them will occur is the sum of the individual probabilities:
P(A or B) = P(A) + P(B)
If A and B are not mutually exclusive:
P(A or B) = P(A) + P(B) – P(A and B)
7 Conditional Probabilities
Given that A and B are events in sample space S, and P(B) ≠ 0, the conditional probability of A given B is
P(A|B) = P(A and B) / P(B)
If A and B are independent, then P(A|B) = P(A).
8 The Laws of Probability
Law of Multiplication: what is the probability that both A and B occur together?
P(A and B) = P(A) P(B|A)
where P(B|A) is the probability of B conditioned on A.
If A and B are statistically independent:
P(B|A) = P(B), and then P(A and B) = P(A) P(B)
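The multiplication law and the independence condition can be verified by enumeration on two dice (a sketch in Python; the specific events are mine):

```python
from fractions import Fraction
from itertools import product

# Sample space of two independent fair dice: 36 equally likely pairs.
S = list(product(range(1, 7), repeat=2))

def P(event):
    return Fraction(sum(1 for s in S if event(s)), len(S))

A = lambda s: s[0] == 6       # first die shows 6
B = lambda s: s[1] % 2 == 0   # second die shows an even face

p_ab = P(lambda s: A(s) and B(s))
p_b_given_a = p_ab / P(A)     # conditional probability P(B|A)

assert p_b_given_a == P(B)    # independent dice: P(B|A) = P(B)
assert p_ab == P(A) * P(B)    # multiplication law for independent events
```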
9 Exercises
Find the probability that the sum of the numbers on two unbiased dice will be even by considering the probabilities that the individual dice will show an even number.
10 Exercises
X1 – first throw, X2 – second throw
11 Exercises
X1 – first throw, X2 – second throw
P_final = P(X1=1 & X2=1) + P(X1=1 & X2=3) + P(X1=1 & X2=5) +
P(X1=2 & X2=2) + P(X1=2 & X2=4) + P(X1=2 & X2=6) +
P(X1=3 & X2=1) + P(X1=3 & X2=3) + P(X1=3 & X2=5) + … +
P(X1=6 & X2=2) + P(X1=6 & X2=4) + P(X1=6 & X2=6).
P_final = 18/36 = 1/2
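The 18-out-of-36 count above can be confirmed by enumerating all pairs of throws (a quick Python sketch, not part of the original slides):

```python
from itertools import product

# All 36 equally likely outcomes of two unbiased dice.
outcomes = list(product(range(1, 7), repeat=2))

# An even sum needs both dice even or both dice odd.
even_sum = [o for o in outcomes if (o[0] + o[1]) % 2 == 0]

print(len(even_sum), len(outcomes))  # 18 36
```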
12 Exercises
Find the probabilities of throwing a sum of a) 3, b) 4 with three unbiased dice.
13 Exercises
Find the probabilities of throwing a sum of a) 3, b) 4 with three unbiased dice.
X = sum of X1, X2, and X3
P(X=3)? P(X1=1 & X2=1 & X3=1) = 1/216
P(X=4)? P(X1=1 & X2=1 & X3=2) + P(X1=1 & X2=2 & X3=1) + …
P(X=4) = 3/216
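Both answers can be checked by counting over all 216 triples (a Python sketch; the helper name is mine):

```python
from fractions import Fraction
from itertools import product

# All 216 equally likely outcomes of three unbiased dice.
outcomes = list(product(range(1, 7), repeat=3))

def P_sum(k):
    """Probability that the three dice sum to k, by counting."""
    return Fraction(sum(1 for o in outcomes if sum(o) == k), len(outcomes))

print(P_sum(3))  # 1/216
print(P_sum(4))  # 3/216, which reduces to 1/72
```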
14 Exercises
Three men meet by chance. What are the probabilities that a) none of them, b) two of them, c) all of them have the same birthday?
15 Exercises
a) None of them have the same birthday.
X1 – birthday of the 1st person, X2 – birthday of the 2nd person, X3 – birthday of the 3rd person.
P(X2 differs from X1, and X3 differs from X1 and X2)
P_final = (364/365)(363/365)
16 Exercises
b) Two of them have the same birthday.
P(X1 = X2 and X3 differs) + P(X1 = X3 and X2 differs) + P(X2 = X3 and X1 differs).
P(X1 = X2 and X3 differs) = (1/365)(364/365)
P_final = 3(1/365)(364/365)
17 Exercises
c) All of them have the same birthday.
P(X1 = X2 = X3)
P_final = (1/365)(1/365)
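A useful sanity check on the three birthday answers is that the cases a), b), c) are exhaustive and mutually exclusive, so their probabilities must sum to 1 (a Python sketch using exact fractions, not from the slides):

```python
from fractions import Fraction

# a) all three birthdays different
p_all_diff = Fraction(364, 365) * Fraction(363, 365)
# b) exactly two share a birthday (three ways to pick the pair)
p_two_same = 3 * Fraction(1, 365) * Fraction(364, 365)
# c) all three share a birthday
p_all_same = Fraction(1, 365) ** 2

# The three cases partition the sample space.
assert p_all_diff + p_two_same + p_all_same == 1
print(float(p_all_diff))  # roughly 0.99, so a shared birthday is unlikely
```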
18 Probability: Introduction
Definitions, Laws of Probability, Random Variables, Distributions
19 Random Variable
Definition: a variable that can take on several values, each value having a probability of occurrence.
There are two types of random variables:
Discrete: take on a countable number of values.
Continuous: take on a range of values.
Discrete Variables
For every discrete variable X there will be a probability function P(x) = P(X = x).
The cumulative probability function for X is defined as F(x) = P(X <= x).
20 Random Variable
Continuous Variables: concept of a histogram. For every continuous variable X we will associate a probability density function f(x). The probability is the area lying between two values:
Prob(x1 < X <= x2) = ∫ from x1 to x2 of f(x) dx
The cumulative probability function is defined as
F(x) = Prob(X <= x) = ∫ from -∞ to x of f(t) dt
21 Multivariate Distributions
P(x,y) = P(X = x and Y = y).
P'(x) = Prob(X = x) = Σy P(x,y); it is called the marginal distribution of X.
The same can be done on Y to define the marginal distribution of Y, P''(y).
If X and Y are independent, then P(x,y) = P'(x) P''(y).
22 Expectations: The Mean
Let X be a discrete random variable that takes the values x1, x2, x3, …, xn, and let P(x1), P(x2), P(x3), …, P(xn) be their respective probabilities. Then the expected value of X, E(X), is defined as
E(X) = x1 P(x1) + x2 P(x2) + x3 P(x3) + … + xn P(xn) = Σi xi P(xi)
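The definition above applied to a single fair die gives the familiar mean of 3.5 (a Python sketch using exact fractions; the die example is mine, not from this slide):

```python
from fractions import Fraction

values = range(1, 7)          # faces of a fair die
p = Fraction(1, 6)            # each face equally likely

# E(X) = sum over i of x_i * P(x_i)
mean = sum(x * p for x in values)
print(mean)  # 7/2
```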
23 Exercises
Suppose that X is a random variable taking the values {-1, 0, 1} with equal probabilities and that Y = X^2. Find the joint distribution and the marginal distributions of X and Y, and also the conditional distributions of X given a) Y = 0 and b) Y = 1.
24 Exercises
Joint distribution P(x, y) (rows: Y, columns: X):

          X = -1   X = 0   X = 1   P(Y)
  Y = 0     0       1/3     0      1/3
  Y = 1    1/3       0     1/3     2/3
  P(X)     1/3      1/3    1/3

If Y = 0 then X = 0 with probability 1.
If Y = 1 then X is equally likely to be +1 or -1.
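The marginals and conditionals in this exercise can be computed directly from the joint distribution (a Python sketch; the helper names are mine):

```python
from fractions import Fraction

xs = [-1, 0, 1]
# X takes each value with probability 1/3, and Y = X^2 is determined by X,
# so the joint distribution puts mass 1/3 on each pair (x, x^2).
joint = {(x, x * x): Fraction(1, 3) for x in xs}

def marginal_y(y):
    """P(Y = y), summing the joint over x."""
    return sum(p for (x, yy), p in joint.items() if yy == y)

def cond_x_given_y(x, y):
    """P(X = x | Y = y) = P(X = x, Y = y) / P(Y = y)."""
    return joint.get((x, y), Fraction(0)) / marginal_y(y)

print(marginal_y(0), marginal_y(1))   # 1/3 2/3
print(cond_x_given_y(0, 0))           # 1
print(cond_x_given_y(1, 1))           # 1/2
```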
25 Probability: Introduction
Definitions, Laws of Probability, Random Variables, Distributions
26 Properties of Distributions
Measures of Location
Mean: average of the observations.
Median: middle observation. Example: 9, 11, 12, 13, 13 → median: 12
Mode: the most frequent observation (the value with the highest probability). Example: 1, 2, 3, 3, 4, 5, 6 → mode: 3
27 Mean
The mean is the expected value of X: E[X] = μ = ∫ x f(x) dx
A distribution is uniform on [0, 1] when f(x) = 1 for x between 0 and 1.
What is the expected value of X if it is uniformly distributed?
28 Mean
What is the expected value of X if it is uniformly distributed on [0, 1]?
E[X] = ∫ x dx evaluated from 0 to 1 = (1/2) x^2 evaluated on [0, 1] = 1/2
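The analytic answer of 1/2 can also be checked by a Monte Carlo estimate (a Python sketch, not part of the slides; the sample size and seed are arbitrary choices):

```python
import random

rng = random.Random(42)

# Draw many Uniform(0, 1) samples and average them.
samples = [rng.random() for _ in range(100_000)]
estimate = sum(samples) / len(samples)

print(round(estimate, 3))  # close to the exact value 1/2
```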
29 Properties of Distributions
Measures of Location
[Figure: a skewed distribution with the mode, median, and mean marked, in that order.]
30 Properties of Distributions
Measures of Dispersion
Most popular: variance. The sample variance is
S^2 = Σi (xi – mean)^2 / (n – 1)
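The sample variance can be computed directly from its definition; here it is applied to the observations from the earlier median example (a Python sketch, assuming the n − 1 denominator shown above):

```python
data = [9, 11, 12, 13, 13]   # observations from the median example
n = len(data)
mean = sum(data) / n

# Sample variance with the n - 1 (unbiased) denominator.
s2 = sum((x - mean) ** 2 for x in data) / (n - 1)

print(mean, s2)  # approximately 11.6 and 2.8
```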
31 Properties of Distributions
Skewness: measure of symmetry.
[Figure: distributions skewed to the right, skewed to the left, and symmetric.]
32 Properties of Distributions
Kurtosis: measure of the peakedness of a distribution (the weight of its tails).
[Figure: low-kurtosis vs. high-kurtosis distributions.]
33 Correlation Coefficient
The correlation coefficient ρ is defined as
ρ = Cov(X, Y) / (σX σY)
It is a measure of the (linear) relationship between the variables X and Y.
[Figure: scatter plots illustrating ρ = 1 and ρ = -1.]
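The extreme cases ρ = 1 and ρ = -1 from the figure can be reproduced by computing the coefficient from its definition (a Python sketch; the function and the toy data are mine):

```python
import math

def corr(xs, ys):
    """Pearson correlation: cov(X, Y) / (sigma_X * sigma_Y)."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / n
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs) / n)
    sy = math.sqrt(sum((y - my) ** 2 for y in ys) / n)
    return cov / (sx * sy)

print(corr([1, 2, 3], [2, 4, 6]))   # ≈ 1.0: perfect positive linear relation
print(corr([1, 2, 3], [6, 4, 2]))   # ≈ -1.0: perfect negative linear relation
```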
34 Normal Distribution
A continuous random variable is normally distributed if its probability density function is
f(x) = (1 / (σ √(2π))) exp(−(x − μ)^2 / (2σ^2))
where x goes from −∞ to ∞.
E[X] = μ, V[X] = σ^2
[Figure: the bell curve, centered at μ with variance σ^2.]
35 Central Limit Theorem
The sum of a large number of independent random variables will be approximately normally distributed, almost regardless of their individual distributions.
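The theorem can be illustrated by summing uniform variables, which are far from normal individually (a Python sketch, not from the slides; the choice of 12 summands and the sample size are arbitrary):

```python
import random

rng = random.Random(0)

def sum_of_uniforms(k):
    """Sum of k independent Uniform(0, 1) variables."""
    return sum(rng.random() for _ in range(k))

# Each summand has mean 1/2 and variance 1/12, so the sum of 12 of them
# has mean 6 and variance 1, and by the CLT is approximately normal.
samples = [sum_of_uniforms(12) for _ in range(50_000)]
mean = sum(samples) / len(samples)
var = sum((s - mean) ** 2 for s in samples) / len(samples)

print(round(mean, 2), round(var, 2))  # near 6 and 1
```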