Probability theory 2
Tron Anders Moger, September 13th, 2006
The Binomial distribution
Bernoulli distribution: one experiment with two possible outcomes, with probability of success P.
If
– the experiment is repeated n times,
– the probability P is constant in all experiments, and
– the experiments are independent,
then the number of successes follows a binomial distribution.
The Binomial distribution
If X has a Binomial distribution, its probability function is defined as:
P(X = x) = [n! / (x!(n−x)!)] · P^x · (1−P)^(n−x),  for x = 0, 1, …, n
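As a small added illustration (not from the original slides), the binomial probability function can be evaluated directly in Python using only the standard library:

```python
from math import comb

def binom_pmf(x, n, p):
    """P(X = x) when X follows a Binomial(n, p) distribution."""
    return comb(n, x) * p**x * (1 - p)**(n - x)

# Example: probability of exactly 3 successes in 10 trials with P = 0.2
print(binom_pmf(3, 10, 0.2))  # about 0.201
```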
Example
Since the early 50s, 10000 UFOs have been reported in the U.S. Assume P(real observation) = 1/100000.
This is a binomial experiment with n = 10000 and P = 1/100000, where X counts the number of real observations.
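The slide does not show the resulting probabilities; a quick sketch of the implied calculation (using the binomial formula above) would be:

```python
n, p = 10_000, 1e-5

# Probability that none of the 10000 reported UFOs are real observations
p_none = (1 - p) ** n
print(p_none)        # about 0.905
print(1 - p_none)    # P(X >= 1): at least one real observation, about 0.095
```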
The Hypergeometric distribution
Randomly sample n objects from a group of N, S of which are successes. The number of successes X in the sample then follows a hypergeometric distribution:
P(X = x) = [C(S, x) · C(N−S, n−x)] / C(N, n)
where C(a, b) = a! / (b!(a−b)!) denotes the binomial coefficient.
Example
What is the probability of winning the lottery, that is, getting all 7 numbers on your coupon correct when 7 numbers are drawn from a total of 34?
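Treating the lottery as a hypergeometric experiment with N = 34, S = 7 and n = 7, the winning probability can be computed as follows (an added sketch, not part of the slides):

```python
from math import comb

N, S, n = 34, 7, 7   # 34 numbers in total, 7 winning numbers, 7 picked on the coupon
x = 7                # all 7 picked numbers are correct

p_win = comb(S, x) * comb(N - S, n - x) / comb(N, n)
print(p_win)         # about 1.86e-07, i.e. 1 in 5,379,616
```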
The distribution of rare events: The Poisson distribution
Assume successes happen independently, at a rate λ per time unit. The probability of x successes during a time unit is then given by the Poisson distribution:
P(X = x) = λ^x · e^(−λ) / x!,  for x = 0, 1, 2, …
Example: AIDS cases in 1991 (47 weeks)
Cases per week: 1 1 0 1 2 1 3 0 0 0 0 0 0 2 1 2 2 1 3 0 1 0 0 0 1 1 1 1 1 0 2 1 0 2 0 2 1 6 1 0 0 1 0 2 0 0 0
Mean number of cases per week: λ = 44/47 = 0.936
We can model the data as a Poisson process with rate λ = 0.936.
Example cont'd:

No. of cases   Observed no. of weeks   Expected no. (from Poisson dist.)
0              20                      18.4
1              16                      17.2
2              8                       8.1
3              2                       2.5
4              0                       0.6
5              0                       0.11
6              1                       0.017

Calculation: P(X=2) = 0.936^2 · e^(−0.936) / 2! = 0.17. Multiply by the number of weeks: 0.17 · 47 = 8.1.
The Poisson distribution fits the data fairly well!
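The expected column can be reproduced with a few lines of Python (an added sketch; the numbers match the table above up to rounding):

```python
from math import exp, factorial

lam = 44 / 47        # estimated rate: about 0.936 cases per week
weeks = 47

def poisson_pmf(x, lam):
    """P(X = x) when X follows a Poisson(lam) distribution."""
    return lam**x * exp(-lam) / factorial(x)

for x in range(7):
    print(x, round(weeks * poisson_pmf(x, lam), 2))
# Gives roughly 18.4, 17.2, 8.1, 2.5, 0.6, 0.11, 0.02
```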
The Poisson and the Binomial
Assume X is Bin(n, P), so E(X) = nP.
Probability of 0 successes: P(X=0) = (1−P)^n.
Writing λ = nP, this becomes P(X=0) = (1−λ/n)^n.
If n is large and P is small, this converges to e^(−λ), the probability of 0 successes in a Poisson distribution with rate λ!
One can show that the same holds for the other probabilities. Hence, the Poisson distribution approximates the Binomial when n is large and P is small (n > 5, P < 0.05).
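A short numerical check of this approximation, using the UFO example from earlier (added for illustration):

```python
from math import comb, exp, factorial

n, p = 10_000, 1e-5   # large n, small P
lam = n * p

for x in range(4):
    binom   = comb(n, x) * p**x * (1 - p)**(n - x)
    poisson = lam**x * exp(-lam) / factorial(x)
    print(x, round(binom, 6), round(poisson, 6))
# The binomial and Poisson probabilities agree to several decimal places.
```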
Bivariate distributions
If X and Y are a pair of discrete random variables, their joint probability function expresses the probability that they simultaneously take specific values:
P(x, y) = P(X = x and Y = y)
– marginal probability: P(x) = Σ_y P(x, y)
– conditional probability: P(y | x) = P(x, y) / P(x)
– X and Y are independent if, for all x and y: P(x, y) = P(x) · P(y)
Example
The probabilities for
– A: Rain tomorrow
– B: Wind tomorrow
are given in the following table:

             No wind   Some wind   Strong wind   Storm
No rain      0.10      0.20        0.05          0.01
Light rain   0.05      0.10        0.15          0.04
Heavy rain   0.05      0.10        0.10          0.05
Example cont'd:
Marginal probability of no rain: 0.1+0.2+0.05+0.01 = 0.36. Similarly, the marginal probabilities of light and heavy rain are 0.34 and 0.3. Hence the marginal distribution of rain is a probability distribution!
Conditional probability of no rain given storm: 0.01/(0.01+0.04+0.05) = 0.1. Similarly, the conditional probabilities of light and heavy rain given storm are 0.4 and 0.5. Hence the conditional distribution of rain given storm is a probability distribution!
Are rain and wind independent? The marginal probability of no wind is 0.1+0.05+0.05 = 0.2, so P(no rain)·P(no wind) = 0.36·0.2 = 0.072 ≠ 0.1 = P(no rain, no wind). Hence rain and wind are not independent.
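These calculations can be checked with a small NumPy sketch (added here; the 0.10 entry for heavy rain with strong wind follows from the requirement that all probabilities sum to 1):

```python
import numpy as np

# Joint probabilities P(rain, wind); rows: no/light/heavy rain,
# columns: no wind, some wind, strong wind, storm
joint = np.array([[0.10, 0.20, 0.05, 0.01],
                  [0.05, 0.10, 0.15, 0.04],
                  [0.05, 0.10, 0.10, 0.05]])

p_rain = joint.sum(axis=1)   # marginal distribution of rain: [0.36, 0.34, 0.30]
p_wind = joint.sum(axis=0)   # marginal distribution of wind: [0.20, 0.40, 0.30, 0.10]

p_rain_given_storm = joint[:, 3] / joint[:, 3].sum()   # [0.1, 0.4, 0.5]

# Independence check: product of marginals vs the joint probability
print(p_rain[0] * p_wind[0], joint[0, 0])   # 0.072 vs 0.1 -> not independent
```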
Covariance and correlation
Covariance measures how two variables vary together:
Cov(X, Y) = E[(X − μ_X)(Y − μ_Y)] = E(XY) − μ_X·μ_Y
Correlation is always between -1 and 1:
Corr(X, Y) = Cov(X, Y) / (σ_X·σ_Y)
If X and Y are independent, then Cov(X, Y) = 0.
If Cov(X, Y) = 0, then Corr(X, Y) = 0.
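A minimal NumPy sketch of these quantities on simulated data (the variables and numbers here are purely illustrative, not from the slides):

```python
import numpy as np

rng = np.random.default_rng(1)
x = rng.normal(size=1000)
y = 0.5 * x + rng.normal(scale=0.5, size=1000)   # y depends on x, so they covary

print(np.cov(x, y)[0, 1])        # sample covariance, around 0.5
print(np.corrcoef(x, y)[0, 1])   # sample correlation, around 0.7 (always in [-1, 1])
```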
Continuous random variables
Used when the outcomes can take any value (with decimals) on a continuous scale.
Probabilities are assigned to intervals of numbers; individual numbers generally have probability zero.
Probabilities correspond to areas under a curve: integrals.
Cdf for continuous random variables
As before, the cumulative distribution function F(x) is equal to the probability of all outcomes less than or equal to x. Thus we get
F(x) = P(X ≤ x)
The probability density function f(x) is however now defined so that
P(a < X ≤ b) = ∫_a^b f(x) dx
We get that
F(x) = ∫_(−∞)^x f(t) dt,  so that P(a < X ≤ b) = F(b) − F(a)
Expected values
The expectation of a continuous random variable X is defined as
E(X) = ∫ x·f(x) dx  (integrating over all possible values of x)
The variance, standard deviation, covariance, and correlation are defined exactly as before, in terms of the expectation, and thus have the same properties.
Example: The uniform distribution on the interval [0,1]
f(x) = 1 for 0 ≤ x ≤ 1
F(x) = x for 0 ≤ x ≤ 1
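A tiny numerical sketch (added for illustration) showing that integrating f(x) = 1 recovers F(x) = x and E(X) = 1/2 for this distribution:

```python
import numpy as np

dx = 0.001
x = np.arange(0, 1, dx)
f = np.ones_like(x)                    # f(x) = 1 on [0, 1]

F_at_half = np.sum(f[x <= 0.5]) * dx   # numerical integral of f from 0 to 0.5
E_X = np.sum(x * f) * dx               # numerical version of E(X) = integral of x*f(x) dx

print(F_at_half)   # close to 0.5 = F(0.5)
print(E_X)         # close to 0.5 = E(X)
```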
The normal distribution
The most used continuous probability distribution:
– Many observations tend to approximately follow this distribution
– It is easy and nice to do computations with
– BUT: Using it can result in wrong conclusions when it is not appropriate
Histogram of weight with normal curve displayed
The normal distribution
The probability density function is
f(x) = 1/(σ·√(2π)) · e^(−(x−μ)²/(2σ²))
where μ is the expectation and σ the standard deviation.
Notation: X ~ N(μ, σ²). The standard normal distribution is N(0, 1).
Using the normal density is often OK unless the actual distribution is very skewed.
Also:
– μ ± σ covers ca 68% of the distribution
– μ ± 2σ covers ca 95% of the distribution
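The coverage figures can be verified with SciPy, assuming it is available (an added sketch, not part of the slides):

```python
from scipy.stats import norm

mu, sigma = 0, 1   # any values of mu and sigma give the same coverage

print(norm.cdf(mu + sigma, mu, sigma) - norm.cdf(mu - sigma, mu, sigma))       # about 0.683
print(norm.cdf(mu + 2*sigma, mu, sigma) - norm.cdf(mu - 2*sigma, mu, sigma))   # about 0.954
```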
The normal distribution with small and large standard deviation σ
Simple method for checking if data are well approximated by a normal distribution: Explore
As before, choose Analyze -> Descriptive Statistics -> Explore in SPSS. Move the variable of interest (e.g. weight) to the Dependent List. Under Plots, check Normality plots with tests.
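Outside SPSS, a rough Python analogue of this check could use a Shapiro-Wilk test and Q-Q plot data from SciPy; the weight values below are simulated stand-ins, since the course data set is not included here:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
weight = rng.normal(loc=70, scale=10, size=100)   # simulated stand-in for the weight variable

# Shapiro-Wilk test: a small p-value suggests the data are not normally distributed
stat, p_value = stats.shapiro(weight)
print(stat, p_value)

# Theoretical vs ordered sample quantiles, the ingredients of a Q-Q plot
(osm, osr), (slope, intercept, r) = stats.probplot(weight, dist="norm")
print(r)   # close to 1 if the data are approximately normal
```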
Histogram of lung function for the students
Q-Q plot for lung function
Age – not normal
Q-Q plot of age
Skewed distribution, with e.g. the observations 0.40, 0.96, 11.0
A trick for data that are skewed to the right: log-transformation!
Log-transformed data:
ln(0.40) = -0.92
ln(0.96) = -0.04
ln(11) = 2.40
Do the analysis on the log-transformed data.
SPSS: Transform -> Compute.
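The same transformation in Python (a one-line sketch added for illustration):

```python
import numpy as np

values = np.array([0.40, 0.96, 11.0])
print(np.round(np.log(values), 2))   # [-0.92 -0.04  2.4 ]
```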
OK, the data follow a normal distribution, so what?
From the first lecture, pairs of terms:
– Sample – population
– Histogram – distribution
– Mean – expected value
In statistics we would like the results from analyzing a small sample to apply to the population. We therefore have to collect a sample that is representative w.r.t. age, gender, home place etc.
New way of reading tables and histograms:
– Histograms show that the data can be described by a normal distribution.
– We want to conclude that the data in the population are normally distributed.
– The mean calculated from the sample is an estimate of the expected value μ of the population normal distribution.
– The standard deviation in the sample is an estimate of σ in the population normal distribution.
– Mean ± 2·(standard deviation), as estimated from the sample, (hopefully) covers 95% of the population normal distribution.
In addition:
– Most standard methods for analyzing continuous data assume a normal distribution.
– When n is large and P is not too close to 0 or 1, the Binomial distribution can be approximated by the normal distribution.
– A similar phenomenon holds for the Poisson distribution.
– This happens for all distributions that can be seen as a sum of independent observations.
This means that the normal distribution appears whenever you want to do statistics.
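A small simulation sketch of this idea (added for illustration; the parameters are arbitrary): each Binomial(n, P) value is a sum of n independent Bernoulli trials, and its distribution looks normal when n is large.

```python
import numpy as np

rng = np.random.default_rng(0)
n, p = 1000, 0.3
samples = rng.binomial(n, p, size=100_000)   # each value is a sum of 1000 Bernoulli trials

print(samples.mean(), n * p)                      # both close to 300
print(samples.std(), np.sqrt(n * p * (1 - p)))    # both close to sqrt(210), about 14.5
# A histogram of `samples` lies close to a normal curve with this mean and sd.
```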
The Exponential distribution
The exponential distribution is a distribution for positive numbers (parameter λ):
f(x) = λ·e^(−λx)  for x ≥ 0
It can be used to model the time until an event, when events arrive randomly at a constant rate.
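A minimal sketch (added for illustration, with an arbitrary rate λ = 2 per hour): for an exponential waiting time T, the probability of waiting longer than t is P(T > t) = e^(−λt).

```python
from math import exp

lam = 2.0   # e.g. 2 events per hour (illustrative value)
t = 0.5     # half an hour

print(exp(-lam * t))   # P(T > t), about 0.368
```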
Next time: Sampling and estimation
We will talk much more in depth about the topics mentioned in the last few slides today.