ASV Chapters 1 - Sample Spaces and Probabilities

ASV Chapters
1 - Sample Spaces and Probabilities
2 - Conditional Probability and Independence
3 - Random Variables
4 - Approximations of the Binomial Distribution
5 - Transforms and Transformations
6 - Joint Distribution of Random Variables
7 - Sums and Symmetry
8 - Expectation and Variance in the Multivariate Setting
9 - Tail Bounds and Limit Theorems
10 - Conditional Distribution
11 - Appendix A, B, C, D, E, F

Poisson Distribution (discrete): for x = 0, 1, 2, …, this calculates P(x events) in a random sample of n trials coming from a population in which P(Event) is small. But it may also be used to calculate P(x events) occurring in a time interval of length T, for a "Poisson process" having a known "Poisson rate" α; the expected number of events in the interval is then αT, and

P(X = x) = e^(−αT) (αT)^x / x!,  x = 0, 1, 2, ….

Recall the running example: X = # "clicks" on a Geiger counter in normal background radiation.

Now consider instead the waiting time between successive events, e.g., X = time between "clicks" on the Geiger counter. This is the subject of "time-to-event analysis," also called "time-to-failure analysis," "reliability analysis," or "survival analysis," where the events may be failures, deaths, births, etc. The time between events is often modeled by the Exponential Distribution (continuous).
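A minimal numerical sketch of the counting model using scipy; the rate α = 10 clicks per second and the window T = 1 second are illustrative values chosen to echo the Geiger-counter example later in these notes:

```python
from scipy.stats import poisson

alpha = 10.0    # illustrative Poisson rate: mean clicks per second
T = 1.0         # illustrative length of the observation window (seconds)
mu = alpha * T  # expected number of clicks in the window

# P(X = x) for a few counts, and one tail probability
for x in range(5):
    print(f"P(X = {x}) = {poisson.pmf(x, mu):.4f}")
print(f"P(X >= 15) = {poisson.sf(14, mu):.4f}")
```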

Time between events is often modeled by the Exponential Distribution (continuous): X = time between events, X ~ Exp(λ), with parameter λ > 0 and pdf

f(x) = λ e^(−λx) for x ≥ 0  (and f(x) = 0 for x < 0).

Check that this is a legitimate pdf: f(x) ≥ 0 everywhere, and ∫₀^∞ λ e^(−λx) dx = [−e^(−λx)]₀^∞ = 1.
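A quick symbolic check of the normalization, as a sketch using sympy (λ is kept as an arbitrary positive symbol):

```python
import sympy as sp

x, lam = sp.symbols("x lam", positive=True)
pdf = lam * sp.exp(-lam * x)   # Exponential(lam) density on [0, oo)

total = sp.integrate(pdf, (x, 0, sp.oo))
print(total)                   # 1 -> the density integrates to 1 for any lam > 0
```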

Calculate the expected time between events:

E[X] = ∫₀^∞ x · λ e^(−λx) dx = 1/λ  (integration by parts).

Similarly for the variance:

E[X²] = ∫₀^∞ x² · λ e^(−λx) dx = 2/λ², so Var(X) = E[X²] − (E[X])² = 2/λ² − 1/λ² = 1/λ².

Thus for X ~ Exp(λ), the mean and the standard deviation are both equal to 1/λ.
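The same two moments can be verified symbolically; a sketch with sympy:

```python
import sympy as sp

x, lam = sp.symbols("x lam", positive=True)
pdf = lam * sp.exp(-lam * x)

mean = sp.integrate(x * pdf, (x, 0, sp.oo))              # 1/lam
second_moment = sp.integrate(x**2 * pdf, (x, 0, sp.oo))  # 2/lam**2
variance = sp.simplify(second_moment - mean**2)          # 1/lam**2
print(mean, variance)
```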

Determine the cdf: for x ≥ 0,

F(x) = P(X ≤ x) = ∫₀^x λ e^(−λt) dt = 1 − e^(−λx),

and F(x) = 0 for x < 0. Note: the complementary probability P(X > t) = 1 − F(t) = e^(−λt) is called the "reliability function" R(t) in engineering and the "survival function" S(t) in biostatistics.

Example: Suppose the mean time between events is known to be 1/λ = 2 years, i.e., λ = 0.5 per year. Then for x ≥ 0, F(x) = 1 − e^(−x/2), from which any probability of the form P(X ≤ x) or P(X > x) can be calculated. Also calculate the "Poisson rate" α (see the connection below): α = λ = 0.5 events per year.
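A numerical sketch of this example with scipy; the particular probability shown, P(X > 3 years), is an illustrative choice rather than one stated in the notes:

```python
from scipy.stats import expon

mean = 2.0             # given: mean time between events, in years
lam = 1.0 / mean       # Poisson rate, 0.5 events per year

X = expon(scale=mean)  # scipy parameterizes the Exponential by its mean (scale = 1/lam)

print(X.cdf(3.0))         # P(X <= 3) = 1 - e^{-1.5} ≈ 0.7769
print(X.sf(3.0))          # P(X >  3) = e^{-1.5}     ≈ 0.2231
print(X.mean(), X.std())  # both 2.0, as derived above
```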

Poisson Distribution (discrete): for x = 0, 1, 2, …, this gives P(x events) during a time interval (0, T) for a "Poisson process" with "Poisson rate" α. The mean number of events during (0, T) is αT, so the mean number of events in one unit of time is α. Meanwhile, X = time between events is modeled by the Exponential Distribution (continuous). Connection? The mean time between events was just shown to be 1/λ, and the two descriptions agree precisely when λ = α: the Poisson rate is the reciprocal of the mean waiting time.

Ex: Suppose the mean number of (instantaneous) clicks per second is α = 10; then the mean time between any two successive clicks is 1/α = 1/10 sec.
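A small simulation sketch of this connection, assuming an illustrative rate of α = 10 events per second: generate Exponential(α) waiting times, accumulate them into arrival times, and count how many events land in each one-second window.

```python
import numpy as np

rng = np.random.default_rng(0)
alpha = 10.0        # illustrative rate: mean events per second
n_events = 200_000

# Exponential(alpha) inter-arrival times -> arrival times of a Poisson process
waits = rng.exponential(scale=1.0 / alpha, size=n_events)
arrivals = np.cumsum(waits)

# Count the events falling in each whole one-second window
T = int(arrivals[-1])
counts, _ = np.histogram(arrivals, bins=np.arange(0, T + 1))

print(waits.mean())   # ≈ 1/alpha = 0.1 s, mean time between events
print(counts.mean())  # ≈ alpha = 10, mean number of events per second
```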


Another property (here the event is "failure," etc.): What is the probability of "no failure" up to time t + Δt, given "no failure" up to time t?

P(X > t + Δt | X > t) = P(X > t + Δt) / P(X > t) = e^(−λ(t + Δt)) / e^(−λt) = e^(−λΔt),

which is independent of the time t already survived and depends only on the additional duration Δt. This is the "memoryless" property of the Exponential distribution: the conditional probability of "no failure" from ANY time t to a future time t + Δt of fixed duration Δt remains constant. It models many systems in the "prime of their lives," e.g., a random 30-yr-old individual in the USA.
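A brief numerical check of the memoryless property, as a sketch (the values λ = 0.5, t = 3, and Δt = 2 are arbitrary illustrative choices):

```python
from scipy.stats import expon

lam = 0.5
X = expon(scale=1.0 / lam)

t, dt = 3.0, 2.0
conditional = X.sf(t + dt) / X.sf(t)  # P(X > t + dt | X > t)
print(conditional, X.sf(dt))          # both equal e^{-lam*dt} ≈ 0.3679
```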

More general models exist, e.g., the Gamma Distribution. In order to understand it, it is first necessary to understand the Gamma Function.

Def: For any α > 0,  Γ(α) = ∫₀^∞ t^(α−1) e^(−t) dt.

Discovered (in a different form) by the Swiss mathematician Leonhard Euler (1707–1783). It belongs to the "special functions of mathematical physics," a family that also includes the Beta and Bessel functions, the classical orthogonal polynomials (Jacobi, Chebyshev, Legendre, Hermite, …), etc. The Gamma function generalizes "factorials" to all complex values of α (except 0, −1, −2, −3, …), and the Exponential distribution will turn out to be a special case of the Gamma distribution!

Basic properties:
Γ(1) = 1  (direct integration).
Γ(α + 1) = α Γ(α) for all α > 0  (proof: integration by parts).
Consequently, letting α = n = 1, 2, 3, …:  Γ(n) = (n − 1)!.
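A quick numerical check of these properties, as a sketch with scipy:

```python
from math import factorial
from scipy.special import gamma

print(gamma(1.0))                      # 1.0
for a in (0.5, 2.3, 7.0):
    print(gamma(a + 1), a * gamma(a))  # recurrence: Gamma(a+1) = a * Gamma(a)
for n in range(1, 6):
    print(gamma(n), factorial(n - 1))  # Gamma(n) = (n-1)!
```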

The Gamma Function

General Gamma Distribution: using the Gamma function, define the pdf

f(x) = 1 / (Γ(α) θ^α) · x^(α−1) e^(−x/θ) for x > 0  (and 0 otherwise),

where α > 0 is the "shape parameter" and θ > 0 is the "scale parameter." Note that if α = 1, the pdf reduces to f(x) = (1/θ) e^(−x/θ), i.e., the Exponential Distribution with λ = 1/θ. The special case θ = 1 is called the Standard Gamma Distribution.
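A sketch confirming the α = 1 special case numerically (the scale θ = 2 is an arbitrary illustrative value):

```python
import numpy as np
from scipy.stats import expon, gamma

theta = 2.0
x = np.linspace(0.01, 10, 50)

g = gamma.pdf(x, a=1.0, scale=theta)  # Gamma with shape alpha = 1, scale theta
e = expon.pdf(x, scale=theta)         # Exponential with mean theta = 1/lam

print(np.allclose(g, e))              # True: the two densities coincide
```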

Standard Gamma Distribution: without loss of generality (WLOG), the scale parameter may be taken to be θ = 1. Substituting t = x/θ in the General Gamma pdf gives the Standard Gamma pdf

f(t) = 1/Γ(α) · t^(α−1) e^(−t) for t > 0,

which involves only the "shape parameter" α; probabilities under the General Gamma distribution are recovered by rescaling, P(X ≤ x) = P(T ≤ x/θ).

Standard Gamma Distribution cdf: F(x) = 1/Γ(α) ∫₀^x t^(α−1) e^(−t) dt, known as the "incomplete Gamma function." It has no general closed-form expression, but it is still continuous and monotonically increasing from 0 to 1.
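A sketch evaluating this cdf numerically; scipy's gammainc is the regularized incomplete Gamma function, so it agrees with the Standard Gamma cdf directly (the shape and evaluation point below are illustrative):

```python
from scipy.special import gammainc
from scipy.stats import gamma

alpha, x = 4.0, 2.5           # illustrative shape and evaluation point
print(gammainc(alpha, x))     # regularized incomplete Gamma
print(gamma.cdf(x, a=alpha))  # same value from the Standard Gamma cdf
```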

Return to the General Gamma Distribution, with Gamma function Γ, "shape parameter" α, and "scale parameter" θ. Note again that if α = 1, this is the Exponential Distribution, with "Poisson rate" λ = 1/θ.

Theorem: Suppose the r.v.'s X₁, X₂, …, Xₙ are "independent, identically distributed" (i.i.d.) Exponential(λ). Then their sum X₁ + ⋯ + Xₙ has a Gamma distribution with shape α = n and scale θ = 1/λ — e.g., the total failure time of n machine components used one after another.
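A simulation sketch of the theorem; n = 5 components and λ = 0.5 are illustrative choices, and the empirical distribution of the sum should match Gamma(shape = n, scale = 1/λ):

```python
import numpy as np
from scipy.stats import gamma, kstest

rng = np.random.default_rng(1)
n, lam = 5, 0.5

# Sum of n i.i.d. Exponential(lam) lifetimes, repeated many times
sums = rng.exponential(scale=1.0 / lam, size=(100_000, n)).sum(axis=1)

print(sums.mean(), n / lam)  # both ≈ 10: mean of Gamma(n, 1/lam)
ks = kstest(sums, gamma(a=n, scale=1.0 / lam).cdf)
print(ks.pvalue)             # not small: consistent with the Gamma distribution
```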

Example: Suppose X = time between failures is known to be modeled by a Gamma distribution, with mean = 8 years and standard deviation = 4 years. Calculate the probability of failure before 5 years.

For a Gamma(α, θ) distribution the mean is αθ and the variance is αθ², so αθ = 8 and αθ² = 16, giving θ = 2 and α = 4. Therefore

P(X < 5) = 1/Γ(4) ∫₀^(5/2) t³ e^(−t) dt ≈ 0.242.
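A sketch verifying this worked example with scipy:

```python
from scipy.stats import gamma

# mean = alpha*theta = 8 and variance = alpha*theta**2 = 16  =>  theta = 2, alpha = 4
X = gamma(a=4, scale=2)

print(X.mean(), X.std())  # 8.0, 4.0
print(X.cdf(5))           # ≈ 0.2424: probability of failure before 5 years
```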

Chi-squared Distribution with ν = n − 1 degrees of freedom (df = 1, 2, 3, …): a special case of the Gamma distribution, with shape α = ν/2 and scale θ = 2. (Figure: the pdf for ν = 1, 2, …, 7.) The "chi-squared test" is used in the statistical analysis of categorical data.
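A sketch of the Gamma connection (df = 5 is an illustrative choice):

```python
import numpy as np
from scipy.stats import chi2, gamma

nu = 5
x = np.linspace(0.1, 20, 40)

same = np.allclose(chi2.pdf(x, df=nu), gamma.pdf(x, a=nu / 2, scale=2))
print(same)  # True: chi-squared(nu) is Gamma(shape = nu/2, scale = 2)
```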

F-distribution with degrees of freedom ν₁ and ν₂. The "F-test" is used when comparing the means of two or more groups (ANOVA).

T-distribution with (n – 1) degrees of freedom, df = 1, 2, 3, …. (Figure: the pdf for df = 1, 2, 5, and 10.) The "T-test" is used when analyzing the means of one or two groups.

The T-distribution with 1 degree of freedom (df = 1) is known as the "Cauchy distribution."

For the Cauchy distribution (the T-distribution with 1 degree of freedom), the pdf is f(x) = 1 / [π(1 + x²)] for −∞ < x < ∞, and the mean does not exist: the defining integral

E[X] = ∫_{−∞}^{∞} x / [π(1 + x²)] dx

is an improper integral at both endpoints, and its two halves diverge to +∞ and −∞ separately, leaving the "indeterminate form" ∞ − ∞. Therefore E[X] does not exist!
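A simulation sketch of the consequence: sample means of Cauchy draws do not settle down as the sample size grows, in contrast to distributions whose mean exists.

```python
import numpy as np

rng = np.random.default_rng(2)

for n in (100, 10_000, 1_000_000):
    sample = rng.standard_cauchy(n)  # T-distribution with 1 df
    print(n, sample.mean())          # wildly unstable: no law of large numbers here
```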

Classical Continuous Probability Distributions
Normal distribution
Log-Normal ~ X is not normally distributed (e.g., skewed), but Y = logarithm of X is normally distributed
Student's t-distribution ~ similar to the normal distribution, but more flexible
F-distribution ~ used when comparing multiple group means
Chi-squared distribution ~ used extensively in categorical data analysis
Others for specialized applications ~ Gamma, Beta, Weibull, …