CIS 2033 based on Dekking et al


CIS 2033 based on Dekking et al., A Modern Introduction to Probability and Statistics, 2007. Instructor: Longin Jan Latecki. Chapter 5: Continuous Random Variables

Probability Density Function of X
A random variable (RV) X is continuous if there is a function ƒ: R → R such that for any numbers a and b with a ≤ b,
P(a ≤ X ≤ b) = ∫_a^b ƒ(x) dx.
The function ƒ has to satisfy ƒ(x) ≥ 0 for all x and ∫_{-∞}^{∞} ƒ(x) dx = 1.
For both continuous and discrete RVs the probability measure maps P: Ω → [0,1]. The pmf of a discrete RV maps p_X: R → [0,1] with P(X = a) = p_X(a), while for a continuous RV P(X = a) = P(a ≤ X ≤ a) = 0 (but see the next slide).
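
A quick numerical check of these two requirements, sketched in plain Matlab (no toolbox needed). The Exp(2) density used here is only an illustrative choice, anticipating the exponential slides below; any valid pdf would do.

% A pdf must be nonnegative and integrate to 1, and probabilities are areas under it.
% Quick check with the Exp(2) density f(x) = 2*exp(-2*x) on [0, Inf).
f = @(x) 2*exp(-2*x);
total = integral(f, 0, Inf);      % should be (numerically) 1
p_ab  = integral(f, 0.5, 1.5);    % P(0.5 <= X <= 1.5) as an area under f
fprintf('integral of f = %.4f,  P(0.5 <= X <= 1.5) = %.4f\n', total, p_ab);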

P(a − ε ≤ X ≤ a + ε) = ∫_{a−ε}^{a+ε} ƒ(x) dx ≈ 2ε·ƒ(a)
Although P(X = a) = 0 for a continuous RV, the density value ƒ(a) still tells us how likely X is to fall near a. For a small ε > 0, the probability that X lands in the interval [a − ε, a + ε] is the area under ƒ over that interval, which is approximately the area of a box of width 2ε and height ƒ(a):
P(a − ε ≤ X ≤ a + ε) = ∫_{a−ε}^{a+ε} ƒ(x) dx ≈ 2ε·ƒ(a).
The smaller ε is, the better this box approximation becomes; as ε → 0 the interval shrinks to a single point and the probability tends to 0.
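
The quality of the box approximation is easy to see numerically. The sketch below (plain Matlab; the Exp(2) density and the point a = 1 are illustrative choices, not from the slide) compares 2ε·ƒ(a) with the exact integral as ε shrinks.

% Compare the box approximation 2*ep*f(a) with the exact probability for shrinking ep.
f = @(x) 2*exp(-2*x);
a = 1;
for ep = [0.5 0.1 0.01]
    exact  = integral(f, a - ep, a + ep);   % P(a - ep <= X <= a + ep)
    approx = 2*ep*f(a);                     % box of width 2*ep and height f(a)
    fprintf('eps = %4.2f  exact = %.5f  approx = %.5f\n', ep, exact, approx);
end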

How the distribution function relates to the density function. Important facts:
A discrete RV has a probability mass function but no density.
A continuous RV has a density but no probability mass.
Both have a cumulative distribution function F(a) = P(X ≤ a), and
P(a < X ≤ b) = P(X ≤ b) − P(X ≤ a) = F(b) − F(a).

Uniform Distribution U(α,β)
A continuous RV has a uniform distribution on the interval [α, β] if its pdf ƒ is given by
ƒ(x) = 1/(β − α) for α ≤ x ≤ β, and ƒ(x) = 0 for x outside [α, β].
This simply means that the density is the same for every x in [α, β] and zero elsewhere: no part of the interval is favored over any other.
The cumulative distribution function is given by
F(x) = 0 for x < α, F(x) = (x − α)/(β − α) for α ≤ x ≤ β, and F(x) = 1 for x > β.
If U is standard uniform, i.e. U ~ U(0,1), then F(x) = P(U ≤ x) = x for every x in [0, 1].
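
As a sanity check, the plain-Matlab sketch below compares the closed-form CDF with an empirical estimate from random draws, using the U(2,7) example that reappears in the median slide further down; the sample size and evaluation point are arbitrary illustrative choices.

% U(2,7): compare the closed-form cdf with an empirical estimate from random draws.
a = 2;  b = 7;
F = @(x) min(max((x - a)/(b - a), 0), 1);   % cdf of U(a,b)
u = a + (b - a)*rand(1e5, 1);               % 100000 draws from U(2,7)
x = 4;
fprintf('F(4) = %.4f,  empirical P(U <= 4) = %.4f\n', F(x), mean(u <= x));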

Exponential Distribution Exp(λ)
Intuitively: a continuous version of the geometric distribution. A continuous random variable X has an exponential distribution with parameter λ > 0 if its probability density function ƒ is given by
ƒ(x) = λ e^(−λx) for x ≥ 0.
The distribution function F of an Exp(λ) distribution is given by
F(a) = 1 − e^(−λa) for a ≥ 0.
Memoryless property:
P(X > s + t | X > s) = P(X > s + t)/P(X > s) = e^(−λ(s+t))/e^(−λs) = e^(−λt) = P(X > t).
This simply means that after we have already waited s time units, the remaining waiting time behaves as if we were starting from scratch: s becomes the new origin, and the additional waiting time t again has an Exp(λ) distribution.
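
The memoryless property can also be checked by simulation. The plain-Matlab sketch below uses the inverse-transform fact that −log(U)/λ is Exp(λ) for U ~ U(0,1); the values λ = 2, s = 0.4 and t = 0.3 are illustrative choices.

% Numerical check of the memoryless property for X ~ Exp(2).
lambda = 2;  s = 0.4;  t = 0.3;
x = -log(rand(1e6, 1)) / lambda;            % one million Exp(lambda) draws
lhs = sum(x > s + t) / sum(x > s);          % estimates P(X > s+t | X > s)
rhs = mean(x > t);                          % estimates P(X > t)
fprintf('P(X>s+t | X>s) ~ %.4f,  P(X>t) ~ %.4f,  exact exp(-lambda*t) = %.4f\n', ...
        lhs, rhs, exp(-lambda*t));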

Exponential Distribution Exp(λ)
An RV X has an exponential distribution if its pdf is ƒ(x) = λ e^(−λx) for x ≥ 0 and its CDF is F(a) = 1 − e^(−λa) for a ≥ 0.
The parameter λ is the inverse of the expected value of X, i.e., E(X) = 1/λ, or equivalently λ = 1/E(X). If X is a time measured in minutes, then λ is a rate measured in min⁻¹. For example, if arrivals occur every half a minute on average, then E(X) = 0.5 and λ = 2, meaning that arrivals occur at a rate of 2 per minute. This λ has the same meaning as the parameter of the Poisson distribution.
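
A small simulation confirms E(X) = 1/λ for the arrival-rate example above. This sketch assumes the Statistics Toolbox is available for exprnd, which is parameterized by the mean 1/λ rather than by λ; without the toolbox, −log(rand)/λ from the previous sketch gives the same distribution.

% Check E(X) = 1/lambda by simulation.
lambda = 2;                               % arrival rate of 2 per minute
x = exprnd(1/lambda, 1e6, 1);             % waiting times in minutes (exprnd takes the mean)
fprintf('sample mean = %.4f,  1/lambda = %.4f\n', mean(x), 1/lambda);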

How long do we need to wait before a customer enters our shop?
ƒ(x) = λ e^(−λx) for x ≥ 0,  F(a) = 1 − e^(−λa) for a ≥ 0
How much time will elapse before an earthquake occurs in a given region? How long will it take before a call center receives the next phone call? How long will a piece of machinery work without breaking down?
Questions such as these are often answered in probabilistic terms using the exponential distribution. All these questions concern the time we need to wait before a certain event occurs. If this waiting time is unknown, it is often appropriate to think of it as a random variable having an exponential distribution.

Roughly speaking, the time we need to wait before an event occurs has an exponential distribution if the probability that the event occurs during a certain time interval is proportional to the length of that time interval. More precisely, X has an exponential distribution if the conditional probability P(t ≤ X ≤ t + Δt | X > t) is approximately proportional to the length Δt of the time interval [t, t + Δt] for any time instant t. In most practical situations this property is very realistic, and this is the reason why the exponential distribution is so widely used to model waiting times. From: http://www.statlect.com/ucdexp1.htm

Pareto Distribution Par(α)
Used to model real-life quantities such as the number of people whose income exceeds a level x, city sizes, earthquake rupture areas, insurance claims, and sizes of commercial companies.
A continuous random variable has a Pareto distribution with parameter α > 0 if its probability density function ƒ is given by
ƒ(x) = 0 for x < 1 and ƒ(x) = α/x^(α+1) for x ≥ 1.
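
A plain-Matlab sketch for Par(α) with the illustrative choice α = 3. The CDF F(x) = 1 − x^(−α) used below is obtained by integrating the density and is included here only as a cross-check, not as something stated on the slide.

% Par(3): the density alpha/x^(alpha+1) integrates to 1 on [1, Inf), and integrating it
% up to x reproduces the cdf F(x) = 1 - x^(-alpha) for x >= 1.
alpha = 3;
f = @(x) alpha ./ x.^(alpha + 1);
F = @(x) 1 - x.^(-alpha);
fprintf('total mass = %.4f\n', integral(f, 1, Inf));
fprintf('F(2) = %.4f,  integral of f on [1,2] = %.4f\n', F(2), integral(f, 1, 2));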

Normal Distribution N(μ, σ²)
A continuous random variable has a normal (Gaussian) distribution with parameters μ and σ² > 0 if its probability density function ƒ is given by
ƒ(x) = (1/(σ√(2π))) e^(−(x − μ)²/(2σ²)) for −∞ < x < ∞,
where μ is the mean and σ² is the variance (σ is the standard deviation).
Its distribution function is
F(a) = ∫_{-∞}^{a} ƒ(x) dx for −∞ < a < ∞.
However, ƒ has no antiderivative in closed form, so there is no explicit expression for F. One therefore works with the standard normal distribution N(0,1), whose density is denoted φ and whose distribution function, obtained by the same integral, is denoted Φ; values of Φ are tabulated or computed numerically.
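
Even without a closed-form antiderivative, Φ is easy to evaluate numerically. The sketch below uses the standard identity Φ(x) = (1 + erf(x/√2))/2 in plain Matlab; normcdf from the Statistics Toolbox would give the same values. The evaluation points are illustrative (1.2816 anticipates the percentile example at the end).

% Evaluate the standard normal cdf via the error function: Phi(x) = (1 + erf(x/sqrt(2)))/2.
Phi = @(x) 0.5*(1 + erf(x/sqrt(2)));
fprintf('Phi(0) = %.4f,  Phi(1.2816) = %.4f\n', Phi(0), Phi(1.2816));
% For a general N(mu, sigma^2), standardize first: P(X <= a) = Phi((a - mu)/sigma).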

Normal Distribution

Quantiles
Quantiles split the distribution into portions that increase from left to right: the 0th percentile is at the far left and the 100th percentile is at the far right.
Let X be a continuous random variable and let p be a number between 0 and 1. The p-th quantile, or 100p-th percentile, of the distribution of X is the smallest number q_p such that
F(q_p) = P(X ≤ q_p) = p.
The median is the 50th percentile, q_0.5.

F(q_p) = P(X ≤ q_p) = p
For continuous random variables q_p is often easy to determine. If F is strictly increasing from 0 to 1 on some interval (which may be infinite to one or both sides), then q_p = F_inv(p), where F_inv is the inverse of F.
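
A worked instance of q_p = F_inv(p), sketched in plain Matlab for the Exp(λ) distribution from the earlier slides: inverting F(x) = 1 − e^(−λx) gives q_p = −log(1 − p)/λ. The values λ = 2 and p = 0.5 are illustrative.

% Quantile of Exp(lambda) by inverting its cdf.
lambda = 2;  p = 0.5;
F    = @(x) 1 - exp(-lambda*x);
Finv = @(p) -log(1 - p)/lambda;
q = Finv(p);
fprintf('median of Exp(2): q = %.4f,  check: F(q) = %.4f\n', q, F(q));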

What is the median of the U(2, 7) distribution?
The median is the number q_0.5 = F_inv(0.5). You either see directly that half of the mass lies on each side of the middle of the interval, hence q_0.5 = (2 + 7)/2 = 4.5, or you solve with the distribution function:
F(q_0.5) = P(X ≤ q_0.5) = 0.5 with F(x) = (x − α)/(β − α), so (q_0.5 − 2)/(7 − 2) = 0.5, which gives q_0.5 = 4.5.
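
The same answer can be found numerically by root-finding, which is how one would proceed when F has no convenient inverse; a plain-Matlab sketch using fzero:

% Solve F(q) = 0.5 numerically for U(2,7); the root should be 4.5.
a = 2;  b = 7;
F = @(x) (x - a)/(b - a);
q = fzero(@(x) F(x) - 0.5, [a, b]);   % F - 0.5 changes sign on [2, 7]
fprintf('median of U(2,7) = %.4f\n', q);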

What is the 90th percentile of the standard normal N(0,1) distribution? We need the number a with F(a) = P(X ≤ a) = 0.9, i.e. a = Φ_inv(0.9). In Matlab: a = norminv(0.9, 0, 1) gives a = 1.2816.