1
Engineering Probability and Statistics - SE-205 - Chapter 3
By S. O. Duffuaa
2
Lecture Objectives
Present the following:
Concept of random variables
Probability distributions
Probability mass function
3
Random Experiments and Random Variables
Tossing a coin gives the sample space S = {H, T}. Define a mapping X: {H, T} → R with X(H) = 1 and X(T) = 0, and let the probabilities of 1 and 0 be the same as those of H and T. Then X is called a random variable.
4
Random Experiments and Random Variables
In the experiment that counts the number of defective parts among three parts, the sample space is S = {0, 1, 2, 3}. Find P(0), P(1), P(2) and P(3). Assuming each part is equally likely to be defective or not and the parts are independent: P(0) = 1/8, P(1) = 3/8, P(2) = 3/8 and P(3) = 1/8.
5
Probability Mass Function
x      0    1    2    3
f(x)  1/8  3/8  3/8  1/8
Properties of f(x): f(x) ≥ 0 for every x, and Σ f(x) = 1 (summed over all x in the range).
Give many examples in class.
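A minimal Python sketch (not part of the original slides) that derives this mass function by enumerating the 2³ equally likely outcomes for three parts and checks the two properties above; the variable names are only illustrative.

```python
from fractions import Fraction
from itertools import product

# Enumerate all 2^3 equally likely outcomes for three parts:
# each part is either defective (1) or good (0).
outcomes = list(product([0, 1], repeat=3))

# f(x) = (number of outcomes with x defectives) / 8
f = {x: Fraction(sum(1 for o in outcomes if sum(o) == x), len(outcomes))
     for x in range(4)}

print(f)                                  # {0: 1/8, 1: 3/8, 2: 3/8, 3: 1/8}

# Properties of a probability mass function:
assert all(p >= 0 for p in f.values())    # f(x) >= 0 for every x
assert sum(f.values()) == 1               # the masses sum to 1
```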
6
Probability Mass Function
Build the probability mass functions for the following random variables:
Number of traffic accidents per month on campus
Class grade distribution
Number of "F" grades in the SE 205 class per semester
Number of students that register for SE 205 every semester
7
Cumulative Distribution Function
It is a function that provides the cumulative probability up to a point for a random variable (r.v.). For a discrete r.v. it is defined as: F(x) = P(X ≤ x) = Σ f(t), where the sum is over all t ≤ x.
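A short sketch, reusing the defective-parts mass function above, showing how F(x) accumulates f(t) over all t ≤ x; the helper name F is hypothetical.

```python
from fractions import Fraction

# Mass function for the number of defectives in three parts.
f = {0: Fraction(1, 8), 1: Fraction(3, 8), 2: Fraction(3, 8), 3: Fraction(1, 8)}

def F(x, f):
    """Cumulative distribution function: F(x) = sum of f(t) for all t <= x."""
    return sum(p for t, p in f.items() if t <= x)

print([F(x, f) for x in range(4)])   # [1/8, 1/2, 7/8, 1]
```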
8
Cumulative Distribution Function (CDF)
Example of a cumulative distribution function:
F(x) = 0 for x < -2
F(x) = 0.2 for -2 ≤ x < 0
F(x) = 0.7 for 0 ≤ x < 2
F(x) = 1 for 2 ≤ x
What is the mass function for the above F(x)? Note you need to subtract.
9
Probability Mass Function Corresponding to Previous CDF
x     -2    0    2
f(x)  0.2  0.5  0.3
The above mass function is the one corresponding to the previous CDF.
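A small sketch of the "subtract" step: each mass f(x) is the jump of F at x, that is, F(x) minus the CDF value just below x. The dictionaries simply restate the values from the two previous slides.

```python
# CDF values at the jump points of the previous example.
xs = [-2, 0, 2]
F_at = {-2: 0.2, 0: 0.7, 2: 1.0}

# f(x) is the size of the jump of F at x: F(x) minus the value just before x.
f = {}
prev = 0.0
for x in xs:
    f[x] = round(F_at[x] - prev, 10)
    prev = F_at[x]

print(f)   # {-2: 0.2, 0: 0.5, 2: 0.3}
```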
10
Mean / Expected Value of a Discrete Random Variable (r.v.)
The mean of a discrete r.v., denoted E(X) and also called the expected value, is given as: E(X) = μ = Σ x f(x), summed over all x in the range of X. The expected value provides a good idea about the center of the r.v. Compute the mean of the r.v. in the previous slide: E(X) = (-2)(0.2) + (0)(0.5) + (2)(0.3) = 0.2
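A one-line sketch of E(X) = Σ x f(x) for the mass function above.

```python
f = {-2: 0.2, 0: 0.5, 2: 0.3}

# E(X) = sum over x of x * f(x)
mean = sum(x * p for x, p in f.items())
print(round(mean, 6))   # 0.2
```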
11
Variance of a Random Variable
The variance is a measure of variability. What is variability? The variance is defined as: V(X) = σ² = E[(X - μ)²] = Σ (x - μ)² f(x), summed over all x in the range of X. Compute the variance of the r.v. in the slide before the previous one: σ² = (-2 - 0.2)²(0.2) + (0 - 0.2)²(0.5) + (2 - 0.2)²(0.3) = 1.96. Also see examples 3-9 and 3-11 in the text.
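A brief sketch of the variance definition above, with the standard shortcut V(X) = E(X²) - μ² included as a cross-check (the shortcut is not on the slide).

```python
f = {-2: 0.2, 0: 0.5, 2: 0.3}

mu = sum(x * p for x, p in f.items())                            # E(X) = 0.2
var = sum((x - mu) ** 2 * p for x, p in f.items())               # E[(X - mu)^2]
var_shortcut = sum(x ** 2 * p for x, p in f.items()) - mu ** 2   # E(X^2) - mu^2

print(round(var, 6), round(var_shortcut, 6))                     # 1.96 1.96
```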
12
Expected Value of a Function of a r.v
Let X be a r.v. with p.m.f. f(x) and let h(X) be a function of X. Then the expected value of h(X) is given as: E[h(X)] = Σ h(x) f(x), summed over all x in the range of X. Compute the expected value of h(X) = X² - X for the r.v. in the previous slides. See example 3-12 in the textbook.
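A sketch of E[h(X)] = Σ h(x) f(x) for h(X) = X² - X and the same mass function, which works out the exercise on this slide.

```python
f = {-2: 0.2, 0: 0.5, 2: 0.3}

def h(x):
    return x ** 2 - x            # h(X) = X^2 - X

# E[h(X)] = sum over x of h(x) * f(x)
e_h = sum(h(x) * p for x, p in f.items())
print(round(e_h, 6))             # 6*0.2 + 0*0.5 + 2*0.3 = 1.8
```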
13
Discrete Random Variables
In this section we will study several discrete distributions. For each distribution the student must be familiar with the following:
Range and probability mass function
Cumulative distribution function
Mean and variance
2-3 applications
14
Discrete Random Variables
The following distributions will be studied:
Discrete uniform
Bernoulli
Binomial
Hypergeometric
Geometric
Poisson
15
Discrete Uniform
A random variable is discrete uniform if every point in its range has the same probability. If there are n points in the range, then the probability of each point is f(x) = 1/n.
An alternative way of defining the discrete uniform is as follows: suppose the range is a, a+1, a+2, …, b. The number of points is (b - a + 1), so f(x) = 1/(b - a + 1) for x = a, a+1, a+2, …, b.
16
Discrete Uniform
For the CDF F(x), you just multiply 1/(b - a + 1) by the number of points in the range that are less than or equal to x.
The mean of the uniform is (b + a)/2.
The variance is [(b - a + 1)² - 1]/12.
Applied to the following situations:
Random number generation
Drawing a random sample
Situations where values have equal probabilities
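A compact sketch of a discrete uniform on {a, …, b} that checks the mean and variance formulas above; the values a = 1, b = 6 (a fair die) are only illustrative.

```python
# Discrete uniform on {a, a+1, ..., b}; a and b chosen for illustration.
a, b = 1, 6                                  # e.g. the face of a fair die
n = b - a + 1

f = {x: 1 / n for x in range(a, b + 1)}      # f(x) = 1/(b - a + 1)

mean = sum(x * p for x, p in f.items())
var = sum((x - mean) ** 2 * p for x, p in f.items())

print(round(mean, 6), (b + a) / 2)                          # 3.5 3.5
print(round(var, 6), round(((b - a + 1) ** 2 - 1) / 12, 6)) # 2.916667 2.916667
```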
17
Bernoulli Trials A trial with only two possible outcomes is used so frequently as a building block of a random experiment that it is called a Bernoulli trial. It is usually assumed that the trials that constitute the random experiment are independent. This implies that the outcome from one trial has no effect on the outcome to be obtained from any other trial. Furthermore, it is often reasonable to assume that the probability of a success in each trial is constant.
18
Bernoulli Trials
If we denote success by 1 and failure by 0, then the probability mass function f(x) is given as: f(1) = p and f(0) = 1 - p = q. As you can see, the range is {0, 1}, and F(x) is simple.
Mean = E(X) = p
Variance = σ² = p(1 - p) = pq
Applications:
Building block for other distributions
Experiments with two outcomes
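A tiny sketch of a Bernoulli random variable verifying E(X) = p and V(X) = p(1 - p); the value p = 0.3 is illustrative.

```python
p = 0.3                          # illustrative success probability
f = {1: p, 0: 1 - p}             # Bernoulli mass function: f(1) = p, f(0) = 1 - p

mean = sum(x * prob for x, prob in f.items())
var = sum((x - mean) ** 2 * prob for x, prob in f.items())

print(round(mean, 6), p)             # 0.3 0.3
print(round(var, 6), p * (1 - p))    # 0.21 0.21
```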
19
Binomial Random Variable
A random experiment consisting of n repeated trials such that the trials are independent, each trial results in only two possible outcomes, labeled "success" and "failure", and the probability of a success in each trial, denoted as p, remains constant is called a binomial experiment. The random variable X that equals the number of trials that result in a success has a binomial distribution with parameters 0 < p < 1 and n = 1, 2, …. The probability mass function of X is
f(x) = C(n, x) p^x (1 - p)^(n - x) for x = 0, 1, …, n, where C(n, x) = n!/[x!(n - x)!] is the binomial coefficient.
20
Binomial Random Variable
Figure 4-6: Binomial distributions for selected values of n and p
21
Binomial Random Variable
If X is a binomial random variable with parameters p and n, then μ = E(X) = np and σ² = V(X) = np(1 - p).
Applications:
Design of sampling plans for quality control
Estimation of product defects
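A sketch of the binomial mass function using Python's math.comb, checked against the mean np and variance np(1 - p); the values n = 10 and p = 0.2 are illustrative.

```python
from math import comb

n, p = 10, 0.2                   # illustrative parameters

def binom_pmf(x, n, p):
    """f(x) = C(n, x) p^x (1 - p)^(n - x) for x = 0, 1, ..., n."""
    return comb(n, x) * p ** x * (1 - p) ** (n - x)

f = {x: binom_pmf(x, n, p) for x in range(n + 1)}

mean = sum(x * prob for x, prob in f.items())
var = sum((x - mean) ** 2 * prob for x, prob in f.items())

print(round(mean, 6), n * p)                        # 2.0 2.0
print(round(var, 6), round(n * p * (1 - p), 6))     # 1.6 1.6
```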
22
Geometric Random Variable
In a series of independent Bernoulli trials, with constant probability p of a success, let the random variable X denote the number of trials until the first success. Then X has a geometric distribution with parameter p, and the probability mass function of X is f(x) = (1 - p)^(x - 1) p for x = 1, 2, …
23
Geometric Distribution
If X is a geometric random variable with parameter p, then the mean and variance are μ = E(X) = 1/p and σ² = V(X) = (1 - p)/p².
Applications:
Quality control, design of control charts
Estimation
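A sketch of the geometric mass function f(x) = (1 - p)^(x - 1) p; the infinite range is truncated at a large x so the mean and variance can be checked numerically, and p = 0.25 is illustrative.

```python
p = 0.25                                   # illustrative parameter

def geom_pmf(x, p):
    """f(x) = (1 - p)**(x - 1) * p for x = 1, 2, ..."""
    return (1 - p) ** (x - 1) * p

# Truncate the infinite range at a large value; the remaining tail mass is negligible.
xs = range(1, 500)
mean = sum(x * geom_pmf(x, p) for x in xs)
var = sum((x - mean) ** 2 * geom_pmf(x, p) for x in xs)

print(round(mean, 6), 1 / p)               # 4.0 4.0
print(round(var, 6), (1 - p) / p ** 2)     # 12.0 12.0
```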
24
Hypergeometric Distribution
A set of N objects contains K objects classified as successes and N - K objects classified as failures. A sample of n objects is selected at random (without replacement) from the N objects, where K ≤ N and n ≤ N. Let the random variable X denote the number of successes in the sample. Then X has a hypergeometric distribution and its probability mass function is
f(x) = C(K, x) C(N - K, n - x) / C(N, n) for x = max(0, n + K - N), …, min(K, n)
25
Hypergeometric Distribution
If X is a hypergeometric random variable with parameters N, K and n, then the mean and variance of X are μ = E(X) = np and σ² = V(X) = np(1 - p)(N - n)/(N - 1), where p = K/N.
Applications:
Design of inspection plans for quality control
Design of control charts
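A sketch of the hypergeometric mass function built from binomial coefficients and checked against the mean and variance formulas above; the values N = 50, K = 10, n = 5 are illustrative.

```python
from math import comb

N, K, n = 50, 10, 5                        # illustrative parameters

def hypergeom_pmf(x, N, K, n):
    """f(x) = C(K, x) C(N - K, n - x) / C(N, n)."""
    return comb(K, x) * comb(N - K, n - x) / comb(N, n)

xs = range(max(0, n + K - N), min(K, n) + 1)
f = {x: hypergeom_pmf(x, N, K, n) for x in xs}

p = K / N
mean = sum(x * prob for x, prob in f.items())
var = sum((x - mean) ** 2 * prob for x, prob in f.items())

print(round(mean, 6), round(n * p, 6))                               # 1.0 1.0
print(round(var, 6), round(n * p * (1 - p) * (N - n) / (N - 1), 6))  # 0.734694 0.734694
```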
26
Poisson Random Variable
Given an interval of real numbers, assume counts occur at random throughout the interval. If the interval can be partitioned into subintervals of small enough length such that the probability of more than one count in a subinterval is zero, the probability of one count in a subinterval is the same for all subintervals and proportional to the length of the subinterval, and the count in each subinterval is independent of other subintervals, then the random experiment is called a Poisson process. If the mean number of counts in the interval is λ > 0, the random variable X that equals the number of counts in the interval has a Poisson distribution with parameter λ, and the probability mass function of X is
f(x) = e^(-λ) λ^x / x! for x = 0, 1, 2, …
27
Poisson Random Variable
If X is a Poisson random variable with parameter λ, then the mean and variance of X are μ = E(X) = λ and σ² = V(X) = λ.
Applications:
Model the number of arrivals to a service facility
Model the number of accidents per month
Demand for spare parts per month
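A sketch of the Poisson mass function f(x) = e^(-λ) λ^x / x!, truncated at a large x so the mean and variance can be checked numerically; λ = 3 is illustrative.

```python
from math import exp, factorial

lam = 3.0                                  # illustrative parameter (lambda)

def poisson_pmf(x, lam):
    """f(x) = exp(-lam) * lam**x / x! for x = 0, 1, 2, ..."""
    return exp(-lam) * lam ** x / factorial(x)

# Truncate the infinite range; the tail beyond x = 100 is negligible for lam = 3.
xs = range(0, 101)
mean = sum(x * poisson_pmf(x, lam) for x in xs)
var = sum((x - mean) ** 2 * poisson_pmf(x, lam) for x in xs)

print(round(mean, 6), round(var, 6))       # 3.0 3.0  (both equal lambda)
```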