PROBABILITY AND STATISTICS FOR ENGINEERING
Hossein Sameti
Department of Computer Engineering, Sharif University of Technology

 Let X represent a binomial r.v. Then
$P(k_1 \le X \le k_2) = \sum_{k=k_1}^{k_2} \binom{n}{k} p^k q^{n-k}.$
 Since the binomial coefficients grow quite rapidly with n, this sum is difficult to compute for large n. In this context, two approximations are extremely useful.

The Normal Approximation (De Moivre-Laplace Theorem)

 As we know, for k in the $\sqrt{npq}$ neighborhood of $np$,
$\binom{n}{k} p^k q^{n-k} \approx \frac{1}{\sqrt{2\pi npq}}\, e^{-(k-np)^2/2npq}.$
 If $k_1$ and $k_2$ are within this neighborhood, the sum can be replaced by an integral, giving the approximation
$P(k_1 \le X \le k_2) \approx \int_{x_1}^{x_2} \frac{1}{\sqrt{2\pi}}\, e^{-y^2/2}\, dy,$
 where
$x_1 = \frac{k_1 - np}{\sqrt{npq}}, \qquad x_2 = \frac{k_2 - np}{\sqrt{npq}}.$
 We can express this formula in terms of the normalized integral
$\operatorname{erf}(x) = \frac{1}{\sqrt{2\pi}} \int_0^x e^{-y^2/2}\, dy,$
 which has been tabulated extensively.

Example
 A fair coin is tossed 5,000 times.
 Find the probability that the number of heads is between 2,475 and 2,525.
Solution
 We need $P(2475 \le X \le 2525)$. Since n is large, we can use the normal approximation.
 Here $p = \tfrac{1}{2}$, so that $np = 2500$ and $\sqrt{npq} \approx 35$.
 Since $np - \sqrt{npq} \approx 2465$ and $np + \sqrt{npq} \approx 2535$, the approximation is valid for $k_1 = 2475$ and $k_2 = 2525$.

Example - continued
 Here
$x_1 = \frac{k_1 - np}{\sqrt{npq}} = \frac{2475 - 2500}{\sqrt{1250}} \approx -\frac{1}{\sqrt{2}}, \qquad x_2 = \frac{k_2 - np}{\sqrt{npq}} \approx \frac{1}{\sqrt{2}}.$
 Using the table,
$P(2475 \le X \le 2525) \approx \operatorname{erf}(x_2) - \operatorname{erf}(x_1) = 2\,\operatorname{erf}(0.707) \approx 0.516.$
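As a quick numerical check of this example (an added sketch, not part of the original slides), the following Python snippet, using only the standard library, compares the exact binomial sum with the De Moivre-Laplace approximation; the two values should roughly agree with each other and with the tabulated result above.

```python
import math

n, k1, k2 = 5000, 2475, 2525
p = 0.5

# Exact probability: sum of C(n, k) / 2^n over k1..k2, computed with exact integers.
exact = sum(math.comb(n, k) for k in range(k1, k2 + 1)) / 2**n

# De Moivre-Laplace approximation: Phi(x2) - Phi(x1), with x_i = (k_i - np)/sqrt(npq).
mu, sigma = n * p, math.sqrt(n * p * (1 - p))
Phi = lambda x: 0.5 * (1 + math.erf(x / math.sqrt(2)))  # standard normal CDF
approx = Phi((k2 - mu) / sigma) - Phi((k1 - mu) / sigma)

print(f"exact = {exact:.4f}, normal approximation = {approx:.4f}")
```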

The Poisson Approximation
 For large n, the Gaussian approximation of a binomial r.v. is valid only if p is fixed, i.e., only if $np \gg 1$ and $npq \gg 1$.
 What if $np$ is small, or if it does not increase with n?
 For example, when $n \to \infty$ and $p \to 0$ such that $\lambda = np$ is a fixed number.

The Poisson Theorem
 If
$n \to \infty, \qquad p \to 0, \qquad np = \lambda,$
 then
$\binom{n}{k} p^k q^{n-k} \;\to\; e^{-\lambda}\,\frac{\lambda^k}{k!}, \qquad k = 0, 1, 2, \dots$

The Poisson Approximation
 With $p = \lambda/n$,
$\binom{n}{k} p^k q^{n-k} = \frac{n(n-1)\cdots(n-k+1)}{n^k}\,\frac{\lambda^k}{k!}\left(1 - \frac{\lambda}{n}\right)^{n-k},$
 and as $n \to \infty$ the first factor tends to 1 while $\left(1 - \frac{\lambda}{n}\right)^{n-k} \to e^{-\lambda}$, which gives the limit in the theorem.

 Thus, the Poisson p.m.f. is
$P(X = k) = e^{-\lambda}\,\frac{\lambda^k}{k!}, \qquad k = 0, 1, 2, \dots$
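As an illustration (an added sketch, not from the original slides), the snippet below compares the binomial p.m.f. with its Poisson limit for an arbitrarily chosen large n and small p (n = 2000, p = 0.001, so λ = 2).

```python
import math

n, p = 2000, 0.001      # arbitrary choice with large n and small p; lambda = np = 2
lam = n * p

# Binomial p.m.f. versus its Poisson limit for the first few values of k.
for k in range(6):
    binom_pk = math.comb(n, k) * p**k * (1 - p)**(n - k)
    poisson_pk = math.exp(-lam) * lam**k / math.factorial(k)
    print(f"k={k}:  binomial={binom_pk:.5f}  Poisson={poisson_pk:.5f}")
```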

Example: Winning a Lottery
 Suppose two million lottery tickets are issued, with 100 winning tickets among them.
 a) If a person purchases 100 tickets, what is the probability of winning?
Solution
 The probability of buying a winning ticket is
$p = \frac{100}{2 \times 10^{6}} = 5 \times 10^{-5}.$

Winning a Lottery - continued
 X : number of winning tickets among those purchased
 n : number of purchased tickets, $n = 100$
 X has an approximate Poisson distribution with parameter $\lambda = np = 100 \times 5 \times 10^{-5} = 0.005.$
 So, the probability of winning is
$P(X \ge 1) = 1 - P(X = 0) = 1 - e^{-\lambda} \approx 0.005.$

Winning a Lottery - continued
 b) How many tickets should one buy to be 95% confident of having a winning ticket?
Solution
 We need
$P(X \ge 1) = 1 - e^{-\lambda} \ge 0.95, \quad \text{i.e.} \quad e^{-\lambda} \le 0.05, \quad \text{or} \quad \lambda = np \ge \ln 20 \approx 3.$
 But $p = 5 \times 10^{-5}$, so $n \ge \dfrac{3}{5 \times 10^{-5}} = 60{,}000.$
 Thus one needs to buy about 60,000 tickets to be 95% confident of having a winning ticket!
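A short numerical restatement of parts (a) and (b) (an added sketch, not in the original slides), using the Poisson approximation throughout:

```python
import math

p = 100 / 2_000_000                  # probability that a single ticket wins

# (a) Probability of at least one winning ticket among n = 100 purchased tickets.
lam = 100 * p                        # Poisson parameter lambda = n*p
print("P(win with 100 tickets) ~", 1 - math.exp(-lam))        # about 0.005

# (b) Smallest n with 1 - exp(-n*p) >= 0.95, i.e. n*p >= ln 20 ~ 3.
n_needed = math.ceil(math.log(20) / p)
print("tickets needed for 95% confidence ~", n_needed)         # about 60,000
```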

Example: Danger in Space Mission
 A spacecraft has 100,000 components $(n \to \infty)$.
 The probability of any one component being defective is $2 \times 10^{-5}$ $(p \to 0)$.
 The mission will be in danger if five or more components become defective.
 Find the probability of such an event.
Solution
 n is large and p is small: Poisson approximation with parameter $\lambda = np = 10^{5} \times 2 \times 10^{-5} = 2.$
$P(X \ge 5) = 1 - \sum_{k=0}^{4} e^{-\lambda}\,\frac{\lambda^{k}}{k!} = 1 - e^{-2}\left(1 + 2 + 2 + \tfrac{4}{3} + \tfrac{2}{3}\right) \approx 0.052.$
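A minimal sketch of the computation (not part of the original slides), assuming the per-component defect probability of 2 × 10⁻⁵ quoted above, so that λ = np = 2:

```python
import math

n, p = 100_000, 2e-5        # assumed defect probability; lambda = n*p = 2
lam = n * p

# P(X >= 5) = 1 - sum_{k=0}^{4} e^(-lam) * lam^k / k!   (Poisson approximation)
p_danger = 1 - sum(math.exp(-lam) * lam**k / math.factorial(k) for k in range(5))
print("P(five or more defective) ~", round(p_danger, 3))       # roughly 0.05
```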

Conditional Probability Density Function
 For an event B with $P(B) \neq 0$, the conditional distribution function of X given B is
$F_X(x \mid B) = P(X \le x \mid B) = \frac{P\{(X \le x) \cap B\}}{P(B)},$
 and the conditional p.d.f. is
$f_X(x \mid B) = \frac{dF_X(x \mid B)}{dx}.$

 Further, $F_X(-\infty \mid B) = 0$, $F_X(+\infty \mid B) = 1$, $f_X(x \mid B) \ge 0$ and $\int_{-\infty}^{\infty} f_X(x \mid B)\,dx = 1.$
 Since, for $x_1 < x_2$,
$P(x_1 < X \le x_2 \mid B) = \frac{P\{(x_1 < X \le x_2) \cap B\}}{P(B)} = F_X(x_2 \mid B) - F_X(x_1 \mid B) = \int_{x_1}^{x_2} f_X(x \mid B)\,dx,$
 the conditional p.d.f. can be used to compute conditional probabilities of interval events in the usual way.

Example
 Toss a coin and let X(T) = 0, X(H) = 1.
 Suppose $B = \{H\}$. Determine $F_X(x \mid B)$.
Solution
 $F_X(x)$ has the staircase form shown in figure (a), with steps at $x = 0$ and $x = 1$.
 We need $F_X(x \mid B)$ for all x.
 For $x < 0$, $\{X \le x\} = \varnothing$, so that $P\{(X \le x) \cap B\} = 0$ and $F_X(x \mid B) = 0.$
 (Figures (a) and (b): staircase plots of $F_X(x)$ and $F_X(x \mid B)$, each rising to 1.)

Example - continued
 For $0 \le x < 1$, $\{X \le x\} = \{T\}$, so that $P\{(X \le x) \cap B\} = P(\{T\} \cap \{H\}) = 0$ and $F_X(x \mid B) = 0.$
 For $x \ge 1$, $\{X \le x\} = \{T, H\} \supset B$, so that $P\{(X \le x) \cap B\} = P(B)$ and $F_X(x \mid B) = 1.$

Example
 Given $F_X(x)$, suppose $B = \{X \le a\}$. Find $f_X(x \mid B)$.
Solution
 We will first determine
$F_X(x \mid B) = \frac{P\{(X \le x) \cap (X \le a)\}}{P(X \le a)}.$
 For $x < a$, $(X \le x) \cap (X \le a) = (X \le x)$, so that
$F_X(x \mid B) = \frac{F_X(x)}{F_X(a)}, \qquad x < a.$
 For $x \ge a$, $(X \le x) \cap (X \le a) = (X \le a)$, so that $F_X(x \mid B) = 1.$

Example - continued
 Thus
$F_X(x \mid B) = \begin{cases} \dfrac{F_X(x)}{F_X(a)}, & x < a, \\[2mm] 1, & x \ge a, \end{cases}$
 and hence
$f_X(x \mid B) = \frac{dF_X(x \mid B)}{dx} = \begin{cases} \dfrac{f_X(x)}{F_X(a)}, & x < a, \\[2mm] 0, & \text{otherwise.} \end{cases}$
 (Figures (a) and (b): plots of $F_X(x \mid B)$ and $f_X(x \mid B)$.)
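A small numerical check of this result (an added sketch, not from the slides), using a hypothetical exponential(1) random variable and a = 1: the conditional density f_X(x)/F_X(a) should integrate to one over (0, a].

```python
import math

a = 1.0
f = lambda x: math.exp(-x)          # f_X(x) for an exponential(1) r.v., x >= 0
F = lambda x: 1 - math.exp(-x)      # F_X(x) for the same r.v., x >= 0

# Midpoint Riemann sum of f_X(x | X <= a) = f_X(x) / F_X(a) over (0, a].
steps = 100_000
dx = a / steps
total = sum(f((i + 0.5) * dx) / F(a) * dx for i in range(steps))
print("integral of the conditional p.d.f. over (0, a]:", round(total, 4))   # ~ 1.0
```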

Example
 Let B represent the event $\{a < X \le b\}$ with $b > a$.
 For a given $F_X(x)$, determine $F_X(x \mid B)$ and $f_X(x \mid B)$.
Solution
$F_X(x \mid B) = P\{X \le x \mid a < X \le b\} = \frac{P\{(X \le x) \cap (a < X \le b)\}}{P(a < X \le b)}.$

Example - continued
 For $x < a$, we have $\{X \le x\} \cap \{a < X \le b\} = \varnothing$, and hence $F_X(x \mid B) = 0.$
 For $a \le x < b$, we have $\{X \le x\} \cap \{a < X \le b\} = \{a < X \le x\}$, and hence
$F_X(x \mid B) = \frac{F_X(x) - F_X(a)}{F_X(b) - F_X(a)}, \qquad a \le x < b.$
 For $x \ge b$, we have $\{X \le x\} \cap \{a < X \le b\} = \{a < X \le b\}$, so that $F_X(x \mid B) = 1.$
 Thus,
$f_X(x \mid B) = \begin{cases} \dfrac{f_X(x)}{F_X(b) - F_X(a)}, & a < x \le b, \\[2mm] 0, & \text{otherwise.} \end{cases}$
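For the interval event B = {a < X ≤ b}, here is a Monte Carlo sanity check (an added sketch, not from the slides), with a hypothetical standard normal X, a = 0, b = 1, and x = 0.5; the empirical and closed-form conditional CDFs should agree to roughly two or three decimals.

```python
import math
import random

random.seed(0)
a, b, x = 0.0, 1.0, 0.5
Phi = lambda t: 0.5 * (1 + math.erf(t / math.sqrt(2)))   # standard normal CDF

# Draw standard normal samples and keep only those falling in B = {a < X <= b}.
kept = [s for s in (random.gauss(0, 1) for _ in range(1_000_000)) if a < s <= b]
empirical = sum(s <= x for s in kept) / len(kept)
formula = (Phi(x) - Phi(a)) / (Phi(b) - Phi(a))
print(f"empirical P(X <= {x} | B) = {empirical:.3f}, formula = {formula:.3f}")
```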

Conditional p.d.f & Bayes’ Theorem
 First, we extend the conditional probability results to random variables:
 We know that if $\{A_i,\ i = 1, \dots, n\}$ is a partition of S and B is an arbitrary event, then
$P(B) = \sum_{i=1}^{n} P(B \mid A_i)\,P(A_i).$
 By setting $B = \{X \le x\}$ we obtain:
$F_X(x) = \sum_{i=1}^{n} F_X(x \mid A_i)\,P(A_i), \qquad f_X(x) = \sum_{i=1}^{n} f_X(x \mid A_i)\,P(A_i).$

Conditional p.d.f & Bayes’ Theorem
 Using
$P(A \mid B) = \frac{P(B \mid A)\,P(A)}{P(B)}$
 with $B = \{X \le x\}$, we obtain:
$P(A \mid X \le x) = \frac{P(X \le x \mid A)\,P(A)}{P(X \le x)} = \frac{F_X(x \mid A)\,P(A)}{F_X(x)}.$
 For $B = \{x_1 < X \le x_2\}$,
$P(A \mid x_1 < X \le x_2) = \frac{\bigl[F_X(x_2 \mid A) - F_X(x_1 \mid A)\bigr]\,P(A)}{F_X(x_2) - F_X(x_1)}.$

Conditional p.d.f & Bayes’ Theorem
 Let $x_1 = x$ and $x_2 = x + \Delta x$, so that in the limit as $\Delta x \to 0$,
$P(A \mid X = x) = \frac{f_X(x \mid A)\,P(A)}{f_X(x)},$
 or
$f_X(x \mid A)\,P(A) = P(A \mid X = x)\,f_X(x).$
 Integrating both sides over x, we also get
$P(A) = \int_{-\infty}^{\infty} P(A \mid X = x)\,f_X(x)\,dx$
 (Total Probability Theorem).

Bayes’ Theorem (continuous version)
 Using the total probability theorem in
$f_X(x \mid A) = \frac{P(A \mid X = x)\,f_X(x)}{P(A)},$
 we get the desired result:
$f_X(x \mid A) = \frac{P(A \mid X = x)\,f_X(x)}{\int_{-\infty}^{\infty} P(A \mid X = x)\,f_X(x)\,dx}.$

Example: Coin Tossing Problem Revisited
 Let $p = P(H)$ denote the probability of obtaining a head in a toss.
 For a given coin, a-priori p can possess any value in the interval (0,1).
 $f_P(p)$: a uniform p.d.f. on (0,1), in the absence of any additional information.
 After tossing the coin n times, k heads are observed. How can we update $f_P(p)$ with this new information?
Solution
 Let A = “k heads in n specific tosses”.
 Since these tosses result in a specific sequence,
$P(A \mid P = p) = p^{k} q^{n-k},$
 and using the Total Probability Theorem we get
$P(A) = \int_0^1 P(A \mid P = p)\,f_P(p)\,dp = \int_0^1 p^{k} (1-p)^{n-k}\,dp = \frac{(n-k)!\,k!}{(n+1)!}.$

Example - continued
 The a-posteriori p.d.f. $f_P(p \mid A)$ represents the updated information given the event A.
 Using Bayes’ theorem,
$f_P(p \mid A) = \frac{P(A \mid P = p)\,f_P(p)}{P(A)} = \frac{(n+1)!}{(n-k)!\,k!}\,p^{k}(1-p)^{n-k}, \qquad 0 < p < 1. \qquad (1)$
 This is a beta distribution.
 We can use this a-posteriori p.d.f. to make further predictions.
 For example, in the light of the above experiment, what can we say about the probability of a head occurring in the next (n+1)th toss?

Example - continued
 Let B = “head occurring in the (n+1)th toss, given that k heads have occurred in the n previous tosses”.
 Clearly $P(B \mid P = p) = p.$
 From the Total Probability Theorem,
$P(B) = \int_0^1 P(B \mid P = p)\,f_P(p \mid A)\,dp. \qquad (2)$
 Using (1) in (2), we get:
$P(B) = \int_0^1 p \cdot \frac{(n+1)!}{(n-k)!\,k!}\,p^{k}(1-p)^{n-k}\,dp = \frac{k+1}{n+2}.$
 Thus, if n = 10 and k = 6, then $P(B) = \frac{7}{12} \approx 0.58,$
 which is more realistic compared to p = 0.5.
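A quick numerical check of this example (an added sketch, not in the original slides): integrating p against the Beta(k+1, n-k+1) posterior above should reproduce the closed-form predictive probability (k+1)/(n+2).

```python
import math

n, k = 10, 6

# Posterior f_P(p | A) = (n+1)!/((n-k)! k!) * p^k (1-p)^(n-k), a Beta(k+1, n-k+1) density.
c = math.factorial(n + 1) / (math.factorial(n - k) * math.factorial(k))
posterior = lambda p: c * p**k * (1 - p)**(n - k)

# P(head on the (n+1)th toss) = integral over (0,1) of p * f_P(p | A) dp, by midpoint rule.
steps = 100_000
dp = 1.0 / steps
p_head = sum((i + 0.5) * dp * posterior((i + 0.5) * dp) * dp for i in range(steps))
print("numerical:", round(p_head, 4), " closed form (k+1)/(n+2):", round((k + 1) / (n + 2), 4))
```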