1 Functions of a Random Variable
Let X be a r.v defined on the model (Ω, F, P) and suppose g(x) is a function of the variable x. Define
    Y = g(X).                                                    (5-1)
Is Y necessarily a r.v? If so, what are its PDF F_Y(y) and pdf f_Y(y)? Clearly, if Y is a r.v, then for every Borel set B, the set of ξ for which Y(ξ) ∈ B must belong to F. In that case, if X is a r.v, so is Y, and for every set B
    P(Y ∈ B) = P(X ∈ g⁻¹(B)).                                    (5-2)
PILLAI

2 In particular,
    F_Y(y) = P(Y(ξ) ≤ y) = P(g(X(ξ)) ≤ y) = P(X(ξ) ∈ {x : g(x) ≤ y}).   (5-3)
Thus the distribution function as well as the density function of Y can be determined in terms of that of X. To obtain the distribution function of Y, we must determine the set {x : g(x) ≤ y} on the x-axis for every given y, and the probability of that set. We shall now consider several specific functions g(X) to illustrate the technical details.

3 Example 5.1: Suppose
    Y = aX + b.                                                  (5-4)
Solution: If a > 0, then
    F_Y(y) = P(aX + b ≤ y) = P(X ≤ (y − b)/a) = F_X((y − b)/a),  (5-5)
and hence
    f_Y(y) = (1/a) f_X((y − b)/a).                               (5-6)
On the other hand, if a < 0, then
    F_Y(y) = P(aX + b ≤ y) = P(X > (y − b)/a) = 1 − F_X((y − b)/a),   (5-7)
and hence
    f_Y(y) = −(1/a) f_X((y − b)/a).                              (5-8)

4 From (5-6) and (5-8), we obtain (for all a)
    f_Y(y) = (1/|a|) f_X((y − b)/a).                             (5-9)
Example 5.2: Suppose
    Y = X².                                                      (5-10)
If y < 0, then the event {X² ≤ y} is empty, and hence
    F_Y(y) = 0, y < 0.                                           (5-11)
For y > 0, from Fig. 5.1, the event {Y ≤ y} = {X² ≤ y} is equivalent to
    {−√y ≤ X ≤ √y}.                                              (5-12)
Fig. 5.1
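The scale-and-shift rule (5-9) is easy to check by simulation. A minimal sketch (the values a = 2, b = 1 and the choice of a standard normal X are arbitrary illustrations, not from the lecture), comparing the empirical distribution of Y = aX + b against F_X((y − b)/a):

```python
import math
import random

random.seed(0)

a, b = 2.0, 1.0          # arbitrary parameters for illustration
N = 200_000

# Simulate Y = aX + b for X ~ N(0, 1).
samples = [a * random.gauss(0.0, 1.0) + b for _ in range(N)]

def phi_cdf(x):
    """Standard normal CDF."""
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

# By (5-5), F_Y(y) = F_X((y - b) / a) for a > 0.
for y in (-1.0, 1.0, 3.0):
    empirical = sum(s <= y for s in samples) / N
    predicted = phi_cdf((y - b) / a)
    print(f"y={y:+.1f}  empirical={empirical:.4f}  predicted={predicted:.4f}")
```

With 200,000 draws the empirical and predicted CDF values agree to about two decimal places.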

5 Hence
    F_Y(y) = P(−√y ≤ X ≤ √y) = F_X(√y) − F_X(−√y), y > 0.        (5-13)
By direct differentiation, we get
    f_Y(y) = (f_X(√y) + f_X(−√y)) / (2√y), y > 0; 0 otherwise.   (5-14)
If f_X(x) represents an even function, then (5-14) reduces to
    f_Y(y) = (1/√y) f_X(√y) U(y).                                (5-15)
In particular, if X ∼ N(0, 1), so that
    f_X(x) = (1/√(2π)) e^(−x²/2),                                (5-16)

6 and substituting this into (5-14) or (5-15), we obtain the p.d.f of Y = X² to be
    f_Y(y) = (1/√(2πy)) e^(−y/2) U(y),                           (5-17)
which is the p.d.f of a chi-square r.v with one degree of freedom.
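The density of Y = X² for standard normal X, obtained by substituting (5-16) into (5-15), can be sanity-checked by simulation: squaring normal draws should give E[Y] = E[X²] = 1, and bin probabilities should match the integral of e^(−y/2)/√(2πy). A small sketch (sample size and the bin [0.5, 1.5] are arbitrary choices):

```python
import math
import random

random.seed(1)
N = 200_000

# Y = X^2 for X ~ N(0, 1); the derived density is exp(-y/2) / sqrt(2*pi*y).
ys = [random.gauss(0.0, 1.0) ** 2 for _ in range(N)]

# E[Y] = E[X^2] = 1 for a standard normal X.
mean_y = sum(ys) / N

def f_Y(y):
    return math.exp(-y / 2.0) / math.sqrt(2.0 * math.pi * y)

# Compare an empirical bin probability with a midpoint-rule integral
# of f_Y over [0.5, 1.5].
p_emp = sum(0.5 <= y <= 1.5 for y in ys) / N
p_th = sum(f_Y(0.5 + (k + 0.5) * 0.01) * 0.01 for k in range(100))
print(mean_y, p_emp, p_th)
```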

7 for Y = g(X), where X is of continuous type. However, if g(x) is a continuous function, it is easy to establish a direct procedure to obtain f_Y(y). A continuous function g(x) with g′(x) nonzero at all but a finite number of points has only a finite number of maxima and minima, and it eventually becomes monotonic as |x| → ∞. Consider a specific y on the y-axis, and a positive increment Δy, as shown in Fig. 5.4.
Fig. 5.4

8 Using (3-28) we can write
    P(y < Y ≤ y + Δy) ≈ f_Y(y) Δy.                               (5-26)
But the event {y < Y ≤ y + Δy} can be expressed in terms of X as well. To see this, referring back to Fig. 5.4, we notice that the equation y = g(x) has three solutions x₁, x₂, x₃ (for the specific y chosen there). As a result, when {y < Y ≤ y + Δy}, the r.v X could be in any one of the three mutually exclusive intervals
    {x₁ < X ≤ x₁ + Δx₁},  {x₂ + Δx₂ < X ≤ x₂},  {x₃ < X ≤ x₃ + Δx₃}.
Hence the probability of the event in (5-26) is the sum of the probabilities of the above three events, i.e.,
    P(y < Y ≤ y + Δy) = P(x₁ < X ≤ x₁ + Δx₁) + P(x₂ + Δx₂ < X ≤ x₂) + P(x₃ < X ≤ x₃ + Δx₃).   (5-27)

9 For small Δy and Δxᵢ, making use of the approximation in (5-26), we get
    f_Y(y) Δy = f_X(x₁) Δx₁ + f_X(x₂)(−Δx₂) + f_X(x₃) Δx₃.       (5-28)
In this case, Δx₁ > 0, Δx₃ > 0 and Δx₂ < 0, so that (5-28) can be rewritten as
    f_Y(y) = Σᵢ f_X(xᵢ) |Δxᵢ|/Δy,                                (5-29)
and as Δy → 0, (5-29) can be expressed as
    f_Y(y) = Σᵢ f_X(xᵢ) / |g′(xᵢ)|.                              (5-30)
The summation index i in (5-30) depends on y, and for every y the equation y = g(xᵢ) must be solved to obtain the total number of solutions at every y, and the actual solutions x₁, x₂, …, all in terms of y.
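Formula (5-30) can be checked numerically against the direct CDF method. A sketch for the case g(x) = x² with X standard normal (as in Example 5.2): the roots are ±√y with |g′| = 2√y, and the resulting density should match the numerical derivative of F_Y(y) = F_X(√y) − F_X(−√y) = erf(√(y/2)).

```python
import math

def f_X(x):
    """Standard normal density."""
    return math.exp(-x * x / 2.0) / math.sqrt(2.0 * math.pi)

def f_Y(y):
    # (5-30) with g(x) = x^2: solutions x1 = -sqrt(y), x2 = +sqrt(y),
    # each with |g'(x_i)| = 2*sqrt(y).
    r = math.sqrt(y)
    return (f_X(r) + f_X(-r)) / (2.0 * r)

def F_Y(y):
    # Direct CDF: F_X(sqrt(y)) - F_X(-sqrt(y)) = 2*Phi(sqrt(y)) - 1 = erf(sqrt(y/2)).
    return math.erf(math.sqrt(y / 2.0))

# f_Y from (5-30) should match the central-difference derivative of F_Y.
for y in (0.5, 1.0, 2.0, 4.0):
    h = 1e-6
    deriv = (F_Y(y + h) - F_Y(y - h)) / (2.0 * h)
    print(y, f_Y(y), deriv)
```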

10 For example, if Y = X², then for all y > 0 the equation y = g(x) has exactly two solutions, x₁ = −√y and x₂ = √y (and none for y < 0, so f_Y(y) = 0 there). Notice that the solutions are all in terms of y, so that the right side of (5-30) is only a function of y. Referring back to Example 5.2, here
    g′(x) = 2x, so that |g′(xᵢ)| = 2√y,                          (5-31)
and using (5-30) we get
    f_Y(y) = (f_X(√y) + f_X(−√y)) / (2√y), y > 0; 0 otherwise,
which agrees with (5-14).
Fig. 5.5

11 Example 5.5: Suppose Y = 1/X. Find f_Y(y).                    (5-32)
Solution: Here for every y, x₁ = 1/y is the only solution, and
    dy/dx = −1/x², so that |dy/dx| at x = x₁ equals 1/x₁² = y²,
and substituting this into (5-30), we obtain
    f_Y(y) = (1/y²) f_X(1/y).                                    (5-33)
In particular, suppose X is a Cauchy r.v as in (3-39) with parameter α, so that
    f_X(x) = (α/π) / (α² + x²), −∞ < x < +∞.                     (5-34)
In that case, from (5-33), Y = 1/X has the p.d.f
    f_Y(y) = (1/y²)(α/π)/(α² + 1/y²) = ((1/α)/π) / ((1/α)² + y²), −∞ < y < +∞.   (5-35)
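The reciprocal rule (5-33) and the Cauchy conclusion (5-35) can be checked by simulation. A sketch (α = 2 and the sample size are arbitrary choices): since a Cauchy r.v with parameter γ satisfies P(|Y| ≤ γ) = 1/2, the sample median of |Y| estimates the parameter, which for Y = 1/X should be 1/α.

```python
import math
import random
import statistics

random.seed(2)
alpha = 2.0          # arbitrary Cauchy parameter for illustration
N = 200_000

# X ~ Cauchy(alpha) via the inverse-CDF method: X = alpha * tan(pi*(U - 1/2)).
xs = [alpha * math.tan(math.pi * (random.random() - 0.5)) for _ in range(N)]

# By (5-35), Y = 1/X is Cauchy with parameter 1/alpha.
ys = [1.0 / x for x in xs]

# For a Cauchy r.v with parameter gamma, P(|Y| <= gamma) = 1/2,
# so the median of |Y| estimates the parameter.
est = statistics.median(abs(y) for y in ys)
print(est, 1.0 / alpha)
```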

12 But (5-35) represents the p.d.f of a Cauchy r.v with parameter 1/α. Thus if X ∼ C(α), then 1/X ∼ C(1/α).
Example 5.6: Suppose X ∼ U(0, 2π) and Y = sin X. Determine f_Y(y).
Solution: Since X has zero probability of falling outside the interval (0, 2π), y = sin x has zero probability of falling outside the interval (−1, 1). Clearly f_Y(y) = 0 outside this interval. For any 0 < y < 1, from Fig. 5.6(b), the equation y = sin x has an infinite number of solutions …, x₁, x₂, x₃, …, where x₁ = sin⁻¹(y) is the principal solution. Moreover, using the symmetry we also get x₂ = π − x₁, etc. Further,
    dy/dx = cos x = √(1 − sin²x) = √(1 − y²),
so that
    |dy/dx| at x = xᵢ equals √(1 − y²).

13 Using this in (5-30), we obtain for 0 < |y| < 1,
    f_Y(y) = Σᵢ f_X(xᵢ) / √(1 − y²).                             (5-36)
But from Fig. 5.6(a), in this case f_X(x₁) = f_X(x₂) = 1/(2π); except for f_X(x₁) and f_X(x₂), the rest are all zeros, since only x₁ and x₂ fall inside (0, 2π).
Fig. 5.6 (a), (b)

14 Thus (Fig. 5.7)
    f_Y(y) = 2 · (1/(2π)) / √(1 − y²) = 1/(π√(1 − y²)), −1 < y < 1; 0 otherwise.   (5-37)
Example 5.7: Let Y = tan X, where X ∼ U(−π/2, π/2). Determine f_Y(y).
Solution: As x moves from −π/2 to π/2, y moves from −∞ to +∞. From Fig. 5.8(b), the function y = tan x is one-to-one for −π/2 < x < π/2. For any y, x₁ = tan⁻¹(y) is the principal solution. Further,
    dy/dx = sec²x = 1 + tan²x = 1 + y²,
Fig. 5.7
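The arcsine density (5-37) integrates to the CDF F_Y(y) = 1/2 + sin⁻¹(y)/π, which gives a quick simulation check (sample size is an arbitrary choice):

```python
import math
import random

random.seed(3)
N = 200_000

# Y = sin(X) for X ~ U(0, 2*pi); by (5-37), f_Y(y) = 1/(pi*sqrt(1 - y^2)).
ys = [math.sin(random.uniform(0.0, 2.0 * math.pi)) for _ in range(N)]

def F_Y(y):
    """CDF obtained by integrating (5-37): 1/2 + arcsin(y)/pi."""
    return 0.5 + math.asin(y) / math.pi

for y in (-0.5, 0.0, 0.5):
    empirical = sum(s <= y for s in ys) / N
    print(y, empirical, F_Y(y))
```

For instance, F_Y(0.5) = 1/2 + (π/6)/π = 2/3, and the empirical CDF lands close to that.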

15 so that, using (5-30),
    f_Y(y) = f_X(x₁) / |dy/dx| = (1/π) / (1 + y²), −∞ < y < +∞,   (5-38)
which represents a Cauchy density function with parameter equal to unity (Fig. 5.9).
Fig. 5.8 (a), (b)   Fig. 5.9
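The result (5-38) is the basis of a standard Cauchy sampler: the tangent of a uniform angle is unit Cauchy. A quick check, using the fact that the unit-parameter Cauchy satisfies P(|Y| ≤ 1) = 1/2:

```python
import math
import random

random.seed(4)
N = 200_000

# Y = tan(X) for X ~ U(-pi/2, pi/2); by (5-38), Y is Cauchy with parameter 1.
ys = [math.tan(random.uniform(-math.pi / 2.0, math.pi / 2.0)) for _ in range(N)]

# For the unit Cauchy, F_Y(y) = 1/2 + arctan(y)/pi, so P(|Y| <= 1) = 1/2.
p = sum(abs(y) <= 1.0 for y in ys) / N
print(p)
```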

16 Functions of a discrete-type r.v
Suppose X is a discrete-type r.v with
    P(X = xᵢ) = pᵢ, i = 1, 2, …,                                 (5-39)
and Y = g(X). Clearly Y is also of discrete-type, and when x = xᵢ, yᵢ = g(xᵢ), and for those yᵢ,
    P(Y = yᵢ) = P(X = xᵢ) = pᵢ.                                  (5-40)
Example 5.8: Suppose X ∼ P(λ), so that
    P(X = k) = e^(−λ) λᵏ/k!, k = 0, 1, 2, ….                     (5-41)
Define Y = X² + 1. Find the p.m.f of Y.
Solution: X takes the values 0, 1, 2, …, k, …, so that Y only takes the values 1, 2, 5, …, k² + 1, …, and
    P(Y = k² + 1) = P(X = k),

17 so that for j = k² + 1,
    P(Y = j) = P(X = k) = e^(−λ) λᵏ/k!, k = 0, 1, 2, ….          (5-42)
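In the discrete case the transformation simply relabels the probability masses, since g is one-to-one on the values X actually takes. A sketch of (5-42), assuming the transformation Y = X² + 1 (consistent with the value set 1, 2, 5, …, k² + 1 in the solution) and a hypothetical λ = 1.5:

```python
import math

lam = 1.5   # hypothetical Poisson parameter for illustration

# p.m.f of X ~ P(lam): P(X = k) = exp(-lam) * lam**k / k!
def pmf_X(k):
    return math.exp(-lam) * lam ** k / math.factorial(k)

# Y = X^2 + 1 takes the values 1, 2, 5, 10, ..., k^2 + 1, ...
# and by (5-42), P(Y = k^2 + 1) = P(X = k).
pmf_Y = {k * k + 1: pmf_X(k) for k in range(50)}

print(pmf_Y[1], pmf_Y[2], pmf_Y[5])
print(sum(pmf_Y.values()))   # should be close to 1
```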