Function of a random variable

Let X be a random variable on a probability space with a probability distribution F(x). Sometimes we may be interested in another random variable Y that is a function of X, that is, Y = g(X). The question is whether we can establish the probability distribution G(y) of Y.

Let g(x) be increasing on R(X), the range of X. Since F(x) is the probability distribution of X and G(y) that of Y, we can write

G(y) = P(Y < y) = P(g(X) < y),

and g(x) is increasing, so it has an increasing inverse g⁻¹(y). This means that

G(y) = P(X < g⁻¹(y)) = F(g⁻¹(y)).

If g(x) is decreasing on R(X), we can proceed in a similar way, but since the inverse of a decreasing function is again decreasing, the inequality reverses its sign:

G(y) = P(g(X) < y) = P(X > g⁻¹(y)),

so that (for a continuous X)

G(y) = 1 − F(g⁻¹(y)).
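A quick numerical check of the increasing case can be done by simulation. The sketch below is a minimal illustration, not from the slides: it assumes X is standard normal and g(x) = eˣ, so g⁻¹(y) = ln y, and compares the Monte Carlo estimate of G(y) with F(g⁻¹(y)).

```python
import numpy as np
from scipy.stats import norm

# Sketch: verify G(y) = F(g^{-1}(y)) for an increasing g.
# Assumed example (not from the slides): X ~ N(0,1), g(x) = exp(x), so g^{-1}(y) = log(y).
rng = np.random.default_rng(0)
x = rng.standard_normal(100_000)
y = np.exp(x)                              # samples of Y = g(X)

for y0 in [0.5, 1.0, 2.0, 5.0]:
    empirical = np.mean(y < y0)            # Monte Carlo estimate of G(y0) = P(Y < y0)
    theoretical = norm.cdf(np.log(y0))     # F(g^{-1}(y0)) with F the N(0,1) distribution
    print(f"y0={y0}: empirical={empirical:.4f}, F(g^-1(y0))={theoretical:.4f}")
```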

Sometimes we can use even more sophisticated methods, as shown in the following example.

Example. Random variable X has the standard normal distribution; calculate the distribution of the random variable Y = X². For y > 0 we have

G(y) = P(X² < y) = P(−√y < X < √y) = ∫ from −√y to √y of 1/√(2π) · e^(−x²/2) dx.

We cannot calculate the last integral exactly, but we can establish the probability density of Y by differentiating G(y):

g(y) = G′(y) = 1/(2√y) · 1/√(2π) · e^(−y/2) + 1/(2√y) · 1/√(2π) · e^(−y/2) = 1/√(2πy) · e^(−y/2) for y > 0, and g(y) = 0 for y ≤ 0.

This is the probability density of the random variable X². We have, in fact, calculated the density of the chi-squared distribution with one degree of freedom.
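To illustrate this, here is a minimal simulation sketch (not part of the slides) comparing a histogram of Y = X², with X standard normal, against the derived density 1/√(2πy) · e^(−y/2).

```python
import numpy as np

# Sketch: compare a histogram of Y = X^2 (X standard normal) with the derived density
# g(y) = 1/sqrt(2*pi*y) * exp(-y/2), i.e. the chi-squared density with one degree of freedom.
rng = np.random.default_rng(0)
y = rng.standard_normal(200_000) ** 2

bins = np.linspace(0.05, 4.0, 40)
hist, edges = np.histogram(y, bins=bins, density=True)
centers = (edges[:-1] + edges[1:]) / 2
density = 1.0 / np.sqrt(2 * np.pi * centers) * np.exp(-centers / 2)

for c, h, d in zip(centers[::8], hist[::8], density[::8]):
    print(f"y={c:.2f}: histogram={h:.3f}, derived density={d:.3f}")
```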

Example. Random variable X has a uniform distribution on [0,1]. Find a transformation g(X) such that Y = g(X) has a given distribution F(y).

X has the density

f(x) = 1 for 0 ≤ x ≤ 1 and f(x) = 0 otherwise,

and the distribution

F_X(x) = 0 for x < 0, F_X(x) = x for 0 ≤ x ≤ 1, F_X(x) = 1 for x > 1.

(The slide shows the graph of F_X(x), rising linearly from 0 to 1 on the interval [0,1].)

F(y) is a distribution and as such has R as its domain and [0,1] as its range. This means that, if F is increasing, it has an inverse F⁻¹ that is also increasing, with domain [0,1] and range R. Consider the transformation Y = F⁻¹(X). We have

G(y) = P(Y < y) = P(F⁻¹(X) < y) = P(X < F(y)) = F(y),

since P(X < t) = t for every t in [0,1]. Thus, we have shown that F(y) is the distribution of Y = F⁻¹(X). This can be used, for example, for simulating a distribution using a pocket calculator.
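A minimal sketch of this simulation idea (the "inverse transform" method), under an assumed target not taken from the slides: the exponential distribution with rate lam, where F(y) = 1 − e^(−lam·y) and F⁻¹(u) = −ln(1 − u)/lam.

```python
import numpy as np

# Sketch: if X ~ Uniform(0,1) and F is an increasing distribution function, then
# Y = F^{-1}(X) has distribution F.
# Assumed target (not from the slides): exponential with rate lam, F(y) = 1 - exp(-lam*y),
# so F^{-1}(u) = -log(1 - u) / lam.
rng = np.random.default_rng(0)
lam = 2.0
u = rng.uniform(size=100_000)
y = -np.log(1.0 - u) / lam                 # Y = F^{-1}(U)

for y0 in [0.25, 0.5, 1.0]:
    empirical = np.mean(y < y0)            # estimate of P(Y < y0)
    target = 1.0 - np.exp(-lam * y0)       # F(y0)
    print(f"y0={y0}: empirical={empirical:.4f}, F(y0)={target:.4f}")
```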

Random vector

Persons chosen at random from a POPULATION:

Person 1: Height 115 cm, Weight 17 kg, No. of children 0, Employed: No
Person 2: Height 195 cm, Weight 98 kg, No. of children 4, Employed: Yes
Person 3: Height 170 cm, Weight 80 kg, No. of children 2, Employed: No

The sequence of random variables X₁ = Height, X₂ = Weight, X₃ = Number of children, X₄ = Employed is an example of a random vector. Generally, a random vector X = (X₁, X₂, ..., Xₙ) assigns a vector of real numbers to each outcome, that is, it maps the sample space into Rⁿ = R × R × ... × R (n times).

Given a probability space (Ω, A, P), a mapping X = (X₁, ..., Xₙ): Ω → Rⁿ is called a random vector if

{ω ∈ Ω : X₁(ω) < x₁, ..., Xₙ(ω) < xₙ} ∈ A for every (x₁, ..., xₙ) ∈ Rⁿ.

Probability distribution of a random vector

Let us consider a probability space (Ω, A, P). For a random vector X = (X₁, ..., Xₙ) we define its probability distribution as follows:

F(x₁, ..., xₙ) = P(X₁ < x₁, ..., Xₙ < xₙ).

Properties of the distribution of a random vector

The probability distribution F(x₁, ..., xₙ) of a random vector has the following properties:

0 ≤ F(x₁, ..., xₙ) ≤ 1;
F(x₁, ..., xₙ) → 0 if any xᵢ → −∞, and F(x₁, ..., xₙ) → 1 if all xᵢ → ∞;
F is increasing (non-decreasing) and continuous on the left in each of its independent variables.

Discrete random vectors

A random vector X = (X₁, ..., Xₙ) is called discrete if its range is a finite or countable set of real vectors. For a discrete random vector, we can define the probability function

p(x₁, ..., xₙ) = P(X₁ = x₁, ..., Xₙ = xₙ).

The relationship between the probability distribution and the probability function:

F(x₁, ..., xₙ) = Σ p(t₁, ..., tₙ), where the summation extends over all points (t₁, ..., tₙ) of the range with t₁ < x₁, ..., tₙ < xₙ.
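A small sketch of this summation for n = 2, using a hypothetical joint probability function (the table is made up purely for illustration):

```python
# Sketch: the distribution of a discrete random vector obtained by summing its
# probability function, F(x1, x2) = sum of p(t1, t2) over t1 < x1, t2 < x2.
# The joint table below is hypothetical, used only for illustration.
p = {
    (0, 0): 0.10, (0, 1): 0.20,
    (1, 0): 0.30, (1, 1): 0.40,
}

def F(x1, x2):
    """Distribution function of (X1, X2) under the left-continuous convention."""
    return sum(prob for (t1, t2), prob in p.items() if t1 < x1 and t2 < x2)

print(F(1, 1))      # P(X1 < 1, X2 < 1) = p(0, 0) = 0.1
print(F(2, 2))      # all the mass: 1.0
```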

Continuous random vectors

A random vector X = (X₁, ..., Xₙ) is called continuous if its range includes a Cartesian product of n intervals. If a function f(x₁, ..., xₙ) exists such that

F(x₁, ..., xₙ) = ∫ from −∞ to x₁ ... ∫ from −∞ to xₙ of f(t₁, ..., tₙ) dtₙ ... dt₁,

we say that f(x₁, ..., xₙ) is the probability density of the random vector X.

Marginal distributions

For a random vector X = (X₁, ..., Xₙ) with a distribution F(x₁, ..., xₙ), we define the marginal distributions

Fᵢ(xᵢ) = lim F(x₁, ..., xₙ),

where the limit is taken with all variables except xᵢ tending to infinity.

If a random vector is discrete, we define its marginal probability functions

pᵢ(xᵢ) = Σ p(x₁, ..., xₙ),

where the summation is done over all values of the variables other than xᵢ.

If a random vector is continuous, we define its marginal probability densities

fᵢ(xᵢ) = ∫ ... ∫ f(x₁, ..., xₙ),

where we integrate over all the variables other than xᵢ, each from −∞ to ∞.
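A minimal numerical sketch of integrating out one variable, assuming a hypothetical joint density not taken from the slides: f(x, y) = e^(−x−y) for x, y > 0, whose marginal in x is e^(−x). It uses scipy's quad routine for the one-dimensional integral.

```python
import numpy as np
from scipy import integrate

# Sketch: marginal density f_X(x) = integral of f(x, y) dy over all y.
# Hypothetical joint density (not from the slides): f(x, y) = exp(-x - y) for x, y > 0,
# whose marginal in x is exp(-x).
def f(x, y):
    return np.exp(-x - y) if (x > 0 and y > 0) else 0.0

for x in [0.5, 1.0, 2.0]:
    marginal, _ = integrate.quad(lambda y: f(x, y), 0, np.inf)
    print(f"x={x}: numerical marginal={marginal:.4f}, exact exp(-x)={np.exp(-x):.4f}")
```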

By considering a marginal distribution of a random vector we actually define a single random variable Xᵢ by "neglecting" all other variables.

Example for n = 2. Let (X,Y) be a discrete random vector with X taking on values from the set {1,2,3,4}, Y from the set {-1,1,3,5,7}, and the probability function given by the table below. Calculate the marginal probability functions of the random variables X and Y.
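The slide's table is not reproduced in this transcript, so the sketch below uses a hypothetical joint probability function of the same shape (uniform over the 20 cells); only the method of computing the marginals matters here.

```python
import numpy as np

# Sketch of the computation asked for in the example. The joint probability function
# below is hypothetical (the slide's table is not in the transcript).
x_vals = [1, 2, 3, 4]
y_vals = [-1, 1, 3, 5, 7]
p = np.full((len(x_vals), len(y_vals)), 1 / 20)   # p(x, y); rows indexed by x, columns by y

p_x = p.sum(axis=1)   # marginal probability function of X: sum over all y
p_y = p.sum(axis=0)   # marginal probability function of Y: sum over all x

print("p_X:", {x: round(float(px), 3) for x, px in zip(x_vals, p_x)})
print("p_Y:", {y: round(float(py), 3) for y, py in zip(y_vals, p_y)})
```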

Let X = (X₁, ..., Xₙ) be a random vector with a distribution F(x₁, ..., xₙ) and marginal distributions F₁(x₁), ..., Fₙ(xₙ). If

F(x₁, ..., xₙ) = F₁(x₁) F₂(x₂) ··· Fₙ(xₙ),

we say that X₁, ..., Xₙ are independent random variables. If X₁, ..., Xₙ are independent and have a probability function p or a density f, it can be proved that also

p(x₁, ..., xₙ) = p₁(x₁) ··· pₙ(xₙ) or f(x₁, ..., xₙ) = f₁(x₁) ··· fₙ(xₙ).
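For a discrete pair this gives a direct check of independence: test whether p(x, y) = p_X(x) · p_Y(y) in every cell. The sketch below uses a hypothetical table built in product form, so the check succeeds by construction.

```python
import numpy as np

# Sketch: discrete X and Y are independent iff p(x, y) = p_X(x) * p_Y(y) for every cell.
# Hypothetical joint table of product form, so the check should report True.
p = np.outer([0.2, 0.3, 0.5], [0.4, 0.6])   # p(x, y) = p_X(x) * p_Y(y) by construction

p_x = p.sum(axis=1)                          # marginal of X
p_y = p.sum(axis=0)                          # marginal of Y
independent = np.allclose(p, np.outer(p_x, p_y))
print("independent:", independent)           # True for this table
```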

Correlation coefficient of two random variables

Let us consider a random vector (X,Y) with a distribution F(x,y). Using the marginal distributions F_X(x) and F_Y(y), we can define the expected values E(X) and E(Y) and the variances D(X) and D(Y). We define the covariance of the random vector (X,Y) as

cov(X,Y) = E(XY) − E(X)E(Y), with

E(XY) = Σ Σ x y p(x,y) or E(XY) = ∫∫ x y f(x,y) dx dy,

depending on whether (X,Y) is discrete or continuous, where p(x,y) is the probability function or f(x,y) the density.

The correlation coefficient is then defined as

ρ(X,Y) = cov(X,Y) / √(D(X) D(Y)),

provided D(X) > 0 and D(Y) > 0. It has the following properties:

−1 ≤ ρ(X,Y) ≤ 1;
if X and Y are independent, then ρ(X,Y) = 0;
ρ(X,Y) = 1 or ρ(X,Y) = −1 exactly when Y is an increasing or decreasing linear function of X, respectively.

Example. Calculate the correlation coefficient of the discrete random vector (X,Y) with a probability function given by the table below.
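The slide's table is not reproduced in the transcript, so the sketch below carries out the calculation on a hypothetical joint table; the formulas are the ones defined above.

```python
import numpy as np

# Sketch: correlation coefficient of a discrete vector (X, Y) from its joint probability
# function. The table is hypothetical (the slide's table is not in the transcript).
x_vals = np.array([0.0, 1.0])
y_vals = np.array([0.0, 1.0, 2.0])
p = np.array([[0.10, 0.20, 0.10],
              [0.20, 0.15, 0.25]])           # p(x, y); rows = x, columns = y

p_x, p_y = p.sum(axis=1), p.sum(axis=0)      # marginal probability functions
EX, EY = (x_vals * p_x).sum(), (y_vals * p_y).sum()
EXY = (np.outer(x_vals, y_vals) * p).sum()
DX = ((x_vals - EX) ** 2 * p_x).sum()
DY = ((y_vals - EY) ** 2 * p_y).sum()

cov = EXY - EX * EY
rho = cov / np.sqrt(DX * DY)
print(f"cov(X,Y)={cov:.4f}, rho(X,Y)={rho:.4f}")
```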

Note that, generally, it is not true that ρ(X,Y) = 0 implies that X and Y are independent, as shown by the following example. The correlation coefficient is zero, as can easily be calculated, but we have, for example, a pair of values with p(x,y) ≠ p_X(x) p_Y(y), so that X and Y cannot be independent.
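The slide's own table is not in the transcript, so the sketch below uses a standard textbook illustration of the same point (possibly different from the slide's example): X uniform on {-1, 0, 1} and Y = X². Then cov(X,Y) = 0, yet P(X=0, Y=1) = 0 while P(X=0)·P(Y=1) = (1/3)·(2/3), so X and Y are not independent.

```python
import numpy as np

# Sketch with a standard example (not necessarily the slide's table): X uniform on
# {-1, 0, 1} and Y = X^2. The covariance is zero, but X and Y are dependent.
x_vals = np.array([-1.0, 0.0, 1.0])
y_vals = np.array([0.0, 1.0])
p = np.array([[0.0, 1/3],    # x = -1: y = 1 with probability 1/3
              [1/3, 0.0],    # x =  0: y = 0 with probability 1/3
              [0.0, 1/3]])   # x =  1: y = 1 with probability 1/3

p_x, p_y = p.sum(axis=1), p.sum(axis=0)
EX, EY = (x_vals * p_x).sum(), (y_vals * p_y).sum()
EXY = (np.outer(x_vals, y_vals) * p).sum()
print("cov(X,Y) =", EXY - EX * EY)                            # 0.0
print("P(X=0,Y=1) =", p[1, 1], " vs  P(X=0)P(Y=1) =", p_x[1] * p_y[1])
```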