Handout Ch 4 Practice Session

Calculus Review, Round 2 (1): Example. Jia-Ying Chen

Calculus Review, Round 2 (2): Change of Variables, Example ("what on earth is this?"). Jia-Ying Chen

Calculus Review, Round 2 (3): Supplementary Integration for Ch 4. Jia-Ying Chen

歸去來析 (a pun on "returning to analysis"; in Taiwanese it sounds like "might as well just die"). Jia-Ying Chen

Expectation of a Random Variable. Discrete distribution: E(X) = Σ_x x·f(x). Continuous distribution: E(X) = ∫ x·f(x) dx. E(X) is called the expected value, mean, or expectation of X, and can be regarded as the center of gravity of the distribution. E(X) exists if and only if Σ_x |x|·f(x) < ∞ in the discrete case, or ∫ |x|·f(x) dx < ∞ in the continuous case. Whenever X is a bounded random variable, E(X) must exist. Jia-Ying Chen
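A quick numerical illustration of the two formulas (a sketch added here, not from the original slides; the p.m.f. and the p.d.f. f(x) = 2x on (0, 1) are made-up examples):

    # Expectation of a discrete and of a continuous random variable.
    # The p.m.f. and the p.d.f. f(x) = 2x on (0, 1) are made-up examples.
    from scipy import integrate

    # Discrete case: E(X) = sum over x of x * f(x)
    pmf = {1: 0.2, 2: 0.5, 3: 0.3}
    print(sum(x * p for x, p in pmf.items()))            # 2.1

    # Continuous case: E(X) = integral of x * f(x) dx
    f = lambda x: 2 * x
    mean, _ = integrate.quad(lambda x: x * f(x), 0, 1)
    print(mean)                                          # ~0.6667 = 2/3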

The Expectation of a Function. Let Y = r(X); then E(Y) = E[r(X)] = Σ_x r(x)·f(x) in the discrete case, and E(Y) = ∫ r(x)·f(x) dx in the continuous case. Example (worked on the original slide): given a p.d.f. for X and a function Y = r(X), it can be shown that E(Y) is obtained by substituting directly into this formula, without first finding the distribution of Y. Jia-Ying Chen
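As a worked instance (with a made-up p.d.f., not the one shown on the original slide): take f(x) = 2x for 0 < x < 1 and Y = r(X) = X². Then E(Y) = ∫₀¹ x²·2x dx = 2/4 = 1/2, computed directly from the p.d.f. of X.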

Example 1 (4.1.3). In a class of 50 students, the number of students n_i of each age i is shown in the following table. If a student is to be selected at random from the class, what is the expected value of his age?
Age i: 18, 19, 20, 21, 25
n_i:   20, 22,  4,  3,  1
Jia-Ying Chen

Solution. E[X] = 18*0.40 + 19*0.44 + 20*0.08 + 21*0.06 + 25*0.02 = 18.92
Age i: 18,   19,   20,   21,   25
n_i:   20,   22,    4,    3,    1
p_i:   0.40, 0.44, 0.08, 0.06, 0.02
Jia-Ying Chen
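The same computation in a short script (values taken from the table above):

    # Expected age: E[X] = sum over i of age_i * p_i, with p_i = n_i / 50.
    ages   = [18, 19, 20, 21, 25]
    counts = [20, 22,  4,  3,  1]
    print(sum(a * n for a, n in zip(ages, counts)) / sum(counts))   # 18.92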

Properties of Expectations.
If there exists a constant a such that Pr(X ≥ a) = 1, then E(X) ≥ a; similarly, if Pr(X ≤ b) = 1, then E(X) ≤ b.
If X1, …, Xn are n random variables such that each E(Xi) exists, then E(X1 + … + Xn) = E(X1) + … + E(Xn).
For all constants a1, …, an and b, E(a1·X1 + … + an·Xn + b) = a1·E(X1) + … + an·E(Xn) + b.
Usually E[g(X)] ≠ g(E(X)); only linear functions g satisfy E[g(X)] = g(E(X)).
If X1, …, Xn are n independent random variables such that each E(Xi) exists, then E(X1·X2···Xn) = E(X1)·E(X2)···E(Xn).
Jia-Ying Chen
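A Monte Carlo sanity check of the last two properties (an illustrative sketch; the distributions of X and Y are arbitrary choices):

    # Linearity: E(2X + 3Y + 1) = 2E(X) + 3E(Y) + 1.
    # Independence: E(XY) = E(X)E(Y).
    import numpy as np

    rng = np.random.default_rng(0)
    x = rng.exponential(scale=2.0, size=1_000_000)   # E(X) = 2
    y = rng.uniform(0.0, 1.0, size=1_000_000)        # E(Y) = 0.5, independent of X

    print(np.mean(2 * x + 3 * y + 1))                # close to 2*2 + 3*0.5 + 1 = 6.5
    print(np.mean(x * y), np.mean(x) * np.mean(y))   # both close to 2 * 0.5 = 1.0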

Example 2 (4.2.7) Suppose that on each play of a certain game a gambler is equally likely to win or to lose. Suppose that when he wins, his fortune is doubled; and when he loses, his fortune is cut in half. If he begins playing with a given fortune c, what is the expected value of his fortune after n independent plays of the game? Jia-Ying Chen

Solution Jia-Ying Chen
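The worked steps appeared as an image on the original slide; a derivation consistent with the problem statement (a sketch, not the slide's own wording): after each play the fortune is multiplied by Xi, where Xi equals 2 or 1/2 with probability 1/2 each and the Xi are independent, so the fortune after n plays is c·X1·X2···Xn, and E(c·X1···Xn) = c·E(X1)···E(Xn) = c·((1/2)·2 + (1/2)·(1/2))^n = c·(5/4)^n.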

Properties of the Variance. Var(X) = 0 if and only if there exists a constant c such that Pr(X = c) = 1. For constants a and b, Var(aX + b) = a²·Var(X). Proof: with μ = E(X), Var(aX + b) = E[(aX + b − (aμ + b))²] = E[a²·(X − μ)²] = a²·Var(X). Jia-Ying Chen

Properties of the Variance. If X1, …, Xn are independent random variables, then Var(X1 + … + Xn) = Var(X1) + … + Var(Xn). More generally, if X1, …, Xn are independent random variables and a1, …, an, b are constants, then Var(a1·X1 + … + an·Xn + b) = a1²·Var(X1) + … + an²·Var(Xn). Jia-Ying Chen
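A Monte Carlo sanity check of these two rules (an illustrative sketch with arbitrary distributions):

    # Var(aX + b) = a^2 Var(X), and Var(X + Y) = Var(X) + Var(Y) for independent X, Y.
    import numpy as np

    rng = np.random.default_rng(3)
    x = rng.normal(0.0, 2.0, size=1_000_000)       # Var(X) = 4
    y = rng.exponential(1.0, size=1_000_000)       # Var(Y) = 1, independent of X

    print(np.var(3 * x + 7), 9 * np.var(x))        # both close to 36
    print(np.var(x + y), np.var(x) + np.var(y))    # both close to 5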

Example 3 (4.3.4). Suppose that X is a random variable for which E(X) = μ and Var(X) = σ². Show that … Jia-Ying Chen

Solution Jia-Ying Chen
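The identity to be shown and its proof were images on the original slides and are not in the transcript. Assuming the exercise is the standard DeGroot & Schervish one for this setup, namely to show E[X(X − 1)] = μ(μ − 1) + σ², a one-line derivation: E[X(X − 1)] = E(X²) − E(X) = (σ² + μ²) − μ = μ(μ − 1) + σ², using Var(X) = E(X²) − [E(X)]².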

Moment Generating Functions. Consider a given random variable X, and for each real number t let ψ(t) = E(e^{tX}). The function ψ is called the moment generating function (m.g.f.) of X. Suppose that the m.g.f. of X exists for all values of t in some open interval around t = 0. Then ψ′(0) = E(X). More generally, the n-th derivative at zero gives the n-th moment: ψ^(n)(0) = E(X^n). Jia-Ying Chen
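A small symbolic check of these moment formulas (illustrative only; the m.g.f. below is that of a Poisson(λ) variable, an arbitrary choice):

    # Differentiate an m.g.f. at t = 0 to recover moments: psi'(0) = E(X), psi''(0) = E(X^2).
    import sympy as sp

    t, lam = sp.symbols('t lam', positive=True)
    psi = sp.exp(lam * (sp.exp(t) - 1))              # m.g.f. of a Poisson(lam) random variable

    mean = sp.diff(psi, t).subs(t, 0)                # E(X) = lam
    second_moment = sp.diff(psi, t, 2).subs(t, 0)    # E(X^2) = lam + lam^2
    variance = sp.simplify(second_moment - mean**2)
    print(mean, variance)                            # lam  lam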

Properties of Moment Generating Functions. Let X have m.g.f. ψ1, and let Y = aX + b have m.g.f. ψ2. Then for every value of t such that ψ1(at) exists, ψ2(t) = e^{bt}·ψ1(at). Proof: ψ2(t) = E(e^{tY}) = E(e^{t(aX+b)}) = e^{bt}·E(e^{(at)X}) = e^{bt}·ψ1(at). Suppose that X1, …, Xn are n independent random variables, and for i = 1, …, n let ψi denote the m.g.f. of Xi. Let Y = X1 + … + Xn, and let the m.g.f. of Y be denoted by ψ. Then for every value of t such that each ψi(t) exists, we have ψ(t) = ψ1(t)·ψ2(t)···ψn(t). Jia-Ying Chen

The m.g.f. for the Binomial Distribution. Suppose that a random variable X has a binomial distribution with parameters n and p. We can represent X as the sum of n independent Bernoulli random variables X1, …, Xn, each with Pr(Xi = 1) = p and Pr(Xi = 0) = 1 − p. Determine the m.g.f. of X: each Xi has m.g.f. ψi(t) = E(e^{tXi}) = p·e^t + (1 − p), so by the product property on the previous slide, ψ(t) = (p·e^t + 1 − p)^n. Jia-Ying Chen
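A quick numerical check of this formula (arbitrary choices of n, p, and t):

    # Compare a Monte Carlo estimate of E(e^{tX}) for X ~ Binomial(n, p)
    # with the closed form (p*e^t + 1 - p)^n.
    import numpy as np

    n, p, t = 10, 0.3, 0.5
    rng = np.random.default_rng(1)
    x = rng.binomial(n, p, size=1_000_000)
    print(np.mean(np.exp(t * x)))                    # Monte Carlo estimate, ~5.9
    print((p * np.exp(t) + 1 - p) ** n)              # closed form, ~5.9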

Uniqueness of Moment Generating Functions. If the m.g.f.'s of two random variables X1 and X2 are identical for all values of t in an open interval around t = 0, then the probability distributions of X1 and X2 must be identical. The additive property of the binomial distribution: suppose X1 and X2 are independent random variables with binomial distributions with parameters n1 and p, and n2 and p, respectively. Let the m.g.f. of X1 + X2 be denoted by ψ; then ψ(t) = (p·e^t + 1 − p)^{n1}·(p·e^t + 1 − p)^{n2} = (p·e^t + 1 − p)^{n1 + n2}. By uniqueness, the distribution of X1 + X2 must be the binomial distribution with parameters n1 + n2 and p. Jia-Ying Chen
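A simulation sketch of the additive property (arbitrary parameters):

    # The sum of independent Binomial(n1, p) and Binomial(n2, p) variables
    # has the same mean and variance as Binomial(n1 + n2, p).
    import numpy as np

    rng = np.random.default_rng(2)
    n1, n2, p, size = 6, 9, 0.4, 1_000_000
    s = rng.binomial(n1, p, size) + rng.binomial(n2, p, size)
    direct = rng.binomial(n1 + n2, p, size)
    print(s.mean(), direct.mean())   # both close to (n1 + n2) * p = 6.0
    print(s.var(), direct.var())     # both close to (n1 + n2) * p * (1 - p) = 3.6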

Example 4 (4.4.8). Suppose that X is a random variable for which the m.g.f. is as follows: … Find the mean and the variance of X. Jia-Ying Chen

Solution Jia-Ying Chen
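The specific m.g.f. was not captured in the transcript, so only the general method is noted here: whatever ψ(t) is, the mean is E(X) = ψ′(0) and the variance is Var(X) = ψ″(0) − [ψ′(0)]².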

Properties of Variance and Covariance. If X and Y are random variables such that Var(X) < ∞ and Var(Y) < ∞, then Var(X + Y) = Var(X) + Var(Y) + 2·Cov(X, Y). Correlation only measures linear relationship: two random variables can be dependent but uncorrelated. Example: suppose that X can take only the three values −1, 0, and 1, and that each of these three values has the same probability. Let Y = X². Then X and Y are dependent, yet E(XY) = E(X³) = E(X) = 0, so Cov(X, Y) = E(XY) − E(X)·E(Y) = 0 (uncorrelated). Jia-Ying Chen
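A direct enumeration of the dependent-but-uncorrelated example above:

    # X is uniform on {-1, 0, 1}; Y = X^2 is a function of X, yet Cov(X, Y) = 0.
    xs = [-1, 0, 1]
    p = 1 / 3
    ex  = sum(x * p for x in xs)            # E(X) = 0
    ey  = sum(x**2 * p for x in xs)         # E(Y) = 2/3
    exy = sum(x * x**2 * p for x in xs)     # E(XY) = E(X^3) = 0
    print(exy - ex * ey)                    # Cov(X, Y) = 0.0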

Example 5 (4.6.11). Show that two random variables X and Y cannot possibly have the following properties: E(X) = 3, E(Y) = 2, E(X²) = 10, E(Y²) = 29, and E(XY) = 0. Jia-Ying Chen

Solution Jia-Ying Chen
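The slide's worked solution was not captured; a derivation consistent with the stated values: Var(X) = E(X²) − [E(X)]² = 10 − 9 = 1, Var(Y) = 29 − 4 = 25, and Cov(X, Y) = E(XY) − E(X)·E(Y) = 0 − 6 = −6. The implied correlation would be ρ(X, Y) = −6 / (√1·√25) = −1.2, which is impossible because |ρ(X, Y)| ≤ 1 (equivalently, the Cauchy–Schwarz inequality requires [Cov(X, Y)]² ≤ Var(X)·Var(Y), but 36 > 25). Hence no such X and Y exist.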