K. Desch – Statistical methods of data analysis SS10


2. Probability 2.3 Joint p.d.f.'s of several random variables
Examples: an experiment yields several simultaneous measurements (e.g. temperature and pressure).
Joint p.d.f. (here only for 2 variables):
f(x,y) dx dy = probability that x ∈ [x, x+dx] and y ∈ [y, y+dy]
Normalization: ∫∫ f(x,y) dx dy = 1
Individual probability distributions ("marginal p.d.f.s") for x and y:
f_x(x) = ∫ f(x,y) dy,  f_y(y) = ∫ f(x,y) dx
yield the probability density for x (or y) irrespective of y (or x).
x and y are statistically independent if f(x,y) = f_x(x) f_y(y), i.e. if the conditional density of x is the same for any y (and vice versa).
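As a small numeric sketch (toy numbers, not from the lecture), the normalization, the marginals, and the factorization condition for independence can be checked on a discrete grid:

```python
# Toy discrete joint p.d.f. f[i][j] for x in {0,1,2}, y in {0,1}.
# The numbers are hypothetical, chosen here so that x and y are independent.
fx = [0.2, 0.5, 0.3]
fy = [0.4, 0.6]
f = [[px * py for py in fy] for px in fx]   # f(x,y) = fx(x) * fy(y)

total = sum(sum(row) for row in f)                           # normalization
marg_x = [sum(row) for row in f]                             # sum out y
marg_y = [sum(f[i][j] for i in range(3)) for j in range(2)]  # sum out x
independent = all(abs(f[i][j] - marg_x[i] * marg_y[j]) < 1e-12
                  for i in range(3) for j in range(2))
```

Summing out one variable recovers the marginal of the other, and the joint p.d.f. equals the product of its marginals exactly because the grid was built that way.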

Conditional p.d.f.s:
h(y|x) = f(x,y) / f_x(x)
h(y|x) dy is the probability for an event to lie in the interval [y, y+dy] when the event is known to lie in the interval [x, x+dx].
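In the discrete case the same formula reads h(y|x) = f(x,y)/f_x(x); a minimal sketch with a hypothetical 2×2 joint p.m.f. shows that each conditional distribution is again normalized:

```python
# Hypothetical joint p.m.f.: rows index x, columns index y.
f = [[0.10, 0.20],
     [0.30, 0.40]]
marg_x = [sum(row) for row in f]                 # fx(x), summing out y

# Conditional h(y|x) = f(x,y) / fx(x); each row of h must sum to 1.
h = [[f[i][j] / marg_x[i] for j in range(2)] for i in range(2)]
row_sums = [sum(row) for row in h]
```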

Example: measurement of the length of a bar and of its temperature.
x = deviation from 800 mm, y = temperature in °C
a) 2-dimensional histogram ("scatter plot")
b) Marginal distribution of y ("y-projection")
c) Marginal distribution of x ("x-projection")
d) Two conditional distributions of x (see the bands marked in (a))
The width in d) is smaller than in c): x and y are "correlated".

Expectation value (analogous to the 1-dim. case):
E[x] = ∫∫ x f(x,y) dx dy
Variance (analogous to the 1-dim. case):
V[x] = E[(x − E[x])²]
Covariance for 2 variables x, y with joint p.d.f. f(x,y):
cov[x,y] = E[(x − μ_x)(y − μ_y)] = E[xy] − μ_x μ_y
It becomes important when there is more than one variable: it is a measure of the correlation of the variables.
If x, y are statistically independent (f(x,y) = f_x(x) f_y(y)) then cov[x,y] = 0 (but not vice versa!!)
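The "not vice versa" caveat can be made concrete with a standard counterexample (not from the slides): x uniform on {−1, 0, 1} and y = x² have zero covariance although y is completely determined by x.

```python
# cov[x,y] = E[xy] - E[x]E[y] for x uniform on {-1, 0, 1} and y = x^2.
xs = [-1, 0, 1]
ps = [1/3, 1/3, 1/3]

Ex  = sum(p * x           for p, x in zip(ps, xs))   # E[x]   = 0
Ey  = sum(p * x * x       for p, x in zip(ps, xs))   # E[y]   = 2/3
Exy = sum(p * x * (x * x) for p, x in zip(ps, xs))   # E[xy]  = E[x^3] = 0

cov = Exy - Ex * Ey   # = 0, even though y is a function of x
```

So zero covariance does not imply independence; it only detects linear dependence.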

Positive correlation: a positive (negative) deviation of x from its mean μ_x increases the probability that y has a positive (negative) deviation from its mean μ_y.
For the sum of random variables x+y:
V[x+y] = V[x] + V[y] + 2 cov[x,y]  (proof: linearity of E[·])
For n random variables x_i, i = 1,…,n:
V_ij = cov[x_i, x_j] is the covariance matrix (a symmetric matrix).
Diagonal elements: V_ii = σ_i²
For uncorrelated variables the covariance matrix is diagonal.
Normalized quantity:
ρ_xy = cov[x,y] / (σ_x σ_y)
is the correlation coefficient, with −1 ≤ ρ ≤ 1 for all elements.
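A Monte Carlo sketch (toy data, assuming y = x + Gaussian noise) checks the identity V[x+y] = V[x] + V[y] + 2 cov[x,y] and computes the correlation coefficient:

```python
import random

random.seed(1)
# Hypothetical correlated sample: y = x + noise.
xs = [random.gauss(0, 1) for _ in range(100_000)]
ys = [x + random.gauss(0, 0.5) for x in xs]

n = len(xs)
mx, my = sum(xs) / n, sum(ys) / n
vx  = sum((x - mx) ** 2 for x in xs) / n
vy  = sum((y - my) ** 2 for y in ys) / n
cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / n
rho = cov / (vx * vy) ** 0.5          # correlation coefficient

# Variance of the sum, computed directly from the samples.
s  = [x + y for x, y in zip(xs, ys)]
ms = sum(s) / n
vs = sum((t - ms) ** 2 for t in s) / n   # equals vx + vy + 2*cov
```

The identity holds exactly for sample variances (same normalization on both sides), independently of the sample size.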

Examples of correlation coefficients (the axis units play no role!)

One more example (figure from [Barlow]).

Another example (figure).

2. Probability 2.4 Transformation of variables
Measured quantity: x, distributed according to the p.d.f. f(x).
Derived quantity: y = a(x). What is the p.d.f. g(y) of y?
Define g(y) by requiring the same probability content for corresponding intervals: y falls in [y, y+dy] exactly when x falls in the corresponding [x, x+dx], so
g(y) |dy| = f(x) |dx|  →  g(y) = f(x(y)) |dx/dy|

The transformation is more tedious when x → y is not a one-to-one relation, e.g. y = x²: the interval [y, y+dy] corresponds to two branches, x > 0 and x < 0. For g(y), sum up the probabilities of all branches:
g(y) = Σ_branches f(x_b(y)) |dx_b/dy|
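As a sketch of the branch summation, take x ~ N(0,1) and y = x² (my choice of f; the slide shows the two-branch case generically). Summing f over the branches x = ±√y, each weighted by |dx/dy| = 1/(2√y), gives a p.d.f. that should integrate to 1:

```python
import math

def f(x):
    """Standard normal p.d.f. (assumed input distribution)."""
    return math.exp(-x * x / 2) / math.sqrt(2 * math.pi)

def g(y):
    """p.d.f. of y = x^2: sum over the two branches x = +sqrt(y), -sqrt(y)."""
    r = math.sqrt(y)
    return (f(r) + f(-r)) / (2 * r)   # |dx/dy| = 1/(2 sqrt(y)) per branch

# Numeric normalization check on (0, 20] with the midpoint rule.
dy = 1e-4
integral = sum(g(dy * (i + 0.5)) * dy for i in range(200_000))
```

(g(y) is the chi-squared distribution with one degree of freedom; the small deficit in the numeric integral comes from the integrable singularity at y = 0.)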

Functions of several variables: the transformation goes through the Jacobian matrix:
g(y_1,…,y_n) = f(x_1,…,x_n) |det J|,  with J_ij = ∂x_i/∂y_j

Example: Gaussian momentum distribution. Momentum components in x and y:
f(x,y) = (1/(2πσ²)) exp(−(x² + y²)/(2σ²))
Polar coordinates: x = r cos φ, y = r sin φ, r² = x² + y², det J = r
→ g(r,φ) = f(x(r,φ), y(r,φ)) · |det J| = (r/(2πσ²)) exp(−r²/(2σ²))
In 3 dimensions this leads to the Maxwell distribution.
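A Monte Carlo sketch (my toy check, σ = 1 assumed): sampling 2-D Gaussian momenta and histogramming r = √(x²+y²) should reproduce the distribution obtained above after integrating over φ, g(r) = (r/σ²) exp(−r²/2σ²) (the Rayleigh distribution), whose mean is σ√(π/2):

```python
import math
import random

random.seed(2)
sigma = 1.0
# Sample 2-D Gaussian momenta and compute r = sqrt(x^2 + y^2).
rs = [math.hypot(random.gauss(0, sigma), random.gauss(0, sigma))
      for _ in range(200_000)]

# Rayleigh distribution g(r) = (r/sigma^2) exp(-r^2 / (2 sigma^2))
# has mean sigma * sqrt(pi/2).
mean_r   = sum(rs) / len(rs)
expected = sigma * math.sqrt(math.pi / 2)
```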

2. Probability 2.5 Error propagation
Often one is not interested in the complete transformation of the p.d.f., but only in the transformation of its variance (= squared error): measured error of x → derived error of y.
When σ_x is small relative to the curvature of y(x), the linear approach applies:
y(x) ≈ y(μ_x) + (dy/dx)|_{μ_x} (x − μ_x)
What about the variance?

Variance:
σ_y² = E[(y − E[y])²] ≈ (dy/dx)²|_{μ_x} σ_x²
→ σ_y = |dy/dx| σ_x

For several derived variables y_k:
cov[y_k, y_l] = Σ_{i,j} (∂y_k/∂x_i)(∂y_l/∂x_j) cov[x_i, x_j]
This is the general formula for error propagation (in linear approximation).
Special case:
a) uncorrelated x_j: cov[x_i, x_j] = δ_ij σ_j², and
σ_{y_k}² = Σ_j (∂y_k/∂x_j)² σ_j²
Note: even if the x_i are uncorrelated, the y_k are in general correlated.
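In matrix form the general formula reads U = A V Aᵀ, with A the Jacobian ∂y_k/∂x_i. A minimal sketch (my toy choice y₁ = x₁ + x₂, y₂ = x₁ − x₂) also illustrates the remark above: with a diagonal input covariance matrix, the outputs come out correlated whenever σ₁ ≠ σ₂:

```python
# Linear error propagation U = A V A^T.
V = [[4.0, 0.0],
     [0.0, 1.0]]        # covariance matrix of x: uncorrelated, sigma1^2=4, sigma2^2=1
A = [[1.0,  1.0],
     [1.0, -1.0]]       # Jacobian dy_k/dx_i for y1 = x1 + x2, y2 = x1 - x2

def matmul(P, Q):
    return [[sum(P[i][k] * Q[k][j] for k in range(len(Q)))
             for j in range(len(Q[0]))] for i in range(len(P))]

def transpose(P):
    return [list(col) for col in zip(*P)]

U = matmul(matmul(A, V), transpose(A))   # covariance matrix of y
# U = [[5, 3], [3, 5]]: the off-diagonal element 3 = sigma1^2 - sigma2^2 != 0.
```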

b) Sum y = x_1 + x_2:
σ_y² = σ_1² + σ_2²  → errors add in quadrature
c) Product y = x_1 x_2:
(σ_y/y)² = (σ_1/x_1)² + (σ_2/x_2)²  → relative errors add in quadrature
Both assume that x_1 and x_2 are uncorrelated!
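A quick numeric sketch of both special cases (hypothetical values x₁ = 10 ± 0.3, x₂ = 20 ± 0.4, assumed uncorrelated):

```python
import math

x1, s1 = 10.0, 0.3
x2, s2 = 20.0, 0.4

# Sum: absolute errors add in quadrature.
s_sum = math.hypot(s1, s2)                   # sqrt(0.3^2 + 0.4^2) = 0.5

# Product: relative errors add in quadrature.
y      = x1 * x2
s_prod = y * math.hypot(s1 / x1, s2 / x2)    # 200 * sqrt(0.03^2 + 0.02^2)
```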

2. Probability 2.6 Convolution
Typical case: a probability distribution involves two random variables x, y through their sum w = x + y. Then w is also a random variable. What is the p.d.f. of w when f_x(x) and f_y(y) are known?
Convolution:
f_w(w) = ∫ f_x(x) f_y(w − x) dx
Example: x follows a Breit-Wigner resonance, y a Gaussian experimental resolution.
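A numeric sketch of exactly this example (my choice of widths: Γ/2 = 0.5 for the Breit-Wigner, σ = 0.3 for the Gaussian resolution): evaluating the convolution integral on a grid shows that the smeared peak is lower and broader than the bare resonance.

```python
import math

def bw(x, g=0.5):
    """Breit-Wigner (Cauchy) p.d.f. centered at 0, half-width g (assumed)."""
    return g / (math.pi * (x * x + g * g))

def gauss(x, s=0.3):
    """Gaussian resolution p.d.f., sigma = s (assumed)."""
    return math.exp(-x * x / (2 * s * s)) / (s * math.sqrt(2 * math.pi))

dx = 0.01
grid = [dx * i for i in range(-2000, 2001)]        # x in [-20, 20]

def fw(w):
    """f_w(w) = integral of f_x(x) f_y(w - x) dx, midpoint sum on the grid."""
    return sum(bw(x) * gauss(w - x) * dx for x in grid)

peak_bw = bw(0.0)    # bare resonance peak = 1/(pi*g)
peak_w  = fw(0.0)    # smeared (Voigt) peak: lower than peak_bw
```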

3. Distributions
Important probability distributions:
- Binomial distribution
- Poisson distribution
- Gaussian distribution
- Cauchy (Breit-Wigner) distribution
- Chi-squared distribution
- Landau distribution
- Uniform distribution
Central limit theorem

3. Distributions 3.1 Binomial distribution
The binomial distribution appears when a trial has exactly two possible outcomes (success/failure, head/tail, even/odd, …): event "success" with probability p, event "failure" with probability q = 1 − p.
Example: (ideal) coins. Probability for "head" (A): p = 0.5, q = 0.5.
Probability to get k heads in n = 4 trials:
k=0: P = (1−p)⁴ = 1/16
k=1: P = p(1−p)³ times the number of combinations (HTTT, THTT, TTHT, TTTH) = 4·1/16 = 1/4
k=2: P = p²(1−p)² times (HHTT, HTTH, TTHH, HTHT, THTH, THHT) = 6·1/16 = 3/8
k=3: P = p³(1−p) times (HHHT, HHTH, HTHH, THHH) = 4·1/16 = 1/4
k=4: P = p⁴ = 1/16
Check: P(0)+P(1)+P(2)+P(3)+P(4) = 1/16 + 1/4 + 3/8 + 1/4 + 1/16 = 1, ok.

Number of permutations for k successes in n trials: the binomial coefficient
C(n,k) = n! / (k! (n−k)!)
Binomial distribution:
P(k; n, p) = C(n,k) p^k (1−p)^(n−k)
- Discrete probability distribution
- Random variable: k
- Depends on 2 parameters: n (number of trials) and p (success probability)
- The order in which the k successes appear plays no role
- The n trials must be independent
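The formula can be sketched directly with the standard-library binomial coefficient; applied to the coin example above it reproduces the probabilities 1/16, 1/4, 3/8, 1/4, 1/16:

```python
from math import comb

def binom_pmf(k, n, p):
    """P(k; n, p) = C(n,k) p^k (1-p)^(n-k)"""
    return comb(n, k) * p ** k * (1 - p) ** (n - k)

# The coin example: n = 4 tosses, p = 0.5.
probs = [binom_pmf(k, 4, 0.5) for k in range(5)]
```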

Normalisation:
Σ_{k=0}^{n} P(k; n, p) = (p + (1−p))^n = 1  (binomial theorem)
Expectation value (mean value):
E[k] = Σ_k k P(k; n, p) = np
(proof: factor np out of the sum and apply the normalization with n−1 trials and k−1 successes)

Variance:
V[k] = E[k²] − (E[k])² = np(1−p)
(proof: compute E[k(k−1)] = n(n−1)p² in the same way as the mean; then E[k²] = n(n−1)p² + np, and V[k] = E[k²] − n²p² = np(1−p))
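Both results can be verified numerically by summing over the p.m.f. (arbitrary example values n = 12, p = 0.3):

```python
from math import comb

def binom_pmf(k, n, p):
    return comb(n, k) * p ** k * (1 - p) ** (n - k)

n, p = 12, 0.3
mean = sum(k * binom_pmf(k, n, p) for k in range(n + 1))          # should be n*p
var  = sum((k - mean) ** 2 * binom_pmf(k, n, p) for k in range(n + 1))  # n*p*(1-p)
```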


Example: the HERA-B experiment muon spectrometer has 12 chambers; the efficiency of one chamber is ε = 95%.
Trigger condition: at least 11 out of 12 chambers hit.
ε_TOTAL = P(11; 12, 0.95) + P(12; 12, 0.95) = 88.2%
When the chambers reach only ε = 90%: ε_TOTAL = 65.9%
When one chamber fails completely, all 11 remaining chambers must fire: ε_TOTAL = P(11; 11, 0.95) = 56.9%
Random coincidences (noise): going from ε_BG = 10% to 20% (twice the noise) gives about 200x more background in ε_TOTAL_BG.
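The quoted trigger efficiencies follow directly from the binomial p.m.f.:

```python
from math import comb

def binom_pmf(k, n, p):
    return comb(n, k) * p ** k * (1 - p) ** (n - k)

# Trigger: at least 11 of 12 chambers hit.
eff_95 = binom_pmf(11, 12, 0.95) + binom_pmf(12, 12, 0.95)   # about 0.882
eff_90 = binom_pmf(11, 12, 0.90) + binom_pmf(12, 12, 0.90)   # about 0.659
# One chamber dead: all 11 remaining chambers must fire.
eff_one_dead = binom_pmf(11, 11, 0.95)                       # about 0.569
```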

Another example: the number of data points whose 1σ error bars cover the true value (p = 0.68) is binomially distributed.