Functions of random variables

Functions of random variables
Sometimes what we can measure is not what we are interested in!
Example: the mass of a binary-star system: we want the mass M, but can only measure the orbital velocity V and period P.
Whatever we do, we must conserve probability: if Y = g(X), the probability of X lying in [X, X + dX] must equal the probability of Y lying in the corresponding interval [Y, Y + dY]:
f(Y) |dY| = f(X) |dX|,   so   f(Y) = f(X) |dX/dY|
(Figure: the pdf f(X) is mapped through Y = g(X) into the pdf f(Y).)
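
As a numerical illustration (my addition, not part of the original slides), the following Python sketch checks the change-of-variables formula: draw X from a uniform distribution on (0, 1), transform to Y = -ln X, and compare a histogram of Y with f(Y) = f(X)|dX/dY| = exp(-Y). The agreement is approximate, since the histogram is normalised over the plotted range only.

    import numpy as np

    rng = np.random.default_rng(42)

    # Draw X ~ Uniform(0, 1) and transform through Y = g(X) = -ln(X).
    x = rng.uniform(size=100_000)
    y = -np.log(x)

    # Conserving probability gives f(Y) = f(X) |dX/dY| = 1 * exp(-Y),
    # i.e. Y should follow an exponential distribution.
    bins = np.linspace(0.0, 5.0, 26)
    hist, edges = np.histogram(y, bins=bins, density=True)
    centres = 0.5 * (edges[:-1] + edges[1:])

    for c, h in zip(centres, hist):
        print(f"Y = {c:4.2f}   histogram = {h:6.3f}   exp(-Y) = {np.exp(-c):6.3f}")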

Non-linear transformations
e.g. flux distributions vs. wavelength or frequency; fluxes and magnitudes:
– Gaussian distribution: X ~ G(X₀, σ²)
– The nonlinear transformation to magnitudes, M = -2.5 log₁₀ X, induces a bias.
– PROBLEM: evaluate ⟨M⟩ and σ(M) in terms of X₀ and σ.
(Figure: a Gaussian pdf f(X) mapped through M = -2.5 log X to a skewed pdf f(M).)

Nonlinear transformations bias the mean
To find ⟨Y⟩, use a Taylor expansion of Y = g(X) around X = ⟨X⟩:
g(X) ≈ g(⟨X⟩) + g′(⟨X⟩)(X - ⟨X⟩) + ½ g″(⟨X⟩)(X - ⟨X⟩)²
Hence, taking the expectation (the linear term averages to zero):
⟨Y⟩ ≈ g(⟨X⟩) + ½ g″(⟨X⟩) σ_X²
The second term is the bias: ⟨Y⟩ ≠ g(⟨X⟩) unless g is linear or σ_X is negligible.
(Figure: f(X) mapped through the curve Y = g(X) to f(Y); the curvature of g shifts ⟨Y⟩ away from g(⟨X⟩).)
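
As a hedged numerical check (my own sketch, not from the slides), this applies the bias formula to the flux-to-magnitude example of the previous slide, with illustrative values X₀ = 100 and σ = 10. For g(X) = -2.5 log₁₀ X we have g″(X) = 2.5/(X² ln 10), so the predicted bias is ½ g″(X₀) σ² = 1.25 σ²/(X₀² ln 10).

    import numpy as np

    rng = np.random.default_rng(1)

    X0, sigma = 100.0, 10.0                 # illustrative flux and its Gaussian error
    x = rng.normal(X0, sigma, size=1_000_000)

    m = -2.5 * np.log10(x)                  # nonlinear transformation M = g(X)

    g_at_mean = -2.5 * np.log10(X0)                           # g(<X>)
    predicted_bias = 1.25 * sigma**2 / (X0**2 * np.log(10))   # (1/2) g''(X0) sigma^2

    print("g(<X>)            =", g_at_mean)
    print("<M> (Monte Carlo) =", m.mean())
    print("measured bias     =", m.mean() - g_at_mean)
    print("predicted bias    =", predicted_bias)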

Variance of a transformed variable
Get the variance of Y from first principles:
Var(Y) = ⟨(Y - ⟨Y⟩)²⟩ ≈ ⟨[g′(⟨X⟩)(X - ⟨X⟩)]²⟩ = [g′(⟨X⟩)]² σ_X²
i.e. to first order σ_Y ≈ |g′(⟨X⟩)| σ_X, the familiar error-propagation formula.
(Figure: f(X) mapped through Y = g(X) to f(Y).)
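
A similar sketch (again my own illustration) checks the first-order variance formula with Y = X² and X ~ G(μ, σ²), for which g′(μ) = 2μ and hence σ_Y ≈ 2μσ:

    import numpy as np

    rng = np.random.default_rng(2)

    mu, sigma = 5.0, 0.2                    # illustrative mean and (small) spread of X
    x = rng.normal(mu, sigma, size=1_000_000)
    y = x**2                                # Y = g(X) = X^2

    predicted_sd = abs(2 * mu) * sigma      # |g'(<X>)| * sigma_X
    print("sd of Y (Monte Carlo)       =", y.std())
    print("sd of Y (error propagation) =", predicted_sd)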

What is a statistic?
Anything you measure or compute from the data; any function of the data.
Because the data “jiggle”, every statistic also “jiggles”.
Example: the mean value of a sample of N data points is a statistic:
X̄ = (1/N) Σᵢ Xᵢ
It has a definite value for a particular dataset, but it also “jiggles” across the ensemble of datasets to trace out its own PDF.
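
A minimal simulation of this “jiggling” (my addition, assuming a Gaussian parent distribution purely for illustration): generate many independent datasets, compute the sample mean of each, and look at the spread of those means.

    import numpy as np

    rng = np.random.default_rng(3)

    mu, sigma, N = 10.0, 2.0, 25            # parent distribution and sample size (illustrative)
    n_datasets = 10_000

    # Each row is one dataset; each row's sample mean is one realisation of the statistic.
    data = rng.normal(mu, sigma, size=(n_datasets, N))
    sample_means = data.mean(axis=1)

    print("mean of the sample means   =", sample_means.mean())   # close to mu
    print("spread of the sample means =", sample_means.std())    # close to sigma/sqrt(N), derived next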

Sample mean and variance - 1
Sample mean: X̄ = (1/N) Σᵢ Xᵢ
The distribution of sample means has a mean:
⟨X̄⟩ = (1/N) Σᵢ ⟨Xᵢ⟩
...and a variance:
Var(X̄) = (1/N²) Σᵢ Var(Xᵢ), if the Xᵢ are independent.

Sample mean and variance - 2
If the Xᵢ are all drawn from a single parent distribution with mean μ and variance σ², then:
⟨X̄⟩ = μ
And:
Var(X̄) = σ²/N, i.e. the standard error of the mean is σ/√N.
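
A quick check of the 1/√N scaling (illustrative values, my own sketch, not from the slides):

    import numpy as np

    rng = np.random.default_rng(4)
    mu, sigma = 0.0, 1.0

    for N in (4, 16, 64, 256):
        # 20,000 datasets of size N; the std of their means should track sigma/sqrt(N).
        means = rng.normal(mu, sigma, size=(20_000, N)).mean(axis=1)
        print(f"N = {N:3d}   std of sample mean = {means.std():.4f}   sigma/sqrt(N) = {sigma / np.sqrt(N):.4f}")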

Other unbiased statistics
Other statistics are also unbiased estimators of the mean (of a symmetric parent distribution):
– Sample median (half the points above, half below)
– Midrange: (X_max + X_min) / 2
– Any single point Xᵢ chosen at random from the sequence
– Weighted average: X̄_w = Σᵢ wᵢ Xᵢ / Σᵢ wᵢ
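
All of these are unbiased, but they do not all “jiggle” by the same amount. The sketch below (my own illustration, assuming a Gaussian parent) compares the spread of the sample mean, the sample median, the midrange, and a single randomly chosen point over many simulated datasets; the sample mean has the smallest spread, and the next slides show how to do even better when the points have unequal errors.

    import numpy as np

    rng = np.random.default_rng(5)

    mu, sigma, N = 0.0, 1.0, 15
    data = rng.normal(mu, sigma, size=(20_000, N))     # 20,000 datasets of N points each

    estimators = {
        "sample mean":   data.mean(axis=1),
        "sample median": np.median(data, axis=1),
        "midrange":      0.5 * (data.max(axis=1) + data.min(axis=1)),
        "single point":  data[:, 0],
    }

    for name, values in estimators.items():
        # mean ~ 0 for all (unbiased); the std measures how much each statistic "jiggles".
        print(f"{name:13s}  mean = {values.mean():+.4f}   std = {values.std():.4f}")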

Inverse variance weighting is best!
Let’s evaluate the variance of the weighted average for some set of weights wᵢ:
Var(X̄_w) = Σᵢ wᵢ² σᵢ² / (Σᵢ wᵢ)²
The variance of the weighted average is minimised when:
wᵢ ∝ 1/σᵢ²
Let’s verify this -- it’s important!
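
To verify it numerically (a sketch with made-up measurement errors, not taken from the slides), compare the scatter of the weighted average under a few different weighting schemes when the measurements have unequal errors σᵢ. The inverse-variance weights should give the smallest spread, matching 1/√(Σᵢ 1/σᵢ²).

    import numpy as np

    rng = np.random.default_rng(6)

    mu = 10.0
    sigmas = np.array([0.5, 1.0, 2.0, 4.0])            # unequal measurement errors (illustrative)
    n_trials = 50_000

    # Each row is one set of four measurements of the same quantity mu.
    x = rng.normal(mu, sigmas, size=(n_trials, sigmas.size))

    weight_choices = {
        "equal weights    ": np.ones_like(sigmas),
        "w_i = 1/sigma_i  ": 1.0 / sigmas,
        "w_i = 1/sigma_i^2": 1.0 / sigmas**2,          # inverse-variance weighting
    }

    for name, w in weight_choices.items():
        xbar_w = (x * w).sum(axis=1) / w.sum()
        print(f"{name}   std of weighted average = {xbar_w.std():.4f}")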

Choosing the best weighting function
To minimise the variance of the weighted average, set its derivative with respect to each weight to zero:
∂/∂wⱼ [ Σᵢ wᵢ² σᵢ² / (Σᵢ wᵢ)² ] = 0   for every j,
which is solved by wⱼ ∝ 1/σⱼ², as shown in the derivation below.
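
Written out in full (a standard derivation reconstructed here, since the slide’s equations did not survive in the transcript):

    V(w) = \mathrm{Var}(\bar{X}_w) = \frac{\sum_i w_i^2 \sigma_i^2}{\left(\sum_i w_i\right)^2}

    \frac{\partial V}{\partial w_j}
      = \frac{2 w_j \sigma_j^2}{\left(\sum_i w_i\right)^2}
      - \frac{2 \sum_i w_i^2 \sigma_i^2}{\left(\sum_i w_i\right)^3} = 0
    \quad\Longrightarrow\quad
    w_j \sigma_j^2 = \frac{\sum_i w_i^2 \sigma_i^2}{\sum_i w_i}

The right-hand side is the same for every j, so the minimum is at wⱼ ∝ 1/σⱼ².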

Using optimal weights
Good principles for constructing statistics:
– Unbiased -> no systematic error
– Minimum variance -> smallest possible statistical error
The optimally (inverse-variance) weighted average
X̄_w = Σᵢ (Xᵢ/σᵢ²) / Σᵢ (1/σᵢ²)
is unbiased, since:
⟨X̄_w⟩ = Σᵢ (⟨Xᵢ⟩/σᵢ²) / Σᵢ (1/σᵢ²) = μ
and has minimum variance:
Var(X̄_w) = 1 / Σᵢ (1/σᵢ²)
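
As a final sketch (the measurement values below are invented for illustration), combining measurements with unequal errors using these formulas:

    import numpy as np

    # Invented measurements of the same quantity, with different errors.
    x     = np.array([10.3,  9.8, 10.9, 10.1])
    sigma = np.array([ 0.4,  0.3,  1.0,  0.5])

    w = 1.0 / sigma**2                        # optimal (inverse-variance) weights

    xbar_w  = np.sum(w * x) / np.sum(w)       # optimally weighted average
    sigma_w = 1.0 / np.sqrt(np.sum(w))        # its error, from Var = 1 / sum(1/sigma_i^2)

    print(f"weighted average = {xbar_w:.3f} +/- {sigma_w:.3f}")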