Combining Noisy Measurements


Combining Noisy Measurements (11/4/2019)

Combining Noisy Measurements

suppose that you take a measurement x1 of some real-valued quantity (distance, velocity, etc.), and your friend takes a second measurement x2 of the same quantity. After comparing the measurements you find that x1 ≠ x2. What is the best estimate of the true value μ?

Combining Noisy Measurements

suppose that an appropriate noise model for the measurements is

  x1 = μ + ε1
  x2 = μ + ε2

where ε1 and ε2 are zero-mean Gaussian noise with variance σ². Because two different people are performing the measurements, it might be reasonable to assume that x1 and x2 are independent.

Combining Noisy Measurements

x = 5;
x1 = x + randn(1, 1000);  % noise variance = 1
x2 = x + randn(1, 1000);  % noise variance = 1
mu2 = (x1 + x2) / 2;
bins = 1:0.2:9;
hist(x1, bins);
hist(x2, bins);
hist(mu2, bins);

Combining Noisy Measurements

var(x1) = 0.9979
var(x2) = 0.9972

Combining Noisy Measurements

var(mu2) = 0.4942
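The slides use MATLAB; the same experiment can be cross-checked with a short pure-Python sketch. The names (mu_true, n) and the fixed seed are illustrative assumptions, not part of the slides:

```python
import random
import statistics

random.seed(1)  # fixed seed so the run is reproducible

mu_true = 5.0   # true value being measured (illustrative)
n = 10000       # number of simulated measurement pairs

# two independent measurements of the same quantity, each with unit noise variance
x1 = [mu_true + random.gauss(0, 1) for _ in range(n)]
x2 = [mu_true + random.gauss(0, 1) for _ in range(n)]

# ordinary average of each pair of measurements
avg = [(a + b) / 2 for a, b in zip(x1, x2)]

v1 = statistics.variance(x1)
v_avg = statistics.variance(avg)

# averaging two independent unit-variance measurements roughly halves the variance
print(round(v1, 2), round(v_avg, 2))
```

As in the MATLAB run, the variance of the average comes out near 0.5, about half the variance of either individual measurement.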

Combining Noisy Measurements

suppose the precision of your measurements is much worse than that of your friend's. Consider the measurement noise model

  x1 = μ + ε1
  x2 = μ + ε2

where ε1 is zero-mean Gaussian noise with variance 9 and ε2 is zero-mean Gaussian noise with variance 1.

Combining Noisy Measurements

x = 7;
x1 = x + 3 * randn(1, 1000);  % noise variance = 3*3 = 9
x2 = x + randn(1, 1000);      % noise variance = 1
mu2 = (x1 + x2) / 2;
bins = -2:0.2:18;
hist(x1, bins);
hist(x2, bins);
hist(mu2, bins);

Combining Noisy Measurements

var(x1) = 8.9166
var(x2) = 0.9530

Combining Noisy Measurements

var(mu2) = 2.4317
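A quick check of the theory: for independent measurements, Var((x1 + x2)/2) = (Var(x1) + Var(x2))/4 = (9 + 1)/4 = 2.5, which matches the simulated 2.4317. A pure-Python sketch (seed and sample size are illustrative):

```python
import random
import statistics

random.seed(2)
mu_true = 7.0
n = 10000

x1 = [mu_true + random.gauss(0, 3) for _ in range(n)]  # noise variance 9
x2 = [mu_true + random.gauss(0, 1) for _ in range(n)]  # noise variance 1
avg = [(a + b) / 2 for a, b in zip(x1, x2)]

# theory: Var((x1 + x2)/2) = (9 + 1)/4 = 2.5 for independent x1, x2
v_avg = statistics.variance(avg)
print(round(v_avg, 2))
```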

Combining Noisy Measurements

is the average the optimal estimate of the combined measurements?

Combining Noisy Measurements

instead of ordinary averaging, consider a weighted average

  μ̂ = w1 x1 + w2 x2,  where w1 + w2 = 1

recall that the variance of a random variable X is defined as

  Var(X) = E[(X − E[X])²]

where E[X] is the expected value of X

Expected Value

informally, the expected value of a random variable X is the long-run average observed value of X. Formally, for a continuous random variable with density p(x),

  E[X] = ∫ x p(x) dx

properties:

  E[aX] = a E[X]
  E[X + Y] = E[X] + E[Y]
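These properties can be checked exactly on a small discrete distribution. A sketch using a fair die (the distribution and the constants a, b are illustrative):

```python
from fractions import Fraction

# fair six-sided die: each outcome has probability 1/6
outcomes = [1, 2, 3, 4, 5, 6]
p = Fraction(1, 6)

# expected value: E[X] = sum over outcomes of p(x) * x
E_X = sum(p * x for x in outcomes)          # (1+2+...+6)/6 = 7/2

# linearity: E[aX + b] = a*E[X] + b
a, b = 2, 5
E_aXb = sum(p * (a * x + b) for x in outcomes)
assert E_aXb == a * E_X + b
```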

Variance of Weighted Average

  Var(μ̂) = Var(w1 x1 + w2 x2)

Variance of Weighted Average

because x1 and x2 are independent, w1 x1 and w2 x2 are also independent; thus

  Var(w1 x1 + w2 x2) = Var(w1 x1) + Var(w2 x2)
                     = w1² Var(x1) + w2² Var(x2)

finally, writing Var(x1) = σ1² and Var(x2) = σ2²,

  Var(μ̂) = w1² σ1² + w2² σ2²
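The identity Var(w1 x1 + w2 x2) = w1² Var(x1) + w2² Var(x2) for independent x1 and x2 can be checked empirically. A pure-Python sketch with arbitrary illustrative weights:

```python
import random
import statistics

random.seed(3)
n = 10000

s1, s2 = 3.0, 1.0      # standard deviations, so variances are 9 and 1
w1, w2 = 0.3, 0.7      # arbitrary weights with w1 + w2 = 1

x1 = [random.gauss(0, s1) for _ in range(n)]
x2 = [random.gauss(0, s2) for _ in range(n)]
z = [w1 * a + w2 * b for a, b in zip(x1, x2)]

# independence means the cross term Cov(x1, x2) drops out
predicted = w1**2 * s1**2 + w2**2 * s2**2   # 0.09*9 + 0.49*1 = 1.30
v_emp = statistics.variance(z)
print(round(predicted, 2), round(v_emp, 2))
```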


Variance of Weighted Average

one way to choose the weighting values is to choose the weights such that the variance is minimized; setting w2 = 1 − w1 and solving

  d/dw1 [ w1² σ1² + (1 − w1)² σ2² ] = 0

gives

  w1 = σ2² / (σ1² + σ2²),  w2 = σ1² / (σ1² + σ2²)

Minimum Variance Estimate

thus, the minimum variance estimate is

  μ̂ = (σ2² x1 + σ1² x2) / (σ1² + σ2²)

with variance

  Var(μ̂) = σ1² σ2² / (σ1² + σ2²)
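A numerical sanity check of the closed form: with w the weight on x2 and σ1² = 9, σ2² = 1 as in the slides, minimizing (1 − w)² σ1² + w² σ2² over a fine grid of w values should land on w = σ1²/(σ1² + σ2²) = 0.9:

```python
# grid search for the weight w on x2 that minimizes
# Var = (1 - w)^2 * s1sq + w^2 * s2sq, versus the closed form
s1sq, s2sq = 9.0, 1.0

closed_form = s1sq / (s1sq + s2sq)           # 0.9
grid = [i / 10000 for i in range(10001)]
best = min(grid, key=lambda w: (1 - w)**2 * s1sq + w**2 * s2sq)

# minimum achievable variance: s1sq*s2sq/(s1sq + s2sq) = 0.9
min_var = s1sq * s2sq / (s1sq + s2sq)
print(closed_form, best, min_var)
```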

Combining Noisy Measurements

x = 7;
x1 = x + 3 * randn(1, 1000);  % noise variance = 3*3 = 9
x2 = x + randn(1, 1000);      % noise variance = 1
w = 9 / (9 + 1);
mu2 = (1 - w) * x1 + w * x2;
bins = -2:0.2:18;
hist(x1, bins);
hist(x2, bins);
hist(mu2, bins);

Minimum Variance Estimate

ordinary average:           mu2 = 0.5*x1 + 0.5*x2   var(mu2) = 2.4317
minimum variance estimate:  mu2 = 0.1*x1 + 0.9*x2   var(mu2) = 0.8925
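The comparison can be replicated in pure Python (seed and sample size are illustrative): simple averaging versus the inverse-variance weighting from the slide.

```python
import random
import statistics

random.seed(4)
mu_true = 7.0
n = 10000

x1 = [mu_true + random.gauss(0, 3) for _ in range(n)]  # noise variance 9
x2 = [mu_true + random.gauss(0, 1) for _ in range(n)]  # noise variance 1

w = 9.0 / (9.0 + 1.0)  # minimum-variance weight on the more precise x2

simple = [(a + b) / 2 for a, b in zip(x1, x2)]
weighted = [(1 - w) * a + w * b for a, b in zip(x1, x2)]

# theory: Var(simple) = (9 + 1)/4 = 2.5, Var(weighted) = 9*1/(9 + 1) = 0.9
v_simple = statistics.variance(simple)
v_weighted = statistics.variance(weighted)
print(round(v_simple, 2), round(v_weighted, 2))
```

The weighted estimate stays close to the better measurement's variance rather than being dragged up by the noisier one.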