
Least Squares Estimate Additional Notes

Introduction
The quality of an estimate can be judged using the expected value and the covariance matrix of the estimated parameters. The following theorem is an important tool to determine the quality of the least squares estimate.

Assumptions
Assume that there exists a true parameter vector θ₀ such that the data satisfy the model below. The stochastic process e(k) is zero-mean white noise with variance σ² at every step.
Intuition: the assumption says that the true system can be represented by the chosen model, up to errors that are well behaved in a statistical sense.
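A common way to write this assumption (the regressor notation φ(k) used here is a generic choice and may differ from the original slide) is:

$$y(k) = \varphi(k)^{\top}\theta_{0} + e(k), \qquad E[e(k)] = 0, \quad E[e(k)^{2}] = \sigma^{2}, \quad E[e(k)\,e(j)] = 0 \ \text{for } k \ne j.$$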

Theorem
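In standard treatments of least squares, the result used in the remaining slides can be stated as follows; the regression-matrix notation Φ and the exact wording are assumptions of this restatement. If the data satisfy $y = \Phi\,\theta_{0} + e$ with $E[e] = 0$ and $\mathrm{cov}(e) = \sigma^{2} I$, then the least squares estimate

$$\hat{\theta} = (\Phi^{\top}\Phi)^{-1}\Phi^{\top} y$$

satisfies

$$\text{1.}\;\; E[\hat{\theta}] = \theta_{0} \qquad \text{and} \qquad \text{2.}\;\; \mathrm{cov}(\hat{\theta}) = P = \sigma^{2}\,(\Phi^{\top}\Phi)^{-1}.$$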

Remark on Part 1
Part 1 of the theorem says that the solution makes (statistical) sense on average: the expected value of the least squares estimate equals the true parameter vector, i.e., the estimate is unbiased.

Remarks on Part 2

Remarks on Part 2, continued

Confidence intervals for the parameters
If the residuals e are normally distributed, confidence intervals for the parameters can be computed. The confidence limits give a rough idea of the significance of the estimates. For example, when a parameter that is expected to lie between 0 and 1 is estimated as 0.4 with a 95% confidence region of ±0.005, it is safe to say that this is a good estimate of that parameter. However, when the 95% confidence region around the parameter is ±2, the estimate is meaningless.

Confidence intervals for the parameters
For any Gaussian distribution N(µ, σ²), the probability of drawing a sample in the interval [µ−1.96σ, µ+1.96σ] is found (from the Gaussian density) to be approximately 0.95.
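Written out as the standard two-sided Gaussian integral:

$$\Pr\!\left(\mu - 1.96\,\sigma \le x \le \mu + 1.96\,\sigma\right) \;=\; \int_{\mu - 1.96\sigma}^{\mu + 1.96\sigma} \frac{1}{\sqrt{2\pi}\,\sigma}\, e^{-\frac{(x-\mu)^{2}}{2\sigma^{2}}}\, dx \;\approx\; 0.95.$$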

Confidence intervals for the parameters
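A standard form of the 95% confidence interval for the i-th estimated parameter, expressed with the parameter covariance matrix P from the theorem (this particular notation is an assumption of these notes), is:

$$\hat{\theta}_{i} \pm 1.96\,\sqrt{P_{ii}}, \qquad P = \hat{\sigma}^{2}\,(\Phi^{\top}\Phi)^{-1}.$$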

Example
Consider the following model and the input-output data, given as a table with columns k, u(k), and y(k) (eight data points). Use the least squares method to find the unknown parameters a and b along with their confidence intervals.

Solution

We calculate the least squares estimates of a and b, and hence obtain the model of the least squares fit.

Solution, continued
The residual errors e(k) are calculated as the differences between the measured outputs and the outputs predicted by the fitted model. The number of data points is N = 8, and the number of unknown parameters is n = 2. Hence, an estimate of the noise variance is obtained as shown below.
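A standard estimator of the noise variance from the residuals (written here in generic notation) is:

$$\hat{\sigma}^{2} \;=\; \frac{1}{N-n}\sum_{k=1}^{N} e(k)^{2} \;=\; \frac{e^{\top}e}{N-n}.$$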

Solution, continued
The covariance matrix P of the estimated parameters is then computed from the noise variance estimate and the regression data. The relatively large magnitudes of the off-diagonal terms in P indicate that there is a strong interaction between the estimates. This is typical in dynamic systems.

Solution, continued
The 95% confidence limits are roughly given by the estimates plus or minus 1.96 times the corresponding standard deviations (the square roots of the diagonal entries of P). The confidence regions are so small that the estimates can be accepted without much reservation.
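To make the procedure concrete, here is a minimal Python sketch of the same workflow. The model structure and all numerical values are hypothetical placeholders, not the data from the slides; only the sequence of computations (fit, residuals, noise variance, covariance, 95% limits) mirrors the example.

```python
import numpy as np

# Hypothetical first-order model y(k) = a*y(k-1) + b*u(k-1) + e(k); the slide's
# actual model and data table are not reproduced, so these numbers are made up.
u = np.array([1.0, 0.0, 1.0, 1.0, 0.0, 1.0, 0.0, 1.0, 0.0])
y = np.array([0.0, 0.5, 0.3, 0.7, 0.8, 0.5, 0.9, 0.6, 0.9])

# Build the regression matrix: row k is [y(k-1), u(k-1)], target is y(k).
Phi = np.column_stack([y[:-1], u[:-1]])   # shape (8, 2): N = 8 equations, n = 2 parameters
Y = y[1:]
N, n = Phi.shape

# Least squares estimate: theta_hat = (Phi^T Phi)^{-1} Phi^T Y
theta_hat, *_ = np.linalg.lstsq(Phi, Y, rcond=None)

# Residuals and noise variance estimate sigma_hat^2 = e^T e / (N - n)
e = Y - Phi @ theta_hat
sigma2_hat = (e @ e) / (N - n)

# Parameter covariance P = sigma_hat^2 * (Phi^T Phi)^{-1} and 95% half-widths
P = sigma2_hat * np.linalg.inv(Phi.T @ Phi)
ci_halfwidth = 1.96 * np.sqrt(np.diag(P))

print("estimates (a, b):    ", theta_hat)
print("95% half-widths (±): ", ci_halfwidth)
```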

Visualization of the confidence interval
To visualize the confidence intervals, imagine that we repeat the previous experiment 1000 times. In each experiment, we use 8 input-output data points and the least squares method to find an estimate of the parameters. The empirical distribution (histogram) of the estimated parameters is shown in the next slide.

Visualization of the confidence interval
The confidence interval roughly describes the width of the histogram.
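A small Python sketch of this visualization, under the same hypothetical model as before (the "true" parameter values, noise level, and input sequence are all illustrative assumptions):

```python
import numpy as np
import matplotlib.pyplot as plt

rng = np.random.default_rng(0)
a_true, b_true, sigma = 0.6, 0.4, 0.05                         # assumed true system and noise level
u = np.array([1.0, 0.0, 1.0, 1.0, 0.0, 1.0, 0.0, 1.0, 0.0])    # fixed hypothetical input

estimates = []
for _ in range(1000):                                # repeat the experiment 1000 times
    y = np.zeros_like(u)
    for k in range(1, len(u)):                       # simulate y(k) = a*y(k-1) + b*u(k-1) + e(k)
        y[k] = a_true * y[k - 1] + b_true * u[k - 1] + sigma * rng.standard_normal()
    Phi = np.column_stack([y[:-1], u[:-1]])          # 8 equations, 2 unknowns, as in the example
    theta_hat, *_ = np.linalg.lstsq(Phi, y[1:], rcond=None)
    estimates.append(theta_hat)
estimates = np.array(estimates)

# The widths of the histograms correspond (roughly) to the confidence intervals.
fig, axes = plt.subplots(1, 2, figsize=(8, 3))
for i, name in enumerate(["a", "b"]):
    axes[i].hist(estimates[:, i], bins=40)
    axes[i].set_title(f"estimated {name}")
plt.tight_layout()
plt.show()
```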

Acknowledgement
Most of the material in this course is due to:
(1) Prof. Abdel-Latif El-Shafei, Department of Electrical Power Engineering, Cairo University.
(2) Dr. Lucian Busoniu, Technical University of Cluj-Napoca, Romania.
(3) Prof. Xia Hong, University of Reading, England.