Chapter 10 Real Inner Products and Least-Square (cont.)


Chapter 10 Real Inner Products and Least-Square (cont.)

In this handout: Section 10.5: Least-Squares

A common problem in business, science, and engineering is to collect data and analyze them to predict future events. If such data are plotted, they constitute a scatter diagram, which may provide useful insight into the underlying relationship between the system variables. The data below appear to follow a straight-line relationship. The problem is to determine the equation of the straight line that best fits the data.

[Scatter diagram of the data, with x on the horizontal axis and y on the vertical axis.]

Consider an arbitrary straight line, y = mx + c, to be fitted through these data points. For each data point, the error is the difference between the y-value of the point and the y-value obtained from the straight-line approximation.

The least-squares straight line

Definition 1: The least-squares error E is the sum of the squares of the individual errors; that is, E = e1² + e2² + … + eN², where ei is the error at the i-th data point.

Definition 2: The least-squares straight line is the line that minimizes the least-squares error.

We want to find the equation of the least-squares straight line, y = mx + c. We seek the values of m and c that minimize the least-squares error.

The normal equations

For each of the N data points, the error is ei = yi − (m xi + c). We want the values of m and c that minimize

E = Σ [yi − (m xi + c)]²,  the sum running over i = 1, 2, …, N.

This occurs when ∂E/∂m = 0 and ∂E/∂c = 0, or, upon simplifying, when

m Σ xi² + c Σ xi = Σ xi yi
m Σ xi + c N = Σ yi.

These two equations are the normal equations for a least-squares fit in two variables. Examples on the board.
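As a concrete illustration, here is a minimal Python/NumPy sketch that solves these two normal equations directly from the sums. The data points are made up for the example, and NumPy is an assumed dependency rather than part of the handout:

```python
import numpy as np

# Hypothetical sample data (xi, yi); any paired measurements would do.
x = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
y = np.array([1.1, 1.9, 3.2, 3.8, 5.1])
N = len(x)

# Normal equations for the line y = m*x + c:
#   m*sum(xi^2) + c*sum(xi) = sum(xi*yi)
#   m*sum(xi)   + c*N       = sum(yi)
lhs = np.array([[np.sum(x**2), np.sum(x)],
                [np.sum(x),    N]])
rhs = np.array([np.sum(x * y), np.sum(y)])

m, c = np.linalg.solve(lhs, rhs)
print(f"least-squares line: y = {m:.4f} x + {c:.4f}")
```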

Matrix representation of the normal equations

Ideally, we would like to choose m and c so that yi = m xi + c for all data pairs (xi, yi), i = 1, 2, …, N. That is, we want values of m and c that solve the system

m x1 + c = y1
m x2 + c = y2
  ⋮
m xN + c = yN

or, equivalently, the matrix equation

[ x1  1 ]             [ y1 ]
[ x2  1 ]   [ m ]  =  [ y2 ]
[  ⋮  ⋮ ]   [ c ]     [  ⋮ ]
[ xN  1 ]             [ yN ]

Matrix representation of the normal equations

This system has the standard form Ax = b, where x = [m c]^T, b = [y1 y2 … yN]^T, and A is the N×2 matrix whose columns are [x1 x2 … xN]^T and [1 1 … 1]^T. Ax = b has an exact solution if and only if the data fall on a straight line. If not, the system is inconsistent, and we seek the solution that minimizes the least-squares error

E = ||Ax − b||².

The least-squares solution, which is given by the normal equations, has the matrix form

A^T A x = A^T b.
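A sketch of the same straight-line fit in this matrix form, again with made-up data; it should recover the same m and c as the summation form of the normal equations:

```python
import numpy as np

# Same hypothetical data as in the previous sketch.
x = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
y = np.array([1.1, 1.9, 3.2, 3.8, 5.1])

# A has the columns [x1 ... xN] and [1 ... 1]; b is the vector of y-values.
A = np.column_stack([x, np.ones_like(x)])
b = y

# Normal equations in matrix form: (A^T A) [m c]^T = A^T b.
m, c = np.linalg.solve(A.T @ A, A.T @ b)
print(f"m = {m:.4f}, c = {c:.4f}")   # agrees with the summation form above
```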

The least-squares solution for any linear system

The concept of least squares can be generalized to any linear system Ax = b. We are primarily interested in cases where the system is inconsistent, which generally occurs when A has more rows than columns. Measurement errors are inevitable in observational and experimental sciences. Errors can be smoothed out by averaging over many cases, i.e., by taking more measurements than are strictly necessary to determine the parameters of the system. The resulting system is overdetermined (more rows than columns), so usually there is no exact solution. The least-squares solution is an approximate solution to systems of this kind.

The least-squares solution for any linear system

We seek the vector x that minimizes the least-squares error defined by

E = ||Ax − b||².

Theorem 1: If x has the property that Ax − b is orthogonal to the columns of A, then x minimizes the least-squares error.

As a consequence of Theorem 1, x is the least-squares solution to Ax = b if and only if x is a solution to the normal equations

A^T A x = A^T b.

This set of normal equations is guaranteed to have a unique solution whenever the columns of A are linearly independent. The solution can be found using the techniques of previous chapters.
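A minimal sketch of this general recipe, assuming NumPy and linearly independent columns of A; the made-up 4×2 system and the comparison with NumPy's own lstsq routine are only a sanity check, not part of the handout:

```python
import numpy as np

def least_squares(A: np.ndarray, b: np.ndarray) -> np.ndarray:
    """Least-squares solution of Ax = b via the normal equations A^T A x = A^T b.

    Assumes the columns of A are linearly independent, so that A^T A is invertible.
    """
    return np.linalg.solve(A.T @ A, A.T @ b)

# A made-up overdetermined 4x2 system (more equations than unknowns).
A = np.array([[1.0, 1.0],
              [2.0, 1.0],
              [3.0, 1.0],
              [4.0, 1.0]])
b = np.array([2.1, 2.9, 4.2, 4.8])

x_normal = least_squares(A, b)
x_lstsq, *_ = np.linalg.lstsq(A, b, rcond=None)   # library routine, shown for comparison
print(x_normal)   # the two results agree; lstsq is preferred for ill-conditioned A
print(x_lstsq)
```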

The least-squares solution of a linear system (example)

Find the least-squares solution of the linear system Ax = b given by

x1 − x2 = 4
3x1 + 2x2 = 1
−2x1 + 4x2 = 3

Solution:

The least-squares solution of a linear system (example)

Solution (cont.): We have

A = [  1  −1 ]        b = [ 4 ]
    [  3   2 ]            [ 1 ]
    [ −2   4 ]            [ 3 ]

so

A^T A = [ 14  −3 ]    and    A^T b = [  1 ]
        [ −3  21 ]                   [ 10 ]

and the normal system A^T A x = A^T b in this case is

14 x1 − 3 x2 = 1
−3 x1 + 21 x2 = 10.

Solving this system yields the least-squares solution x1 = 17/95, x2 = 143/285.
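As a numerical check on the arithmetic above (not part of the original handout), the same normal system can be solved with NumPy:

```python
import numpy as np

# Coefficient matrix and right-hand side from the example above.
A = np.array([[ 1.0, -1.0],
              [ 3.0,  2.0],
              [-2.0,  4.0]])
b = np.array([4.0, 1.0, 3.0])

# A^T A = [[14, -3], [-3, 21]] and A^T b = [1, 10], matching the normal system above.
x = np.linalg.solve(A.T @ A, A.T @ b)
print(x)                # [0.17894737 0.50175439]
print(17/95, 143/285)   # 0.178947..., 0.501754..., agreeing with the exact answer
```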