Slide 1: Linear Regression

Use a linear function to interpolate the training set. The most popular criterion is the least squares approach.

Given the training set $S = \{(x_1, y_1), \dots, (x_m, y_m)\}$ with $x_i \in \mathbb{R}^n$ and $y_i \in \mathbb{R}$, find a linear function

$$f(x) = w^\top x + b,$$

where $(w, b)$ is determined by solving the minimization problem:

$$\min_{w,\, b} \sum_{i=1}^{m} \bigl(y_i - w^\top x_i - b\bigr)^2.$$

The function $\ell(y, f(x)) = (y - f(x))^2$ is called the square loss function.
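The square loss objective above can be made concrete with a minimal NumPy sketch; the toy data and the candidate $(w, b)$ are illustrative assumptions, not from the slides.

```python
import numpy as np

def square_loss(y_true, y_pred):
    """Square loss summed over the training set: sum_i (y_i - f(x_i))^2."""
    return np.sum((y_true - y_pred) ** 2)

# Toy training set (illustrative): targets generated by y = 2*x + 1
X = np.array([[0.0], [1.0], [2.0], [3.0]])
y = np.array([1.0, 3.0, 5.0, 7.0])

# Candidate linear function f(x) = w^T x + b
w, b = np.array([2.0]), 1.0
y_pred = X @ w + b
print(square_loss(y, y_pred))  # exact fit, so the loss is 0.0
```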
Slide 2: Linear Regression (Cont.)

Different measures of loss are possible:
- 1-norm loss function: $\ell(y, f(x)) = |y - f(x)|$
- $\varepsilon$-insensitive loss function: $\ell(y, f(x)) = \max\bigl(|y - f(x)| - \varepsilon,\, 0\bigr)$
- Huber's regression: the loss is quadratic for small residuals and linear for large ones, $\ell(r) = \tfrac{1}{2} r^2$ if $|r| \le \delta$, and $\delta\bigl(|r| - \tfrac{\delta}{2}\bigr)$ otherwise
- Ridge regression: $\min_{w,\, b}\ \lambda \|w\|_2^2 + \sum_{i=1}^{m} \bigl(y_i - w^\top x_i - b\bigr)^2$, where $\lambda > 0$ is the regularization parameter
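The alternative losses can be sketched as vectorized NumPy functions; the $\varepsilon$ and $\delta$ values below are illustrative defaults, not taken from the slides.

```python
import numpy as np

def one_norm_loss(r):
    """1-norm loss: |y - f(x)|, applied to the residual r = y - f(x)."""
    return np.abs(r)

def eps_insensitive_loss(r, eps=0.1):
    """Epsilon-insensitive loss: residuals within eps cost nothing."""
    return np.maximum(np.abs(r) - eps, 0.0)

def huber_loss(r, delta=1.0):
    """Huber loss: quadratic for |r| <= delta, linear beyond."""
    a = np.abs(r)
    return np.where(a <= delta, 0.5 * r ** 2, delta * (a - 0.5 * delta))

r = np.array([-2.0, -0.05, 0.0, 0.5, 3.0])
print(one_norm_loss(r))
print(eps_insensitive_loss(r))
print(huber_loss(r))
```

Note how the Huber loss matches the square loss near zero but grows only linearly for large residuals, which is what gives it robustness to outliers.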
Slide 3: Solution of the Least Squares Problem

Some notation: let $A \in \mathbb{R}^{m \times (n+1)}$ be the matrix whose $i$-th row is $[x_i^\top,\, 1]$, let $y = [y_1, \dots, y_m]^\top$, and absorb the bias by writing $\bar{w} = [w;\, b]$. We are going to find the $\bar{w}$ with the smallest square loss, i.e.,

$$\min_{\bar{w}} \|A\bar{w} - y\|_2^2.$$
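The matrix notation can be checked numerically; the convention of appending a column of ones to absorb the bias is assumed here to match the slides, and the data is the same illustrative toy set.

```python
import numpy as np

# Toy inputs (illustrative)
X = np.array([[0.0], [1.0], [2.0], [3.0]])
y = np.array([1.0, 3.0, 5.0, 7.0])

# Each row of A is [x_i^T, 1]; the bias b becomes the last entry of w_bar
A = np.hstack([X, np.ones((X.shape[0], 1))])   # shape (m, n+1)

def sq_loss(w_bar):
    """The objective ||A w_bar - y||_2^2."""
    return np.sum((A @ w_bar - y) ** 2)

print(sq_loss(np.array([2.0, 1.0])))  # w=2, b=1 fits the toy data exactly: 0.0
```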
Slide 4: The Widrow-Hoff Algorithm (Primal Form)

Minimize the square loss function using (stochastic) gradient descent.

Given a training set $S$ and learning rate $\eta > 0$.
Initialize: $w = 0$.
Repeat: for each $(x_i, y_i) \in S$, update $w \leftarrow w + \eta\,(y_i - w^\top x_i)\, x_i$
Until the convergence criterion is satisfied.
Return: $w$.

A dual form exists (i.e., $w = \sum_{i=1}^{m} \alpha_i x_i$, so the updates can be carried out on the coefficients $\alpha_i$).
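The primal Widrow-Hoff (LMS) loop can be sketched as follows; the learning rate, epoch count, and fixed-epoch stopping rule stand in for the slide's unspecified convergence criterion, and the bias is absorbed as on the previous slide.

```python
import numpy as np

def widrow_hoff(X, y, eta=0.05, epochs=2000):
    """Primal Widrow-Hoff (LMS): stochastic gradient descent on the square loss.
    eta and epochs are illustrative choices; a real convergence test could
    monitor the change in w between passes instead of a fixed epoch count."""
    m, n = X.shape
    A = np.hstack([X, np.ones((m, 1))])  # absorb the bias into w_bar
    w = np.zeros(n + 1)                  # initialize w = 0
    for _ in range(epochs):
        for i in range(m):
            # one stochastic step: w <- w + eta * (y_i - w^T a_i) * a_i
            w += eta * (y[i] - A[i] @ w) * A[i]
    return w

X = np.array([[0.0], [1.0], [2.0], [3.0]])
y = 2.0 * X[:, 0] + 1.0
w = widrow_hoff(X, y)
print(w)  # converges toward [2, 1] on this consistent toy data
```

For stability the learning rate should satisfy $\eta < 2 / \|a_i\|^2$ for every augmented sample $a_i$; the value 0.05 respects that bound on this toy data.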
Slide 5: The Normal Equations of LSQ

Setting the gradient of $\|A\bar{w} - y\|_2^2$ to zero, we obtain the normal equations of LSQ:

$$A^\top A\, \bar{w} = A^\top y.$$

If $A^\top A$ is invertible, then $\bar{w} = (A^\top A)^{-1} A^\top y$.

Note: the above result is based on the first-order optimality conditions (necessary and sufficient for differentiable convex minimization problems).

What if $A^\top A$ is singular?
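The normal equations can be solved directly when $A^\top A$ is invertible; for the singular case, one standard remedy (an assumption here, since the slide leaves the question open) is the minimum-norm least-squares solution via the pseudoinverse, which `numpy.linalg.lstsq` computes. The toy data is the same illustrative set as above.

```python
import numpy as np

X = np.array([[0.0], [1.0], [2.0], [3.0]])
y = np.array([1.0, 3.0, 5.0, 7.0])
A = np.hstack([X, np.ones((X.shape[0], 1))])

# Normal equations: A^T A w_bar = A^T y (A^T A is invertible here)
w = np.linalg.solve(A.T @ A, A.T @ y)
print(w)  # [2. 1.]

# When A^T A is singular, lstsq still returns the minimum-norm
# least-squares solution (via the SVD / pseudoinverse)
w_ls, *_ = np.linalg.lstsq(A, y, rcond=None)
print(w_ls)
```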