Multiple linear regression


Multiple linear regression: dependence on more than one variable, e.g. dependence of runoff volume on soil type and land cover.

With two independent variables, the fitted model y = a0 + a1 x1 + a2 x2 describes a surface (a plane) rather than a line.

The derivation runs much like polynomial regression. The sum of squared residuals is Sr = Σ (y_i − a0 − a1 x1,i − a2 x2,i)².

Take partial derivatives of Sr with respect to a0, a1, and a2, set them to zero, and rearrange to get the normal equations [Z^T Z]{a} = {Z^T y}, very much like the normal equations for polynomial regression.

Once again, solve by any matrix method. Cholesky decomposition is appropriate because the coefficient matrix [Z^T Z] is symmetric and positive definite.
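The slides above can be sketched in a few lines of Python. This is a minimal illustration, not the slides' own code: the data are made up, and the model is the two-variable plane y = a0 + a1 x1 + a2 x2 from the earlier slide.

```python
import numpy as np

# Two-variable multiple linear regression fit via the normal
# equations [Z^T Z]{a} = {Z^T y}. The data are invented for illustration.
x1 = np.array([0.0, 1.0, 2.0, 3.0, 4.0, 5.0])
x2 = np.array([0.0, 1.0, 4.0, 1.0, 3.0, 2.0])
y = 2.0 + 1.5 * x1 - 0.5 * x2            # exact plane, no noise

Z = np.column_stack([np.ones_like(x1), x1, x2])   # design matrix
A = Z.T @ Z                               # symmetric, positive definite
b = Z.T @ y
a = np.linalg.solve(A, b)                 # coefficients a0, a1, a2
print(a)                                  # ≈ [2.0, 1.5, -0.5]
```

Because the synthetic data lie exactly on a plane, the solver recovers the generating coefficients to within floating-point rounding.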

Example: the strength of concrete depends on cure time and the cement/water ratio.

Sample data (table on slide).

Solve by Cholesky decomposition, then forward- and back-substitution.
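The solution path named on this slide can be sketched explicitly: factor A = L L^T, solve L d = b by forward-substitution, then L^T a = d by back-substitution. The helper name `cholesky_solve` and the small test system are mine, standing in for [Z^T Z] and {Z^T y}.

```python
import numpy as np

def cholesky_solve(A, b):
    """Solve A a = b for symmetric positive definite A via Cholesky."""
    L = np.linalg.cholesky(A)        # lower-triangular factor, A = L L^T
    n = len(b)
    d = np.zeros(n)
    for i in range(n):               # forward substitution: L d = b
        d[i] = (b[i] - L[i, :i] @ d[:i]) / L[i, i]
    a = np.zeros(n)
    for i in range(n - 1, -1, -1):   # back substitution: L^T a = d
        a[i] = (d[i] - L[i + 1:, i] @ a[i + 1:]) / L[i, i]
    return a

A = np.array([[4.0, 2.0], [2.0, 3.0]])   # symmetric positive definite
b = np.array([10.0, 9.0])
print(cholesky_solve(A, b))              # same answer as np.linalg.solve(A, b)
```

The advantage the earlier slides allude to holds here too: once L is computed, any number of right-hand sides can be solved by the two cheap triangular sweeps.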

General least squares: the model is a linear combination of basis functions, y = a0 z0(x) + a1 z1(x) + … + am zm(x), where the z_j are known functions of x, e.g. polynomials or sinusoids.

Can express this in matrix form as {y} = [Z]{a} + {e} and define Sr as the sum of squared residuals, Sr = Σ e_i².

As usual, take partial derivatives of Sr to minimize; these lead to the matrix equation [Z^T Z]{a} = {Z^T y}. Solve this for {a} by Cholesky decomposition, LU decomposition, Gauss elimination, or the matrix inverse.
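A short sketch of general least squares: the only change from the polynomial case is that each column of [Z] comes from an arbitrary basis function. The basis {1, sin x, cos x} and the synthetic data here are illustrative choices, not from the slides.

```python
import numpy as np

# General linear least squares: one design-matrix column per basis
# function z_j(x); the coefficients satisfy [Z^T Z]{a} = {Z^T y}.
basis = [lambda t: np.ones_like(t), np.sin, np.cos]

x = np.linspace(0.0, 6.0, 25)
y = 1.0 + 2.0 * np.sin(x) - 0.5 * np.cos(x)   # synthetic, noise-free data

Z = np.column_stack([z(x) for z in basis])    # design matrix
a = np.linalg.solve(Z.T @ Z, Z.T @ y)
print(a)                                      # ≈ [1.0, 2.0, -0.5]
```

Note that the fit is still *linear* least squares: the model is nonlinear in x but linear in the parameters a_j, which is all the normal equations require.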

Confidence intervals: if we denote the diagonal elements of [Z^T Z]^{-1} by z_ii^{-1}, then the standard error of coefficient a_i is s(a_i) = s_{y/x} √(z_ii^{-1}), and the confidence interval is a_i ± t_{α/2, n−2} s(a_i).

Use Excel to get the t-distribution critical value: TINV(α, n−2) returns the two-tailed value.
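Outside Excel, the same two-tailed critical value can be obtained from SciPy. A sketch, assuming SciPy is installed; α = 0.05 and n = 14 are example values, not from the slides.

```python
from scipy.stats import t

# Equivalent of Excel's TINV(alpha, n-2): two-tailed critical value of
# Student's t-distribution. TINV is two-tailed, so the upper-tail
# quantile is taken at 1 - alpha/2.
alpha, n = 0.05, 14
t_crit = t.ppf(1.0 - alpha / 2.0, n - 2)
print(round(t_crit, 4))   # ≈ 2.1788, matching TINV(0.05, 12)
```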

Nonlinear regression: use a Taylor series expansion to linearize the original equation (the Gauss-Newton approach). Let the model be y_i = f(x_i; a0, a1, …) + e_i, where f is a function nonlinear in its parameters and (x_i, y_i) is one of a set of n observations.

Use a Taylor series for f and chop it after the first-order terms: f(x_i)_{j+1} ≈ f(x_i)_j + (∂f(x_i)_j/∂a0) Δa0 + (∂f(x_i)_j/∂a1) Δa1, where j denotes the initial guess and j+1 the improved guess.

Plug the Taylor series into the original equation to get y_i − f(x_i)_j = (∂f(x_i)_j/∂a0) Δa0 + (∂f(x_i)_j/∂a1) Δa1 + e_i.

Given all n such equations, set up the matrix equation {D} = [Z_j]{Δa} + {e}.

Where [Z_j] is the matrix of partial derivatives ∂f/∂a_k evaluated at guess j, {D} is the vector of residuals y_i − f(x_i)_j, and {Δa} holds the corrections to the parameters.

Using the same least squares approach (minimizing the sum of squares of the residuals E), get {Δa} from [Z_j^T Z_j]{Δa} = {Z_j^T D}. Now update the parameters with a_{j+1} = a_j + Δa and do it again until convergence is reached.
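The Gauss-Newton loop described above can be sketched compactly. This is an illustrative implementation, not the slides' code: the saturation model f(x; a0, a1) = a0 (1 − e^{−a1 x}) and the synthetic data are assumptions, and any differentiable model with its partial derivatives would work the same way.

```python
import numpy as np

def f(x, a):
    # hypothetical model: f(x; a0, a1) = a0 * (1 - exp(-a1 * x))
    return a[0] * (1.0 - np.exp(-a[1] * x))

def jacobian(x, a):
    # columns of [Z_j]: df/da0 and df/da1 at the current guess
    return np.column_stack([1.0 - np.exp(-a[1] * x),
                            a[0] * x * np.exp(-a[1] * x)])

def gauss_newton(x, y, a_init, iters=50, tol=1e-10):
    a = np.asarray(a_init, dtype=float)
    for _ in range(iters):
        D = y - f(x, a)                  # residual vector {D}
        Z = jacobian(x, a)               # [Z_j]
        da = np.linalg.solve(Z.T @ Z, Z.T @ D)   # [Z^T Z]{da} = {Z^T D}
        a = a + da                       # a_{j+1} = a_j + {da}
        if np.max(np.abs(da)) < tol:     # convergence test on corrections
            break
    return a

x = np.linspace(0.25, 4.0, 14)
y = f(x, [2.0, 1.5])                     # synthetic data, known parameters
print(gauss_newton(x, y, [1.0, 1.0]))    # ≈ [2.0, 1.5]
```

Because the data are noise-free and the starting guess is reasonable, the corrections shrink quickly; with noisy data or a poor initial guess, Gauss-Newton can stall or diverge, which is why the convergence test matters.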

Example: n=14

Model it with

Choose initial guesses a0 = 1, a1 = −1. Matlab demo.
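The slide's data table and model equation are not reproduced in this transcript, so as a Python stand-in for the Matlab demo, here is the same iteration with a hypothetical model f(x; a0, a1) = a0 e^{a1 x}, synthetic data, and the slide's starting guesses a0 = 1, a1 = −1.

```python
import numpy as np

def f(x, a):
    # hypothetical model standing in for the one on the slide
    return a[0] * np.exp(a[1] * x)

x = np.linspace(0.1, 2.0, 14)            # n = 14 points, as on the slide
y = f(x, [3.0, -0.8])                    # synthetic "observations"

a = np.array([1.0, -1.0])                # initial guesses from the slide
for j in range(20):
    # [Z_j]: columns are df/da0 = exp(a1 x) and df/da1 = a0 x exp(a1 x)
    Z = np.column_stack([np.exp(a[1] * x), a[0] * x * np.exp(a[1] * x)])
    D = y - f(x, a)                      # residuals {D}
    da = np.linalg.solve(Z.T @ Z, Z.T @ D)
    a = a + da
    if np.max(np.abs(da)) < 1e-12:
        break
print(a)                                 # converges to ≈ [3.0, -0.8]
```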