NOTES ON MULTIPLE REGRESSION USING MATRICES
Tony E. Smith, ESE 502: Spatial Data Analysis
Topics: Multiple Regression; Matrix Formulation of Regression; Applications to Regression Analysis

SIMPLE LINEAR MODEL
- Data: observations (x_i, y_i), i = 1, ..., n
- Parameters: β_0 (intercept), β_1 (slope), σ² (error variance)
- Model: y_i = β_0 + β_1 x_i + ε_i, with the ε_i independent, E(ε_i) = 0, var(ε_i) = σ²

SIMPLE REGRESSION ESTIMATION
- Data points: (x_i, y_i), i = 1, ..., n
- Predicted value: ŷ_i = β̂_0 + β̂_1 x_i
- Estimated conditional mean: E(y | x) = β_0 + β_1 x
- Line of best fit: ŷ = β̂_0 + β̂_1 x, where
    β̂_1 = Σ_i (x_i − x̄)(y_i − ȳ) / Σ_i (x_i − x̄)²   and   β̂_0 = ȳ − β̂_1 x̄
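The slope and intercept formulas above can be sketched numerically. This is an illustrative example, not from the original slides; the data are simulated with a known slope of 3 and intercept of 2 so the estimates can be checked.

```python
# Sketch of the simple-regression estimates: slope = sample covariance over
# sample variance of x, intercept recovered from the sample means.
import numpy as np

rng = np.random.default_rng(0)
x = rng.normal(size=50)
y = 2.0 + 3.0 * x + rng.normal(scale=0.1, size=50)   # true b0=2, b1=3

xbar, ybar = x.mean(), y.mean()
b1 = np.sum((x - xbar) * (y - ybar)) / np.sum((x - xbar) ** 2)  # slope
b0 = ybar - b1 * xbar                                           # intercept
yhat = b0 + b1 * x                                              # fitted line
```

With the small noise level used here, b0 and b1 land close to the true values 2 and 3.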

STANDARD LINEAR MODEL
- Data: observations (y_i, x_i1, ..., x_ik), i = 1, ..., n
- Parameters: β_0, β_1, ..., β_k and σ²
- Model: y_i = β_0 + β_1 x_i1 + ... + β_k x_ik + ε_i, with the ε_i independent, E(ε_i) = 0, var(ε_i) = σ²

STANDARD LINEAR MODEL (k = 2)
- Data: (y_i, x_i1, x_i2), i = 1, ..., n
- Parameters: β_0, β_1, β_2, σ²
- Model: y_i = β_0 + β_1 x_i1 + β_2 x_i2 + ε_i

REGRESSION ESTIMATION (for k = 2)
- Data points: (y_i, x_i1, x_i2), i = 1, ..., n
- Predicted value: ŷ_i = β̂_0 + β̂_1 x_i1 + β̂_2 x_i2
- Plane of best fit: ŷ = β̂_0 + β̂_1 x_1 + β̂_2 x_2, where the β̂_j minimize the sum of squared residuals

MATRIX REPRESENTATION OF THE STANDARD LINEAR MODEL
- Vectors and matrices: y = (y_1, ..., y_n)' is n × 1; X is the n × (k+1) matrix with rows (1, x_i1, ..., x_ik); β = (β_0, β_1, ..., β_k)'; ε = (ε_1, ..., ε_n)'
- Matrix reformulation of the model: y = Xβ + ε
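As a sketch of this matrix form, the design matrix X can be assembled by prepending a column of ones (the intercept) to the regressor columns. The data values below are made up purely for illustration.

```python
# Build the design matrix X for the model y = X beta + eps with k = 2 regressors.
import numpy as np

x1 = np.array([1.0, 2.0, 3.0, 4.0])
x2 = np.array([0.5, 1.5, 2.5, 3.5])
y  = np.array([2.1, 3.9, 6.2, 7.8])

n = len(y)
X = np.column_stack([np.ones(n), x1, x2])  # first column of ones = intercept
# X has shape n x (k+1); here n = 4 and k = 2, so X is 4 x 3.
```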

LINEAR TRANSFORMATIONS IN ONE DIMENSION
- Linear function: f(x) = ax for a scalar a
- [Figure: graph of the line y = ax through the origin, with slope a]

LINEAR TRANSFORMATIONS IN TWO DIMENSIONS
- Linear transformation: T(x_1, x_2) = (a_11 x_1 + a_12 x_2, a_21 x_1 + a_22 x_2)

- [Figure: graphical depiction of a linear transformation acting on the plane]

SOME MATRIX CONVENTIONS
- Transposes of vectors and matrices: the transpose A' of an m × n matrix A is the n × m matrix with (A')_ij = A_ji
- Symmetric (square) matrices: A is symmetric if A' = A
- Important example: for any matrix X, the product X'X is symmetric

- Column representation of matrices: A = (a_1, ..., a_n), where a_j denotes the j-th column of A
- Row representation of matrices: A may equally be written by stacking its rows

- Matrix multiplication: if A is m × n and B is n × p, then AB is the m × p matrix with (AB)_ij = Σ_k a_ik b_kj
- Inner product of vectors: x'y = Σ_i x_i y_i
- Transposes: (AB)' = B'A'
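These conventions can be checked numerically. The matrices below are arbitrary illustrations; the point is that the transpose of a product reverses the order of the factors, and the inner product is a sum of elementwise products.

```python
# Numerical check of (AB)' = B'A' and of the inner product x'y.
import numpy as np

A = np.array([[1.0, 2.0], [3.0, 4.0]])
B = np.array([[0.0, 1.0], [1.0, 0.0]])
x = np.array([1.0, 2.0])
yv = np.array([3.0, 4.0])

lhs = (A @ B).T
rhs = B.T @ A.T          # transpose reverses the order of multiplication
inner = x @ yv           # 1*3 + 2*4 = 11
```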

MATRIX REPRESENTATIONS OF LINEAR TRANSFORMATIONS
- Any two-dimensional linear transformation T can be written as T(x) = Ax, with A the 2 × 2 matrix of coefficients (a_ij)

- [Figure: graphical depiction of the matrix representation T(x) = Ax acting on the plane]

- Inversion of square matrices (as linear transformations): A⁻¹ is the transformation that undoes A, so that A⁻¹A = AA⁻¹ = I

DETERMINANTS OF SQUARE MATRICES
- For a 2 × 2 matrix A, det(A) = a_11 a_22 − a_12 a_21
- det(A) is the signed factor by which the transformation x ↦ Ax scales areas

NONSINGULAR SQUARE MATRICES
- A square matrix A is nonsingular when det(A) ≠ 0, equivalently when its columns are linearly independent
- In this case the inverse A⁻¹ exists, and Ax = b has the unique solution x = A⁻¹b
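A small numerical sketch of this equivalence: the first matrix below has nonzero determinant and hence an inverse, while the second has linearly dependent columns and determinant zero. Both matrices are arbitrary examples chosen for illustration.

```python
# A nonsingular matrix (det != 0) has an inverse; a singular one does not.
import numpy as np

A = np.array([[2.0, 1.0], [1.0, 1.0]])   # det = 2*1 - 1*1 = 1, nonsingular
B = np.array([[1.0, 2.0], [2.0, 4.0]])   # second column = 2 * first, det = 0

detA = np.linalg.det(A)
A_inv = np.linalg.inv(A)
check = A @ A_inv                        # should be the 2x2 identity
detB = np.linalg.det(B)
```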

LEAST-SQUARES ESTIMATION
- General sum of squares: S(β) = Σ_i (y_i − x_i'β)² = (y − Xβ)'(y − Xβ)
- General regression matrices: the minimization involves the (k+1) × (k+1) matrix X'X and the (k+1) × 1 vector X'y

DIFFERENTIATION OF FUNCTIONS
- General derivative: f'(x) = lim_{h→0} [f(x + h) − f(x)] / h
- Example: for f(x) = x², f'(x) = 2x

PARTIAL DERIVATIVES
- For a function f(x_1, ..., x_n), the partial derivative ∂f/∂x_i is the derivative with respect to x_i holding the remaining arguments fixed

VECTOR DERIVATIVES
- Derivative notation for f: ℝⁿ → ℝ: write ∂f/∂x for the vector of partial derivatives of f
- Gradient vector: ∇f(x) = (∂f/∂x_1, ..., ∂f/∂x_n)'

TWO IMPORTANT EXAMPLES
- Linear functions: for f(x) = a'x, ∇f(x) = a
- Quadratic functions: f(x) = x'Ax

- Quadratic derivatives: ∇(x'Ax) = (A + A')x
- Symmetric case: when A' = A this reduces to ∇(x'Ax) = 2Ax
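The quadratic-derivative rule can be verified by finite differences. This is an illustrative check with an arbitrary (deliberately non-symmetric) matrix, confirming that the gradient of x'Ax is (A + A')x rather than 2Ax in the general case.

```python
# Finite-difference check that the gradient of f(x) = x'Ax equals (A + A')x.
import numpy as np

A = np.array([[2.0, 1.0], [0.0, 3.0]])   # non-symmetric on purpose
x0 = np.array([1.0, -2.0])
f = lambda x: x @ A @ x

h = 1e-6
grad_fd = np.array([(f(x0 + h * e) - f(x0 - h * e)) / (2 * h)
                    for e in np.eye(2)])  # central differences
grad_formula = (A + A.T) @ x0
```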

MINIMIZATION OF FUNCTIONS
- First-order condition: at an interior minimum x*, ∇f(x*) = 0
- Example: f(x) = x² has f'(x) = 2x = 0 at the minimizer x* = 0

TWO-DIMENSIONAL MINIMIZATION
- [Figure: surface plot of a convex function of (x_1, x_2), with the minimum where both partial derivatives vanish]

LEAST SQUARES ESTIMATION
- Solution for β̂: setting ∇S(β) = −2X'y + 2X'Xβ = 0 gives the normal equations X'Xβ = X'y, and hence
    β̂ = (X'X)⁻¹X'y
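The normal-equations solution can be sketched directly in NumPy and compared against the library's least-squares solver. The data below are simulated with known coefficients so the recovered estimates can be checked; everything here is illustrative rather than part of the original slides.

```python
# Solve beta_hat = (X'X)^{-1} X'y via the normal equations and cross-check
# against numpy's least-squares routine.
import numpy as np

rng = np.random.default_rng(1)
n = 100
X = np.column_stack([np.ones(n), rng.normal(size=(n, 2))])
beta_true = np.array([1.0, 2.0, -0.5])
y = X @ beta_true + rng.normal(scale=0.3, size=n)

beta_hat = np.linalg.solve(X.T @ X, X.T @ y)      # normal equations
beta_lstsq = np.linalg.lstsq(X, y, rcond=None)[0]  # library solver
```

Using `solve` on X'X rather than forming the explicit inverse is the standard numerically preferable route.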

NON-MATRIX VERSION (k = 2)
- Data: (y_i, x_i1, x_i2), i = 1, ..., n
- Beta estimates: writing S_jl = Σ_i (x_ij − x̄_j)(x_il − x̄_l) and S_jy = Σ_i (x_ij − x̄_j)(y_i − ȳ),
    β̂_1 = (S_22 S_1y − S_12 S_2y) / (S_11 S_22 − S_12²)
    β̂_2 = (S_11 S_2y − S_12 S_1y) / (S_11 S_22 − S_12²)
    β̂_0 = ȳ − β̂_1 x̄_1 − β̂_2 x̄_2

EXPECTED VALUES OF RANDOM MATRICES
- Random vectors and matrices: a random matrix X = (x_ij) has random variables as its elements
- Expected values: the expectation is taken elementwise, E(X) = (E(x_ij))

EXPECTATIONS OF LINEAR FUNCTIONS OF RANDOM VECTORS
- Linear combinations: E(a'x) = a'E(x)
- Linear transformations: E(Ax) = A E(x)

EXPECTATIONS OF LINEAR FUNCTIONS OF RANDOM MATRICES
- Left multiplication: E(AX) = A E(X)
- Right multiplication (by symmetry of inner products): E(XB) = E(X) B

COVARIANCE OF RANDOM VECTORS
- Random variables: cov(x, y) = E[(x − μ_x)(y − μ_y)]
- Random vectors: cov(x) = E[(x − μ)(x − μ)'], the n × n matrix of variances and covariances

COVARIANCE OF LINEAR FUNCTIONS OF RANDOM VECTORS
- Linear combinations: var(a'x) = a' cov(x) a
- Linear transformations: cov(Ax) = A cov(x) A'  (left multiplication by A, right multiplication by A')
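The transformation rule cov(Ax) = A cov(x) A' can be illustrated by Monte Carlo simulation. The covariance matrix and transformation below are arbitrary choices for the sketch.

```python
# Monte Carlo check that the sample covariance of Ax matches A Sigma A'.
import numpy as np

rng = np.random.default_rng(2)
Sigma = np.array([[1.0, 0.5], [0.5, 2.0]])   # cov(x)
A = np.array([[1.0, 1.0], [0.0, 2.0]])

x = rng.multivariate_normal(mean=[0, 0], cov=Sigma, size=200_000)
cov_Ax_mc = np.cov((x @ A.T).T)              # sample covariance of Ax
cov_Ax_theory = A @ Sigma @ A.T              # [[4, 5], [5, 8]]
```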

TRANSLATIONS OF RANDOM VECTORS
- Translation: y = x + b for a constant vector b
- Means: E(y) = E(x) + b
- Covariances: cov(y) = cov(x), since a constant shift leaves deviations from the mean unchanged

RESIDUAL VECTOR IN THE STANDARD LINEAR MODEL
- Linear model assumption: y = Xβ + ε
- Residual means: E(ε) = 0
- Residual covariances: cov(ε) = σ²I_n

MOMENTS OF BETA ESTIMATES
- Linear model: y = Xβ + ε, with β̂ = (X'X)⁻¹X'y
- Mean of beta estimates: E(β̂) = β (unbiased estimator)
- Covariance of beta estimates: cov(β̂) = σ²(X'X)⁻¹

ESTIMATION OF RESIDUAL VARIANCE
- Residual variance: σ² = var(ε_i)
- Residual estimates: ε̂ = y − Xβ̂
- Natural estimate of variance: σ̂² = (1/n) Σ_i ε̂_i²
- Bias-corrected estimate of variance: s² = ε̂'ε̂ / (n − k − 1), which compensates for the k + 1 coefficients fitted by least squares

ESTIMATION OF BETA COVARIANCE
- Beta covariance matrix: cov(β̂) = σ²(X'X)⁻¹
- Beta covariance estimates: ĉov(β̂) = s²(X'X)⁻¹, whose diagonal square roots give the standard errors of the β̂_j
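The last few slides fit together into one short pipeline: estimate β̂, form the residuals, compute the bias-corrected variance s², and plug it into s²(X'X)⁻¹. The sketch below uses simulated data with error standard deviation 0.5 (so σ² = 0.25) purely for illustration.

```python
# End-to-end sketch: residuals, bias-corrected variance s^2 = e'e / (n-k-1),
# and the estimated covariance s^2 (X'X)^{-1} of the beta estimates.
import numpy as np

rng = np.random.default_rng(3)
n, k = 200, 2
X = np.column_stack([np.ones(n), rng.normal(size=(n, k))])
beta = np.array([1.0, 2.0, -0.5])
y = X @ beta + rng.normal(scale=0.5, size=n)          # sigma^2 = 0.25

beta_hat = np.linalg.solve(X.T @ X, X.T @ y)           # least squares
resid = y - X @ beta_hat                               # residual estimates
s2 = resid @ resid / (n - k - 1)                       # bias-corrected variance
cov_beta_hat = s2 * np.linalg.inv(X.T @ X)             # estimated cov of beta_hat
std_errs = np.sqrt(np.diag(cov_beta_hat))              # standard errors
```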