Statistics 350 Lecture 13
Today
Last Day: some of Chapter 4 and the start of Chapter 5
Today: some matrix results
Mid-Term Friday ... Sections ; ; (READ)
Matrices
Let A be a square matrix. The inverse of A is the matrix $A^{-1}$ satisfying $AA^{-1} = A^{-1}A = I$.
Matrices
If A contains any linear dependencies, then A is singular and $A^{-1}$ does not exist. We will deal mainly with non-singular matrices.
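As a quick numerical aside (not part of the lecture), NumPy can compute an inverse and will refuse to invert a matrix whose columns are linearly dependent; the matrices below are made up for illustration:

```python
import numpy as np

# A non-singular 2x2 matrix: its inverse exists and A @ A_inv = I
A = np.array([[2.0, 1.0],
              [1.0, 3.0]])
A_inv = np.linalg.inv(A)
print(np.allclose(A @ A_inv, np.eye(2)))  # True

# A matrix with a linear dependency (second row = 3 * first row)
# is singular, so inversion fails
B = np.array([[1.0, 2.0],
              [3.0, 6.0]])
try:
    np.linalg.inv(B)
except np.linalg.LinAlgError:
    print("B is singular: no inverse exists")
```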
Matrices
A special application is the model matrix for simple linear regression:
$$X = \begin{bmatrix} 1 & x_1 \\ 1 & x_2 \\ \vdots & \vdots \\ 1 & x_n \end{bmatrix}$$
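Building this model matrix in code just stacks a column of ones beside the predictor column. A minimal NumPy sketch (the x values are hypothetical):

```python
import numpy as np

x = np.array([4.0, 1.0, 2.0, 3.0, 3.0, 4.0])  # hypothetical predictor values
X = np.column_stack([np.ones_like(x), x])      # first column: 1s; second: x_i
print(X.shape)  # (6, 2): n rows, one column per parameter
```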
Matrices
Other useful results:
Random Vectors
A vector of random variables is called a random vector.
Expectation: $E(Y) = [E(Y_1), E(Y_2), \ldots, E(Y_n)]'$
Random Vectors
If A is a vector of constants, then $E(A) = A$.
If A is a matrix of constants and Y is a random vector, then $E(AY) = A\,E(Y)$.
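The rule $E(AY) = A\,E(Y)$ can be checked by simulation: draw many realizations of a Y with known mean, transform each by A, and compare the empirical mean of AY to $A\,E(Y)$ (the mean vector and A below are arbitrary choices for illustration):

```python
import numpy as np

rng = np.random.default_rng(1)

mu = np.array([1.0, 2.0])              # E(Y), chosen for illustration
A = np.array([[2.0, 0.0],
              [1.0, 3.0]])             # a constant matrix

Y = mu + rng.standard_normal((100_000, 2))   # random vectors with mean mu
AY = Y @ A.T                                 # each row is A times one Y

print(AY.mean(axis=0))                 # empirical E(AY), close to...
print(A @ mu)                          # ...the theoretical A E(Y)
```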
Random Vectors
The variance-covariance matrix of Y is: $\sigma^2(Y) = E\{[Y - E(Y)][Y - E(Y)]'\}$
If A is a vector of constants, its variance-covariance matrix is $\sigma^2(A) = 0$.
Random Vectors
If A is a matrix of constants and Y is a random vector, then $\sigma^2(AY) = A\,\sigma^2(Y)\,A'$.
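This result can also be checked by simulation: generate many Y vectors with a known covariance matrix, transform them by A, and compare the sample covariance of AY with $A\,\sigma^2(Y)\,A'$ (the particular Sigma and A are illustrative, not from the lecture):

```python
import numpy as np

rng = np.random.default_rng(0)

Sigma = np.array([[2.0, 0.5],
                  [0.5, 1.0]])          # sigma^2(Y), chosen for illustration
A = np.array([[1.0, 1.0],
              [1.0, -1.0]])            # a constant matrix

# Simulate many realizations of Y ~ N(0, Sigma), then form AY
Y = rng.multivariate_normal(np.zeros(2), Sigma, size=200_000)
AY = Y @ A.T

empirical = np.cov(AY, rowvar=False)   # sample variance-covariance of AY
theoretical = A @ Sigma @ A.T          # A sigma^2(Y) A'
print(np.round(empirical, 2))
print(np.round(theoretical, 2))
```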
Simple Linear Regression
The model is: $Y = X\beta + \varepsilon$, with $E(\varepsilon) = 0$ and $\sigma^2(\varepsilon) = \sigma^2 I$.
Simple Linear Regression
$E(Y) = X\beta$ and $\sigma^2(Y) = \sigma^2 I$.
Simple Linear Regression
Now, how do we represent least squares estimation in matrix notation?
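Anticipating the answer (this previews upcoming material rather than restating this slide), the standard matrix form of the least squares estimate is $b = (X'X)^{-1}X'Y$. A NumPy sketch with made-up data, checked against NumPy's own line fitter:

```python
import numpy as np

# Hypothetical data; in practice x and Y come from the study at hand
x = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
Y = np.array([2.1, 3.9, 6.2, 8.1, 9.8])

X = np.column_stack([np.ones_like(x), x])  # model matrix
b = np.linalg.solve(X.T @ X, X.T @ Y)      # solves (X'X) b = X'Y
print(b)                                   # [intercept, slope]
```

Using `np.linalg.solve` on the normal equations avoids forming the inverse explicitly, which is the numerically preferred way to evaluate $(X'X)^{-1}X'Y$.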