Regression.

Regression

Remembering the basics of linear regression Hypothesis: a variable (or variables) x (x1, x2, x3, …) causes another variable y Corollary: x can be used to partially predict y Mathematical implication: y = α + βx + ε, where the relationship between x and y is linear and the error ε is minimal and random

Example

Example: Regression approach Error in prediction is minimal and random

The way these assumptions look: Child’s IQ = 20.99 + 0.78 × Mother’s IQ

Prediction Child’s IQ = 20.99 + 0.78 × Mother’s IQ Predicted Case 3 IQ = 20.99 + 0.78 × 110 ≈ 106.83 (the slide’s value, from unrounded coefficients; with the rounded values shown, 20.99 + 0.78 × 110 = 106.79) Actual Case 3 IQ = 102
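The prediction step above can be sketched in a few lines (coefficients 20.99 and 0.78 are the slide’s rounded estimates; the data for Case 3 come from the slide):

```python
# Predicting a child's IQ from the slide's fitted regression line.
def predict_child_iq(mother_iq):
    """Prediction from the fitted line: 20.99 + 0.78 * mother's IQ."""
    return 20.99 + 0.78 * mother_iq

pred = predict_child_iq(110)   # Case 3: mother's IQ = 110
actual = 102                   # Case 3: actual child IQ (from the slide)
residual = actual - pred       # prediction error for this case
print(round(pred, 2))          # 106.79 with the rounded coefficients
print(round(residual, 2))
```

The residual (actual minus predicted) is exactly the ε term the model treats as minimal and random.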

How are the regression coefficients computed? MINIMIZE SQUARED DEVIATIONS BETWEEN ACTUAL AND PREDICTED VALUES, so that ε is “minimal” and random
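Minimizing the squared deviations has a closed-form solution in the simple (one-predictor) case. A minimal sketch, with data invented for illustration (not the slide’s mother/child IQ sample):

```python
import numpy as np

# Made-up data for illustration.
x = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
y = np.array([2.1, 3.9, 6.2, 8.1, 9.8])

# Closed-form least-squares solution for simple regression:
#   b = sum((x - xbar)(y - ybar)) / sum((x - xbar)^2),   a = ybar - b * xbar
b = np.sum((x - x.mean()) * (y - y.mean())) / np.sum((x - x.mean()) ** 2)
a = y.mean() - b * x.mean()

residuals = y - (a + b * x)
print(round(b, 3), round(a, 3))
print(round(residuals.sum(), 10))  # OLS residuals sum to (numerically) zero
```

No other choice of a and b yields a smaller sum of squared residuals; that is what “least squares” means.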

Interpreting coefficients

Error in estimation of b The estimate of b will differ from sample to sample: there is sampling error in the estimate, so b is generally not equal to the population value of the slope (B). If we take many, many simple random samples and estimate b each time, the estimates form a sampling distribution centered on B.
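The “many, many samples” idea can be simulated directly. A sketch with an invented population (true slope B = 2.0, chosen only for the demonstration):

```python
import numpy as np

# Simulate the sampling distribution of the estimated slope b.
rng = np.random.default_rng(0)
B_true, alpha_true, sigma = 2.0, 1.0, 3.0   # invented population parameters

slopes = []
for _ in range(2000):                        # many, many samples
    x = rng.uniform(0, 10, size=50)
    y = alpha_true + B_true * x + rng.normal(0, sigma, size=50)
    b = np.sum((x - x.mean()) * (y - y.mean())) / np.sum((x - x.mean()) ** 2)
    slopes.append(b)

slopes = np.array(slopes)
print(round(slopes.mean(), 2))  # close to B_true: the estimator is unbiased
print(round(slopes.std(), 2))   # spread = the standard error of b
```

The standard deviation of these simulated estimates is what the next slide’s standard-error formula estimates from a single sample.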

Standard error of b
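The slide’s formula did not survive the transcript; the usual estimate is SE(b) = s / √Σ(x − x̄)², with s² = SSE / (n − 2). A sketch with made-up data:

```python
import numpy as np

# Made-up data for illustration.
x = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
y = np.array([2.1, 3.9, 6.2, 8.1, 9.8])

# Fit by least squares.
b = np.sum((x - x.mean()) * (y - y.mean())) / np.sum((x - x.mean()) ** 2)
a = y.mean() - b * x.mean()
resid = y - (a + b * x)

# Standard error of b: residual variance uses n - 2 degrees of freedom
# (two coefficients were estimated).
s2 = np.sum(resid ** 2) / (len(x) - 2)
se_b = np.sqrt(s2 / np.sum((x - x.mean()) ** 2))
print(round(se_b, 4))  # ≈ 0.055 for these data
```

A larger spread in x (bigger Σ(x − x̄)²) or smaller residual variance both shrink the standard error.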

Variance decomposition and R²
Total variance = variance due to regression + error variance
1 = proportion of variance due to regression + proportion of variance due to error
R² = 1 − (sum of squared errors of regression) / (sum of squared deviations from the mean)
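The decomposition can be verified numerically: SST = SSR + SSE and R² = 1 − SSE/SST. A sketch with made-up data:

```python
import numpy as np

# Made-up data for illustration.
x = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
y = np.array([2.1, 3.9, 6.2, 8.1, 9.8])

# Fit by least squares and compute fitted values.
b = np.sum((x - x.mean()) * (y - y.mean())) / np.sum((x - x.mean()) ** 2)
a = y.mean() - b * x.mean()
yhat = a + b * x

sst = np.sum((y - y.mean()) ** 2)     # total sum of squares
ssr = np.sum((yhat - y.mean()) ** 2)  # regression (explained) sum of squares
sse = np.sum((y - yhat) ** 2)         # error sum of squares

print(round(sst, 3), round(ssr + sse, 3))  # decomposition: SST = SSR + SSE
print(round(1 - sse / sst, 4))             # R^2
```

Dividing the decomposition through by SST gives the “1 = proportion due to regression + proportion due to error” identity on the slide.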

Multiple Regression

Multiple regression
Purpose:
Include “relevant” predictors
Reduce error in prediction
Test hypotheses about related independent variables

Matrices

Questions about matrices
If A is a 3×4 matrix, what are the dimensions of A′ (A transposed)?
If A is a 4×5 matrix and B is a column vector with 5 elements (5×1), what are the dimensions of C = A×B?
If A is a 3×2 matrix, what are the dimensions of A⁻¹?
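The quiz answers can be checked numerically (matrix contents are arbitrary; only the shapes matter):

```python
import numpy as np

A = np.ones((3, 4))
print(A.T.shape)          # transpose of a 3x4 matrix is 4x3

A2 = np.ones((4, 5))
B = np.ones((5, 1))       # a column vector with 5 elements
print((A2 @ B).shape)     # (4x5)(5x1) -> 4x1

# A 3x2 matrix has no inverse: only square matrices can be inverted.
invertible = True
try:
    np.linalg.inv(np.ones((3, 2)))
except np.linalg.LinAlgError:
    invertible = False
print(invertible)
```

The last answer matters for OLS: the information matrix X′X must be square (and non-singular) before it can be inverted.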

Vectors and Matrices

Multiple regression – OLS estimation

Multiple regression – OLS estimation y = Xβ + ε, where y is n×1, X is n×k, β is k×1, and ε is n×1

Multiple regression – OLS estimation b = (X′X)⁻¹X′y Dimension check: (k×1) ← {(k×n)×(n×k)}⁻¹ × {(k×n)×(n×1)} = {(k×k)} × {(k×1)}

Multiple regression – OLS estimation The information matrix (X′X): each element is a cross-product term
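The matrix estimator b = (X′X)⁻¹X′y can be sketched directly. Data here are invented: n = 6 observations, k = 3 columns (an intercept and two predictors), with arbitrarily chosen true coefficients:

```python
import numpy as np

rng = np.random.default_rng(1)
n = 6
X = np.column_stack([np.ones(n),                 # intercept column
                     rng.normal(size=n),         # predictor 1
                     rng.normal(size=n)])        # predictor 2
y = X @ np.array([1.0, 2.0, -0.5]) + rng.normal(scale=0.1, size=n)

info = X.T @ X                      # information matrix: k x k cross-products
b = np.linalg.solve(info, X.T @ y)  # solve (X'X) b = X'y
print(b.shape)                      # (3,): one coefficient per column of X
```

Using `solve` on the normal equations rather than forming the explicit inverse is the numerically preferred route; either way, the OLS residuals y − Xb are orthogonal to every column of X.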

OLS estimation: important points
The information matrix must be inverted, so it cannot be “singular”: no two of its rows or columns can be (almost) identical.
Every regression coefficient depends on the entire information matrix.
Every regression coefficient depends on the covariances of all Xs with Y.
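Singularity is easy to provoke: if two columns of X are identical (perfect collinearity), the information matrix cannot be inverted. A small demonstration with made-up numbers:

```python
import numpy as np

x1 = np.array([1.0, 2.0, 3.0, 4.0])
X = np.column_stack([np.ones(4), x1, x1])  # third column duplicates the second

info = X.T @ X                             # two identical rows/columns
print(round(np.linalg.det(info), 10))      # determinant is 0: singular

singular = False
try:
    np.linalg.inv(info)
except np.linalg.LinAlgError:
    singular = True
print(singular)
```

This is the mechanical reason behind the “no multicollinearity” assumption on the next slide: with duplicated (or nearly duplicated) predictors, the coefficients are not uniquely determined.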

Assumptions of multiple regression
Equal probability of selection (SRS)
Linearity
Independence of observations: errors are uncorrelated
Zero-mean errors: the mean of the error term is ALWAYS zero and does not depend on x
Normality of errors
Homoskedasticity: the error variance does not depend on x
No multicollinearity