PSY 1950 General Linear Model November 12, 2008

The General Linear Model Or, What the Hell's Going on During Estimation?

Generalized linear models
–General linear model
–Multiple regression
–Simple regression
–ANOVA

Motivation
Benefits of the GLM approach over the variance-ratio method:
–Efficiency
–Easier in the case of unequal sample sizes
–ANCOVA
–Present and future statistical techniques and software
Goal: a vague understanding

History
Correlational versus experimental approach
–Correlational: Does the value of Y change with the value of X?
–Experimental: Does the mean value of Y change with the category of X?
Computer revolution
–ANOVA/regression calculations based on matrix algebra

General Linear Model
General model
–Categorical or continuous predictors
Linear model
–Parameters are not multiplied by other parameters, e.g., NOT Y = abX
–Parameters appear only to the first power, e.g., NOT Y = b²X
–Parameters are not exponents, e.g., NOT Y = X^b
–Variables do not need to satisfy the above criteria: transformation is a workaround, e.g., Y = bX² can be rewritten as Y = bZ with Z = X² (see the sketch below)
–A straight-line relationship is not necessary
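A minimal sketch of the transformation workaround, assuming made-up data and a made-up coefficient (b = 3): the model Y = bX² is nonlinear in X but linear in the parameter b, so regressing Y on Z = X² recovers b with ordinary least squares.

```python
import numpy as np

# Hypothetical data generated from Y = b*X^2 + error, with b = 3 (made-up value)
rng = np.random.default_rng(0)
x = rng.uniform(0, 5, size=100)
y = 3 * x**2 + rng.normal(scale=1.0, size=100)

# The model is nonlinear in X but linear in the parameter b,
# so transform the predictor (Z = X^2) and fit Y = b*Z by least squares
z = x**2
b_hat, *_ = np.linalg.lstsq(z.reshape(-1, 1), y, rcond=None)
print(b_hat)  # close to 3
```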

The Foundation of Statistics datum = model + error

The Simplest Model datum = mean + error

Regression Model Y = bX + a + e

The Foundation of Statistics datum – model = error

Method of Least Squares ∑(datum – model)² = ∑error²
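Applied to the simplest model above (datum = mean + error), this criterion has a closed-form answer: the estimate that minimizes the summed squared error is the sample mean. A one-line check (standard result, not on the slide):

```latex
\frac{d}{dm}\sum_{i=1}^{n}(Y_i - m)^2 = -2\sum_{i=1}^{n}(Y_i - m) = 0
\quad\Longrightarrow\quad
\hat{m} = \frac{1}{n}\sum_{i=1}^{n} Y_i = \bar{Y}
```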

Method of Least Squares Regression ∑(Y – bX – a)² = ∑e², i.e., ∑(Y – Ŷ)² = ∑e²
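Setting the derivatives of ∑(Y – bX – a)² with respect to a and b to zero gives the familiar closed-form estimates (standard textbook formulas, added for reference):

```latex
\hat{b} = \frac{\sum_i (X_i - \bar{X})(Y_i - \bar{Y})}{\sum_i (X_i - \bar{X})^2},
\qquad
\hat{a} = \bar{Y} - \hat{b}\,\bar{X},
\qquad
\hat{Y}_i = \hat{b}X_i + \hat{a}
```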

One-way ANOVA as GLM datum = model + error

Reduced model vs. full model: by comparing the error estimates made by these two models, we can assess how much variance the grouping parameter explained.
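This comparison is usually formalized as a model-comparison F test (standard formula; the symbols below are generic, not taken from the slide):

```latex
F = \frac{\bigl(SS_{\text{error}}^{\text{reduced}} - SS_{\text{error}}^{\text{full}}\bigr)\,/\,\bigl(df^{\text{reduced}} - df^{\text{full}}\bigr)}
         {SS_{\text{error}}^{\text{full}}\,/\,df^{\text{full}}}
```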

t-Test as Regression Y = bX + a + e Code grouping variable (X) as 0/1

Example Libido = b(Viagra) + a + e

t-Test as Regression
What do the following represent?
–slope value
–intercept value
–slope p-value
–intercept p-value
–model significance
How would the above change if you coded the groups differently?
–1, 0
–−1, 1
–1, 2
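A minimal sketch of the equivalence, assuming invented libido scores and a 0/1 Viagra code (the group sizes and values below are made up for illustration): the regression slope equals the difference in group means, and its t and p match the independent-samples t test.

```python
import numpy as np
import statsmodels.api as sm
from scipy import stats

# Invented libido scores: placebo group coded 0, Viagra group coded 1
libido = np.array([3, 2, 1, 1, 4, 5, 2, 4, 2, 3], dtype=float)
viagra = np.array([0, 0, 0, 0, 0, 1, 1, 1, 1, 1], dtype=float)

# Regression: Libido = b*(Viagra) + a + e
X = sm.add_constant(viagra)      # column of 1s (intercept a) plus the 0/1 predictor
fit = sm.OLS(libido, X).fit()
print(fit.params)                # intercept = placebo mean, slope = difference in means
print(fit.tvalues, fit.pvalues)  # slope t and p

# The independent-samples t test gives the same t and p for the group difference
print(stats.ttest_ind(libido[viagra == 1], libido[viagra == 0]))
```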

One-Way ANOVA as Regression
Y = b₁X₁ + b₂X₂ + … + bₚXₚ + a + e
The number of predictors (p) is one less than the number of groups (k)
–Just as one predictor distinguishes two groups, two predictors distinguish three groups, and so on
–Any redundancy creates multiple solutions to the least-squares minimization

Example Libido = b₁(Viagra) + b₂(High) + a + e

What do the following represent?
–regression coefficients
–intercept value
–coefficient p-values
–intercept p-value
–model significance
How would the above change if you coded the groups differently?
–original: 00, 10, 01
–00, 20, 02
–01, 10, , 10, 01
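A minimal sketch with three groups (placebo, low dose, high dose); the data and the variable names viagra_low/viagra_high are invented: two dummy predictors with placebo as the comparison group reproduce the one-way ANOVA F.

```python
import numpy as np
import statsmodels.api as sm
from scipy import stats

# Invented libido scores for three groups of five participants each
placebo = np.array([3, 2, 1, 1, 4], dtype=float)
low     = np.array([5, 2, 4, 2, 3], dtype=float)
high    = np.array([7, 4, 5, 3, 6], dtype=float)
libido  = np.concatenate([placebo, low, high])

# Dummy coding: k - 1 = 2 predictors, placebo = comparison group (coded 0, 0)
viagra_low  = np.r_[np.zeros(5), np.ones(5),  np.zeros(5)]
viagra_high = np.r_[np.zeros(5), np.zeros(5), np.ones(5)]

X = sm.add_constant(np.column_stack([viagra_low, viagra_high]))
fit = sm.OLS(libido, X).fit()
print(fit.params)                # intercept = placebo mean; slopes = differences from placebo
print(fit.fvalue, fit.f_pvalue)  # overall model F and its p value

# One-way ANOVA gives the same F and p
print(stats.f_oneway(placebo, low, high))
```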

Coding
Dummy
–Comparison group coded with all zeros
–Variables named after the conditions of interest
–Data coded 1/0 based on membership
Effect
–One group coded with all −1s; it can be a group that is uninteresting conceptually or whose mean is close to the grand mean
–Variables named after the conditions of interest
–Remaining data coded 1/0 based on membership
Contrast
–Data coded based upon an expected pattern (e.g., 1, 1, −2)
–The weights must sum to zero
–Additional variables must represent orthogonal contrasts

Coding
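A minimal sketch of the three coding schemes for k = 3 groups (placebo, low dose, high dose); the specific contrast weights below are just one valid orthogonal choice, not necessarily the ones used in class.

```python
import numpy as np

# Rows = placebo, low dose, high dose; columns = the k - 1 = 2 coding variables

# Dummy coding: comparison group (placebo) gets all zeros
dummy = np.array([[0, 0],
                  [1, 0],
                  [0, 1]])

# Effect coding: one group (here placebo) gets all -1s
effect = np.array([[-1, -1],
                   [ 1,  0],
                   [ 0,  1]])

# Contrast coding: each column sums to zero and the columns are orthogonal
# (column 1: drug vs. placebo; column 2: high vs. low dose)
contrast = np.array([[-2,  0],
                     [ 1, -1],
                     [ 1,  1]])

print(contrast.sum(axis=0))             # [0 0] -- weights sum to zero
print(contrast[:, 0] @ contrast[:, 1])  # 0 -- orthogonal contrasts
```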