Generalized Linear Mixed Models

Generalized Linear Mixed Models Bodo Winter

Regression with categorical predictors

Output for categorical predictors
Coefficients:
             Estimate Std. Error t value Pr(>|t|)
(Intercept)   109.287      2.954   37.00   <2e-16 ***
genderFemale  112.727      4.178   26.98   <2e-16 ***
The males are in the intercept (= 109). The slope for "female" is the change with respect to the male group.

Default: Treatment coding
    pitch gender gender01
1  123.71   male        0
2  104.35   male        0
3  113.63   male        0
4  116.33   male        0
5  224.04 female        1
6  218.94 female        1
7  235.12 female        1
8  219.05 female        1
The 0/1 codes are called dummy codes. By default, R uses treatment coding and assumes that the alphanumerically first level is the reference level.
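A minimal sketch in base R that rebuilds the example data from this slide and inspects the default dummy coding; the object name xdata is an assumption. Note that with these lowercase labels "female" is alphanumerically first and would be the default reference level unless releveled, as shown on the next slide.
xdata <- data.frame(
  pitch  = c(123.71, 104.35, 113.63, 116.33, 224.04, 218.94, 235.12, 219.05),
  gender = factor(c(rep("male", 4), rep("female", 4)))
)
contrasts(xdata$gender)                    # the 0/1 dummy codes R will use
summary(lm(pitch ~ gender, data = xdata))  # regression with the dummy-coded factor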

Changing the reference level
xdata$myfac = relevel(xdata$myfac, ref = "male")
Before releveling:
                 Estimate Std. Error t value Pr(>|t|)
(Intercept)       109.287      2.954   37.00   <2e-16 ***
genderFemale      112.727      4.178   26.98   <2e-16 ***
After releveling:
(Intercept)       222.014      2.954   75.16   <2e-16 ***
myfactor.revmale -112.727      4.178  -26.98   <2e-16 ***

Other coding schemes
With sum coding, the intercept is now the mean of all pitch values (ignoring gender), and the gender coefficient is now the difference from that mean (≈ 55).
A related scheme that codes the two levels as -0.5 and +0.5 is called deviation coding.

Female = 0, Male = 1 (Treatment coding)
             Estimate Std. Error t value Pr(>|t|)
(Intercept)   224.287      3.902   57.48 1.86e-09 ***
gender01     -109.783      5.518  -19.89 1.05e-06 ***
Female = -1, Male = 1 (Sum coding)
(Intercept)   169.396      2.759   61.40 1.25e-09 ***
gender01      -54.891      2.759  -19.89 1.05e-06 ***
Female = -0.5, Male = 0.5 (Deviation coding)
gender01.2   -109.783      5.518  -19.89 1.05e-06 ***
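As a hedged sketch (reusing the xdata object assumed earlier), the three codings compared on this slide can be set by hand like this; which level ends up with the positive code depends on the factor's level order.
contrasts(xdata$gender) <- contr.treatment(2)    # 0 / 1: treatment (dummy) coding
contrasts(xdata$gender) <- contr.sum(2)          # +1 / -1: sum coding
contrasts(xdata$gender) <- contr.sum(2) / 2      # +0.5 / -0.5: deviation coding
summary(lm(pitch ~ gender, data = xdata))        # refit after changing the contrasts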

An extended linear model
Y ~ b0 + b1*X1 + b2*X2 + error
The b's are coefficients; the predictors (X's) can be continuous or categorical, and there can be (in principle) arbitrarily many of them.
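A minimal simulated sketch of such a model with one continuous and one categorical predictor; all variable names here are illustrative assumptions, not taken from the slides.
set.seed(1)
df <- data.frame(
  x1 = rnorm(40),                          # continuous predictor
  x2 = factor(rep(c("a", "b"), 20))        # categorical predictor
)
df$y <- 2 + 1.5 * df$x1 + ifelse(df$x2 == "b", 3, 0) + rnorm(40)
summary(lm(y ~ x1 + x2, data = df))        # Y ~ b0 + b1*X1 + b2*X2 + error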

Interactions

Continuous * categorical interaction
Without the interaction: RT ~ Noise + Gender
With the interaction: RT ~ Noise + Gender + Noise:Gender, or equivalently RT ~ Noise * Gender
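A hedged simulated sketch showing that the two formula notations fit exactly the same model; the coefficient values used to generate the data mirror the example on the following slides, and the data frame name dat is an assumption.
set.seed(2)
dat <- data.frame(
  Noise  = runif(60, 0, 80),
  Gender = factor(rep(c("F", "M"), 30))
)
dat$RT <- 500 + 2 * dat$Noise - 150 * (dat$Gender == "F") +
  5 * dat$Noise * (dat$Gender == "F") + rnorm(60, sd = 20)
m1 <- lm(RT ~ Noise + Gender + Noise:Gender, data = dat)  # interaction written out
m2 <- lm(RT ~ Noise * Gender, data = dat)                 # '*' = main effects + interaction
all.equal(coef(m1), coef(m2))                             # TRUE: identical coefficients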

Schielzeth (2010) example Schielzeth, H. (2010). Simple means to improve the interpretability of regression coefficients. Methods in Ecology and Evolution, 1(2), 103-113.

Interpreting cont * cat interactions
Example coefficients: Intercept 500, Noise 2, GenderF -150, Noise:GenderF 5
Y ~ b0 + b1*X1 + b2*X2 + b3*(X1*X2), where b3*(X1*X2) is the interaction term.
Say you wanted the prediction for noise = 10 for females; plug in the coefficients step by step:
Y ~ 500 + b1*X1 + b2*X2 + b3*(X1*X2)
Y ~ 500 + 2*10 + b2*X2 + b3*(X1*X2)
Y ~ 500 + 2*10 + (-150)*1 + b3*(X1*X2)
Y ~ 500 + 2*10 + (-150)*1 + 5*(10*1) = 420
("1" for female and "10" for the noise value we wanted.)
What about the men? The gender dummy is 0, so the gender and interaction terms drop out:
Y ~ 500 + 2*10 + (-150)*0 + 5*(0*10) = 520
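A minimal sketch of the same hand calculation in R, using the coefficient values from this example.
b0 <- 500; b_noise <- 2; b_genderF <- -150; b_interaction <- 5
noise <- 10
y_female <- b0 + b_noise * noise + b_genderF * 1 + b_interaction * (noise * 1)  # 420
y_male   <- b0 + b_noise * noise + b_genderF * 0 + b_interaction * (noise * 0)  # 520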

Interactions between continuous variables
(figures contrasting "No interaction" and "Interaction")

Interpreting continuous interactions
Coefficients:
                   Estimate Std. Error t value Pr(>|t|)
(Intercept)         4.75621    1.50160   3.167   0.0037 **
word_frequency     -1.63550    0.87642  -1.866   0.0725 .
CD                  2.45615    0.03699  66.404   <2e-16 ***
word_frequency:CD   1.50333    0.02119  70.943   <2e-16 ***
Y ~ b0 + b1*X1 + b2*X2 + b3*(X1*X2)
With rounded coefficients: Y ~ 4.8 + (-1.6)*X1 + 2.5*X2 + 1.5*(X1*X2)
Prediction for word frequency 3 and CD 50:
Y = 4.8 + (-1.6)*3 + 2.5*50 + 1.5*(3*50) = 350
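A minimal sketch of the same prediction in R, using the rounded coefficients from this slide.
b0 <- 4.8; b_freq <- -1.6; b_cd <- 2.5; b_interaction <- 1.5
freq <- 3; cd <- 50
y <- b0 + b_freq * freq + b_cd * cd + b_interaction * (freq * cd)  # 4.8 - 4.8 + 125 + 225 = 350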

Sign of the interaction
Coefficients: (Intercept) 10, A +1, B +1, A:B +1
A and B each have a positive effect on the response, and together they increase it by more than the sum of their individual effects.

Coefficients: (Intercept) 10, A +1, B +1, A:B -1
A and B each have a positive effect on the response, but together they increase it by less than the sum of their individual effects.

Coefficients: (Intercept) 10, A -1, B -1, A:B +1
A and B each have a negative effect on the response, but together they decrease it by less than the sum of their individual effects.

Categorical predictors: Simple vs. Main effects
Simple effect = the effect of A at a specific level of B
Main effect = the effect of A averaging over all levels of B

2 X 2 Example output with treatment coding
attitude = "inf" or "pol"; gender = "F" or "M"
                     Estimate
(Intercept)               260
attitudepol               -30
genderM                  -120
attitudepol:genderM        15
Cell predictions:
          informal                 polite
female    260                      260 - 30 = 230
male      260 - 120 = 140          260 - 30 - 120 + 15 = 125
The opposite sign of the interaction means that the effect of politeness is smaller for males.
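A minimal sketch reconstructing the four cell means in R from the treatment-coded coefficients above.
b0 <- 260; b_pol <- -30; b_male <- -120; b_pol_male <- 15
female_informal <- b0                                 # 260
female_polite   <- b0 + b_pol                         # 230
male_informal   <- b0 + b_male                        # 140
male_polite     <- b0 + b_pol + b_male + b_pol_male   # 125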

Increasing interpretability with sum or deviation coding
Treatment coding:
                     Estimate
(Intercept)               260
attitudepol               -30
genderM                  -120
attitudepol:genderM        15
These are not main effects, they are simple effects! That is, they are only the differences between specific levels (not the mean differences of attitude/gender).

How to change the coding in R
For a two-level factor: contrasts(df$A) = contr.sum(2)
For a three-level factor: contrasts(df$B) = contr.sum(3)
etc. …
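As a hedged usage sketch (hypothetical data frame df with the attitude, gender, and pitch variables from the earlier example): after sum coding both factors, the lower-order terms of an interaction model are main effects averaged over the other factor, rather than simple effects.
contrasts(df$attitude) <- contr.sum(2)             # sum-code both two-level factors
contrasts(df$gender)   <- contr.sum(2)
summary(lm(pitch ~ attitude * gender, data = df))  # lower-order terms are now main effects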

How classical tests map to their LM framework counterparts
One-sample t-test against 0:
  lm(y ~ 1)              # t.test(y, mu = 0)
Two-sample t-test (unpaired / independent):
  lm(y ~ group)          # t.test(x, y, paired = F)
Paired t-test:
  lm(differences ~ 1)    # t.test(x, y, paired = T)
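A hedged check of the two-sample row with simulated data: lm(y ~ group) reproduces the classical equal-variance t-test; note that t.test() defaults to Welch's unequal-variance version, so var.equal = TRUE is needed for an exact match.
set.seed(3)
y <- c(rnorm(10, mean = 100), rnorm(10, mean = 110))
group <- factor(rep(c("a", "b"), each = 10))
summary(lm(y ~ group))$coefficients   # the slope row tests the group difference
t.test(y ~ group, var.equal = TRUE)   # same p-value; t statistic identical up to sign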

How classical tests map to their LM framework counterparts
One-way ANOVA etc.:
  lm(y ~ factor)                 # aov(y ~ x)
ANCOVA:
  lm(y ~ factor * covariate)     # ?
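Similarly, a hedged check of the one-way ANOVA row with simulated data: anova() on the lm fit and summary() on the aov fit give the same F test.
set.seed(4)
y <- rnorm(30, mean = rep(c(10, 12, 15), each = 10))
g <- factor(rep(c("a", "b", "c"), each = 10))
anova(lm(y ~ g))      # F test from the LM framework
summary(aov(y ~ g))   # classical one-way ANOVA: same F and p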

Exercise: working memory data