
Lecture 4 Linear random coefficients models

Rats example
30 young rats, weights measured weekly for five weeks.
Dependent variable Y_ij is the weight of rat i at week j.
Data are multilevel: weights (observations) nested within rats (clusters).

Individual & population growth
Rat i has its own expected growth line. There is also an overall, average population growth line.
[Figure: individual growth lines around the population line (average growth); weight vs. study day (centered)]

Improving individual-level estimates
Possible analyses:
1. Each rat (cluster) has its own line: intercept = b_i0, slope = b_i1.
2. All rats follow the same line: b_i0 = β0, b_i1 = β1.
3. A compromise between these two: each rat has its own line, BUT the lines come from an assumed distribution:
E(Y_ij | b_i0, b_i1) = b_i0 + b_i1 X_j
b_i0 ~ N(β0, τ0²)
b_i1 ~ N(β1, τ1²)
These are the "random effects".
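The compromise in analysis 3 is what the rest of the lecture calls shrinkage. As a rough illustration (a sketch, not the lecture's Stata code; the parameter values and names such as n_rats and eb_intercepts are invented), the following simulates the rats setup, fits a separate OLS line per rat, and shrinks the OLS intercepts toward the grand mean:

```python
import numpy as np

rng = np.random.default_rng(0)
weeks = np.arange(5.0)                       # five weekly measurements
n_rats = 30

# True random coefficients: b_i0 ~ N(beta0, tau0^2), b_i1 ~ N(beta1, tau1^2)
beta0, beta1, tau0, tau1, sigma = 100.0, 6.0, 10.0, 0.5, 2.0
b0 = rng.normal(beta0, tau0, n_rats)
b1 = rng.normal(beta1, tau1, n_rats)
y = b0[:, None] + b1[:, None] * weeks + rng.normal(0.0, sigma, (n_rats, weeks.size))

# Analysis 1: each rat gets its own OLS line
X = np.column_stack([np.ones_like(weeks), weeks])
ols = np.linalg.lstsq(X, y.T, rcond=None)[0].T        # row i = (intercept_i, slope_i)

# Analysis 3, crude one-dimensional version: shrink each OLS intercept toward
# the grand mean, weighting by between-rat variance vs. sampling variance
var_ols = sigma**2 * np.linalg.inv(X.T @ X)[0, 0]     # sampling variance of an OLS intercept
w = tau0**2 / (tau0**2 + var_ols)                     # shrinkage weight in (0, 1)
eb_intercepts = w * ols[:, 0] + (1.0 - w) * ols[:, 0].mean()
```

Because the between-rat variance here is large relative to the sampling noise, w is close to 1 and each line moves only slightly; with noisier data the individual lines would be pulled much harder toward the population line.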

Bayes-shrunk individual growth lines
A compromise: each rat has its own line, but information is borrowed across rats to tell us about individual rat growth.
[Figure: Bayes-shrunk individual growth lines around the population line (average growth); weight vs. study day (centered)]

Bayes-shrunk growth lines
The Bayesian paradigm provides methods for "borrowing strength" or "shrinking".
[Figure: raw individual growth lines vs. Bayes-shrunk growth lines, each shown with the population line (average growth); weight vs. study day (centered)]

Inner-London school data: how effective are the different schools? (gcse.dat, Chap. 3)
Outcome: exam score at age 16 (gcse). Data are clustered within schools.
Covariate: reading test score at age 11, prior to enrolling in the school (lrt).
Goal: examine the relationship between the exam score at age 16 and the reading score at age 11, and investigate how this association varies across schools.

Fig 3.1: Scatterplot of gcse vs lrt for school 1, with regression line

Linear regression model with random intercept and random slope (lrt centered):
y_ij = β1 + β2 x_ij + ζ1j + ζ2j x_ij + ε_ij
where ζ1j is the random intercept and ζ2j the random slope for school j.

Alternative representation of the linear regression model with random intercept and random slope:
y_ij = β_1j + β_2j x_ij + ε_ij, with β_1j = β1 + ζ1j and β_2j = β2 + ζ2j.

Fig 3.3: Scatterplot of intercepts and slopes for all schools with at least 5 students

Linear regression model with random intercept and random slope: the total residual ζ1j + ζ2j x_ij + ε_ij has variance
τ1² + 2 τ12 x_ij + τ2² x_ij² + σ²,
which is said to be heteroskedastic because it depends on x. In the model with random intercept only, the total residual variance τ1² + σ² is constant.
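This heteroskedasticity is easy to check numerically. Writing the total residual variance as τ1² + 2·τ12·x + τ2²·x² + σ², a minimal sketch (the variance-component values are invented, not taken from the gcse results):

```python
tau1, tau2, tau12, sigma = 3.0, 0.5, -0.4, 2.0   # invented variance components

def total_residual_var(x, random_slope=True):
    """Variance of zeta_1j + zeta_2j * x + eps_ij at covariate value x."""
    if random_slope:
        return tau1**2 + 2.0 * tau12 * x + tau2**2 * x**2 + sigma**2
    return tau1**2 + sigma**2        # random intercept only: does not depend on x

for x in (-10.0, 0.0, 10.0):
    print(x, total_residual_var(x), total_residual_var(x, random_slope=False))
# first variance column changes with x; the second stays constant at 13.0
```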

Empirical Bayes prediction (in Stata, after xtmixed: predict reff*, reffects)
We can calculate:
EB predictions, which borrow strength across schools;
MLEs, which do NOT borrow strength across schools.
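For the random-intercept case, this borrowing of strength has a simple closed form: the EB prediction of a school effect is the ML prediction scaled by a reliability factor that grows with cluster size. A sketch with invented values for tau (between-school SD), sigma (within-school SD), and the school sizes:

```python
tau, sigma = 3.0, 8.0                     # invented between- and within-school SDs

def reliability(n_j):
    """Shrinkage factor R_j = tau^2 / (tau^2 + sigma^2 / n_j) for a school of size n_j."""
    return tau**2 / (tau**2 + sigma**2 / n_j)

def eb_prediction(zeta_ml, n_j):
    """EB prediction = reliability * ML prediction of the school effect."""
    return reliability(n_j) * zeta_ml

for n_j in (2, 10, 100):
    print(n_j, round(reliability(n_j), 3))
# small schools are shrunk strongly toward zero; large schools hardly at all
```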

Results (Est, SE) for the random-intercept model and the random-intercept-and-slope model:
Fixed part: _cons, lrt.
Random part, xtmixed parameterization: Tau_11, Tau_22, Rho_12 (correlation between the random effects), Sigma.
Random part, gllamm parameterization: Tau_11² (between-schools variance), Tau_22², Tau_12, Sigma² (within-school variance).

Fig 3.9: Scatter plot of EB versus ML estimates

Fig 3.10: EB predictions of school-specific lines

Random Intercept EB estimates and ranking (Fig 3.11)

Growth-curve modelling (asian.dta)
Weight was measured for each child on up to four occasions: at 6 weeks, and then at 8, 12, and 27 months.
Goal: investigate the growth trajectories of children's weights as they get older. Both the shape of the trajectories and the degree of variability are of interest.

Fig 3.12: Observed growth trajectories for boys and girls

What do we see in Fig 3.12?
Growth trajectories are not linear; we will model this by including a quadratic term for age.
Some children are consistently heavier than others, so a random intercept appears to be warranted.

Quadratic growth model with random intercept and random slope:
y_ij = β1 + β2 t_ij + β3 t_ij² + ζ1j + ζ2j t_ij + ε_ij
Fixed effects: β1, β2, β3. Random effects: ζ1j, ζ2j, multivariate normal with means 0, standard deviations τ11 and τ22, and covariance τ12.
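To see why the quadratic term is needed, one can simulate trajectories with curvature and compare pooled linear and quadratic fits. A sketch with invented parameters, loosely mimicking the asian.dta ages of 6 weeks and 8, 12, and 27 months:

```python
import numpy as np

rng = np.random.default_rng(1)
age = np.array([6/52, 8/12, 12/12, 27/12])         # occasions in years
beta = (3.5, 7.0, -1.5)                            # intercept, linear, quadratic (invented)
n_children = 50

zeta0 = rng.normal(0.0, 0.8, n_children)           # random intercept per child
y = (beta[0] + zeta0[:, None] + beta[1] * age + beta[2] * age**2
     + rng.normal(0.0, 0.3, (n_children, age.size)))

# Pooled fits (ignoring clustering) just to compare curvature
t = np.tile(age, n_children)
lin = np.polyfit(t, y.ravel(), 1)
quad = np.polyfit(t, y.ravel(), 2)

def sse(coef):
    return float(np.sum((y.ravel() - np.polyval(coef, t)) ** 2))

print(sse(quad) < sse(lin))     # True: the quadratic fit is strictly better here
print(quad[0])                  # recovers a clearly negative curvature
```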

Results (Est, SE) for the quadratic growth random-effects model, random intercept vs. random intercept and slope:
Fixed part: _cons, age, age².
Random part: Tau_11 (random-intercept standard deviation), Tau_22, Rho_12 (correlation between the baseline and linear random effects), Sigma (level-1 residual standard deviation).

Two-stage model formulation
Stage 1 (within child): Y_ij = β_0i + β_1i t_ij + β_2i t_ij² + ε_ij
Stage 2 (between children): the child-specific coefficients decompose into fixed effects (β0, β1, β2) plus random effects (b_0i, b_1i), e.g. β_0i = β0 + b_0i and β_1i = β1 + b_1i.

Results (Est, SE) from the random-intercept-and-slope model, with and without inclusion of the gender effect:
Fixed part: _cons, age, age², girl, girl×age.
Random part: Tau_11, Tau_22, Rho_12, Sigma.