Biostatistics-Lecture 10: Linear Mixed Models. Ruibin Xi, Peking University School of Mathematical Sciences.


An example
Zuur et al. (2007) measured marine benthic data:
– 9 inter-tidal areas were measured
– 5 samples were taken in each area
Question: is species richness related to
– NAP: the height of the sampling station compared to the mean tidal level
– Exposure: an index composed of wave action, length of the surf zone, slope, grain size, and the depth of the anaerobic layer

A first model
We may first try an ordinary linear model relating species richness to NAP and exposure.
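As a sketch (the notation here may differ from the slides), with R_ij the species richness of sample j on beach i, this first model can be written as:

```latex
% First (fixed-effects only) model, sketch:
% R_{ij} = species richness of sample j on beach i.
R_{ij} = \alpha + \beta_1 \,\mathrm{NAP}_{ij} + \beta_2 \,\mathrm{Exposure}_{i} + \varepsilon_{ij},
\qquad \varepsilon_{ij} \sim N(0, \sigma^2) \text{ i.i.d.}
```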

2-stage analysis method
1st stage: model species richness as a function of NAP separately for each beach; the per-beach regression coefficients are estimated by least squares:
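A sketch of the first stage, with X_i the per-beach design matrix (intercept and NAP) and Y_i the richness values for that beach:

```latex
% Stage 1, sketch: a separate regression for each beach i = 1, ..., 9,
% fitted by ordinary least squares to that beach's 5 samples.
R_{ij} = \alpha_i + \beta_i \,\mathrm{NAP}_{ij} + \varepsilon_{ij},
\qquad
\hat{\boldsymbol{\beta}}_i = (X_i' X_i)^{-1} X_i' Y_i .
```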

2-stage analysis method
2nd stage: the estimated regression coefficients are modeled as a function of exposure:
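A sketch of the second stage, here written for the estimated NAP slopes, with a beach-level error term b_i:

```latex
% Stage 2, sketch: the per-beach slope estimates are regressed on the
% beach-level exposure index.
\hat{\beta}_i = \eta + \tau \,\mathrm{Exposure}_i + b_i ,
\qquad b_i \sim N(0, \sigma_b^2).
```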

2-stage analysis method
Disadvantages:
– All data from a beach are summarized by a single parameter.
– In the 2nd stage we analyze the estimated regression parameters, not the observed data (we are no longer modeling the variable of interest).
– The number of observations used to compute each summary statistic is not used in the 2nd stage.

Linear mixed effect model
The model combines a fixed effect term and a random effect term:
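In the standard notation (a sketch; symbols may differ from the slides), for group (beach) i:

```latex
% General linear mixed model, sketch:
% X_i * beta  : fixed-effect term,   Z_i * b_i : random-effect term.
Y_i = X_i \boldsymbol{\beta} + Z_i \mathbf{b}_i + \boldsymbol{\varepsilon}_i ,
\qquad
\mathbf{b}_i \sim N(\mathbf{0}, D), \quad
\boldsymbol{\varepsilon}_i \sim N(\mathbf{0}, \Sigma_i),
\quad \mathbf{b}_i \text{ independent of } \boldsymbol{\varepsilon}_i .
```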

The random intercept model
Model the richness as a linear function of NAP
– The intercept is allowed to change per beach
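A sketch of this model, with one random intercept b_i per beach:

```latex
% Random intercept model, sketch.
R_{ij} = \alpha + \beta \,\mathrm{NAP}_{ij} + b_i + \varepsilon_{ij},
\qquad b_i \sim N(0, \sigma_b^2), \quad \varepsilon_{ij} \sim N(0, \sigma^2).
```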

The random intercept model

The random intercept and slope model
The model allows both the intercept and the NAP slope to vary by beach:
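A sketch of this model, with a bivariate random effect per beach:

```latex
% Random intercept and slope model, sketch: intercept and NAP slope vary by beach.
R_{ij} = \alpha + \beta \,\mathrm{NAP}_{ij} + b_{i1} + b_{i2}\,\mathrm{NAP}_{ij} + \varepsilon_{ij},
\qquad
(b_{i1}, b_{i2})' \sim N(\mathbf{0}, D), \quad \varepsilon_{ij} \sim N(0, \sigma^2).
```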

The random intercept and slope model
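A minimal sketch of how the two models above could be fitted in Python with statsmodels. The DataFrame rikz is a hypothetical stand-in for the Zuur et al. (2007) data (synthetic numbers, assumed columns Richness, NAP, Beach); it is not the real data set.

```python
# Sketch only: fitting a random intercept model and a random intercept and
# slope model with statsmodels MixedLM on synthetic stand-in data.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n_beach, n_per = 9, 5                                   # 9 beaches, 5 samples each
beach = np.repeat(np.arange(1, n_beach + 1), n_per)
nap = rng.normal(size=n_beach * n_per)
b = rng.normal(scale=2.0, size=n_beach)                 # synthetic random intercepts
richness = 6.0 - 2.5 * nap + b[beach - 1] + rng.normal(size=n_beach * n_per)
rikz = pd.DataFrame({"Richness": richness, "NAP": nap, "Beach": beach})

# Random intercept model: the intercept varies per beach.
m1 = smf.mixedlm("Richness ~ NAP", data=rikz, groups=rikz["Beach"]).fit(reml=True)

# Random intercept and slope model: the NAP slope also varies per beach.
m2 = smf.mixedlm("Richness ~ NAP", data=rikz, groups=rikz["Beach"],
                 re_formula="~NAP").fit(reml=True)

print(m1.summary())
print(m2.summary())
```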

Induced correlation
The model implies a marginal distribution for Y once the random effects are integrated out; in the random intercept model, this induces a constant correlation between observations from the same beach:
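A sketch of the marginal distribution and the induced (intra-class) correlation:

```latex
% Marginal distribution for group i after integrating out b_i, sketch:
Y_i \sim N\!\big(X_i \boldsymbol{\beta},\; Z_i D Z_i' + \Sigma_i\big).
% Random intercept model with Sigma_i = sigma^2 I: observations on the same
% beach share covariance sigma_b^2, so for j != k
\mathrm{Cov}(R_{ij}, R_{ik}) = \sigma_b^2 ,
\qquad
\mathrm{Corr}(R_{ij}, R_{ik}) = \frac{\sigma_b^2}{\sigma_b^2 + \sigma^2}.
```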

Induced correlation
For the random intercept and slope model, the induced covariance depends on NAP:
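A sketch, writing z_ij = (1, NAP_ij)' and D for the 2x2 covariance matrix of the random intercept and slope:

```latex
% Induced covariance under the random intercept and slope model, sketch:
\mathrm{Cov}(R_{ij}, R_{ik}) = z_{ij}'\, D\, z_{ik} + \sigma^2\,\mathbf{1}\{j = k\}.
```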

The marginal model
If we integrate out b, we get the marginal model for Y. In general, even without random effects, we could directly fit a linear model whose error term has a general correlation matrix; the random intercept model corresponds to a compound symmetric structure:
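A sketch of the marginal model and of the compound symmetric block implied by the random intercept model:

```latex
% Marginal model after integrating out b, sketch:
Y \sim N\!\big(X\boldsymbol{\beta},\; V\big), \qquad V = Z D Z' + \Sigma .
% For the random intercept model, each beach's 5 x 5 covariance block is
% compound symmetric:
V_i =
\begin{pmatrix}
\sigma_b^2+\sigma^2 & \sigma_b^2 & \cdots & \sigma_b^2\\
\sigma_b^2 & \sigma_b^2+\sigma^2 & \cdots & \sigma_b^2\\
\vdots & \vdots & \ddots & \vdots\\
\sigma_b^2 & \sigma_b^2 & \cdots & \sigma_b^2+\sigma^2
\end{pmatrix}.
```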

REML
REML: restricted maximum likelihood estimation. In simple linear regression, the unbiased estimate of the error variance divides the residual sum of squares by n − 2, whereas the MLE divides by n and is therefore biased:
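A sketch of the two estimators for the simple linear regression y_i = β0 + β1 x_i + e_i:

```latex
% Unbiased estimator vs. MLE of the error variance, sketch:
\hat{\sigma}^2_{\text{unbiased}} = \frac{1}{n-2}\sum_{i=1}^{n}\big(y_i-\hat{\beta}_0-\hat{\beta}_1 x_i\big)^2 ,
\qquad
\hat{\sigma}^2_{\text{MLE}} = \frac{1}{n}\sum_{i=1}^{n}\big(y_i-\hat{\beta}_0-\hat{\beta}_1 x_i\big)^2 .
```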

REML
REML works as follows. Written in matrix form, the model is Y = Xβ + ε, and the normality assumption implies that Y is multivariate normal. Take A of dimension n×(n−2) such that A'X = 0 (the columns of A are orthogonal to those of X); the likelihood of A'Y then does not involve β, and maximizing it gives the REML estimate:
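A sketch of this construction, additionally assuming A has orthonormal columns (A'A = I):

```latex
% REML for simple linear regression, sketch. With A'X = 0 and A'A = I,
% the transformed data no longer involve beta:
A'Y \sim N\!\big(\mathbf{0}, \sigma^2 I_{n-2}\big).
% Maximizing the likelihood of A'Y recovers the unbiased estimator:
\hat{\sigma}^2_{\text{REML}} = \frac{\|A'Y\|^2}{n-2} = \frac{\mathrm{RSS}}{n-2}.
```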

REML
For the linear mixed model, putting all observations together gives Y = Xβ + Zb + ε with marginal covariance V. Similarly, we can take A with A'X = 0; thus the likelihood of A'Y does not involve β, and maximizing it yields the REML estimates of the variance components:
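A sketch of REML for the mixed model, with θ denoting the variance parameters in D and Σ:

```latex
% REML for the linear mixed model, sketch. Stacking all groups:
Y = X\boldsymbol{\beta} + Z\mathbf{b} + \boldsymbol{\varepsilon},
\qquad
Y \sim N\!\big(X\boldsymbol{\beta},\; V(\theta)\big), \quad V(\theta) = Z D Z' + \Sigma .
% With A'X = 0, A'Y ~ N(0, A'V(theta)A) is free of beta; theta is estimated by
% maximizing this restricted likelihood (constants dropped):
\hat{\theta}_{\text{REML}} = \arg\max_{\theta}\;
-\tfrac{1}{2}\log\det\!\big(A'V(\theta)A\big)
-\tfrac{1}{2}\,Y'A\big(A'V(\theta)A\big)^{-1}A'Y .
```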

REML