Statistics 3: mixed effect models

Install the R library lme4 on your computer:
1. R -> Packages -> Install packages
2. Choose a mirror
3. Choose lme4
4. Load the library with the command library(lme4)

General and generalized linear models assume that error terms are independent:

Y = a + b*X + ε

This assumption can often be violated:
- year effect
- site/block effect
- repeated measures from the same measurement unit, i.e. individual
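A minimal simulated sketch (not from the original slides) of what such a violation looks like when the grouping is ignored; all names and numbers below are made up for illustration:

# Simulate data with a shared site effect, then ignore it in an ordinary lm() fit.
set.seed(1)
site <- factor(rep(1:5, each = 20))        # 5 sites, 20 observations each
x    <- rnorm(100)
y    <- 1 + 0.5 * x +                      # Y = a + b*X ...
       rnorm(5, sd = 2)[site] +            # ... plus a shared deviation per site
       rnorm(100)                          # ... plus independent noise

fit <- lm(y ~ x)                           # site effect ignored
boxplot(resid(fit) ~ site, xlab = "site", ylab = "residual")
# Residuals from the same site cluster together, i.e. they are not independent.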

Regression line: Height ~ Diameter. Tree height and diameter were measured in two forest plots. The residuals are neither independent nor normally distributed.
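A rough sketch of this diagnostic, assuming a data frame trees_df with columns height, diameter and plot (a factor); these names are hypothetical and not from the slides:

fit <- lm(height ~ diameter, data = trees_df)
plot(fitted(fit), resid(fit), col = trees_df$plot,   # colour residuals by plot
     xlab = "fitted height", ylab = "residual")
abline(h = 0, lty = 2)
# Points from the same plot tend to fall on the same side of zero,
# showing that the residuals are correlated within plots.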

Correlation within groups can be accounted for by random effects.

Linear mixed effect model:

Y = a + b1*x1 + … + bn*xn + η + ε

where ε are normally distributed residuals and η is the random effect. Random effects account for potential associations/correlations within groups of observations.

Distinction between fixed and random effects

A random effect is a source of uncontrollable variation among groups:
- A treatment or exposure level is controllable.
- A site or year effect is uncontrollable.
- If the data are collected by several scientists, the differences between them ('human error') can also be accounted for by random effects.

Fixed effects are those we want to do inference about and predict. Random effects are those we just want to 'take care of':
- Random effects organize random noise in an appropriate manner.
- Random effects account for the internal correlation structure of the data.

If the experiment were replicated, the fixed effects would be the same but the random effects would be different.

Fixed or random?

Does one variety of barley grow faster than the other?
- Variety of barley
- Field

Do fish swim faster at high temperature?
- Temperature
- Fish ID

Do migratory birds return earlier if the mean winter temperature is higher?
- Temperature
- Year

Sometimes making the distinction can be difficult.
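As a sketch of the first question, one might treat variety as the fixed effect and field as the random grouping factor; the data frame barley_df and its columns growth, variety and field are hypothetical names, not from the slides:

library(lme4)
fit <- lmer(growth ~ variety + (1 | field), data = barley_df)
summary(fit)   # fixed effect: variety; random intercept for each field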

R syntax for random effects

A linear mixed effect model can be fitted with

library(nlme)
lme(y ~ x + z, random = ~1 | 'random variable', data)

library(lme4)
lmer(y ~ x + z + (1 | 'random variable'), data)

Generalized linear mixed effect models can be fitted with

library(MASS)
glmmPQL(y ~ x + z, random = ~1 | 'random variable', data, family)

library(lme4)
glmer(y ~ x + z + (1 | 'random variable'), data, family)

Here 'random variable' stands for the grouping factor in your data.

DEMO 1
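A runnable example (not part of the original slides) using the sleepstudy and cbpp data sets shipped with lme4:

library(lme4)

# Linear mixed model: reaction time over days of sleep deprivation,
# with a random intercept for each subject.
fm1 <- lmer(Reaction ~ Days + (1 | Subject), data = sleepstudy)
summary(fm1)

# Generalized linear mixed model: binomial disease incidence in cattle herds,
# with a random intercept for each herd.
gm1 <- glmer(cbind(incidence, size - incidence) ~ period + (1 | herd),
             data = cbpp, family = binomial)
summary(gm1)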

Nested random effects

Random effects are sometimes nested:
- Different error terms for different levels in the design.

Example: plant growth measured in an experiment conducted in three glasshouses, each containing six growth plates.
- Plate nested in glasshouse
[Diagram: growth plates nested within Glasshouse 1, Glasshouse 2 and Glasshouse 3]
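A hedged sketch of fitting this design (using the nested random-effect syntax explained on the next slide); the data frame growth_df and its columns growth, treatment, glasshouse and plate are assumed names, not from the slides:

library(nlme)
fit1 <- lme(growth ~ treatment, random = ~1 | glasshouse/plate, data = growth_df)

library(lme4)
fit2 <- lmer(growth ~ treatment + (1 | glasshouse/plate), data = growth_df)
# (1 | glasshouse/plate) expands to (1 | glasshouse) + (1 | glasshouse:plate),
# i.e. a glasshouse effect plus a plate-within-glasshouse effect.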

More R syntax

Simple random effects: random = ~1 -> each group has its own intercept, e.g. ~1 | ID

Nested random effects: random = ~1 | y/x -> random intercept for x nested in y, e.g. ~1 | glasshouse/plate
- Glasshouse effect
- Plate effect within each glasshouse

Exercises