Questions From Yesterday

Questions From Yesterday
- Equation 2 (the r-to-z transform) is correct, and is comparable to other p-value estimates (z = r·sqrt[n])
- ANOVA will not be able to detect a group effect that has alternating + and – signs across groups
- ICC: the group effect is defined in terms of between- and within-group variability rather than being represented individually for each group
- SPSS Advanced Models can be ordered at the VU Bookstore for $51

Hierarchical Linear Modeling (HLM)
- Theoretical introduction
- Introduction to HLM
- HLM equations
- HLM interpretation of your data sets
- Building an HLM model
- Demonstration of HLM software
- Personal experience with HLM tutorial

General Information and Terminology
- HLM can be used on data with many levels, but we will only consider 2-level models
- The lowest level of analysis is Level 1 (L1), the second lowest is Level 2 (L2), and so on
- In group research, Level 1 corresponds to the individual level and Level 2 corresponds to the group level
- Your DV has to be at the lowest level

When Should You Use HLM?
- If you have mixed variables
- If you have different numbers of observations per group
- If you think a regression relationship varies by group
- Any time your data have multiple levels

What Does HLM Do?
- Fits a regression equation at the individual level
- Lets the parameters of the regression equation vary by group membership
- Uses group-level variables to explain variation in the individual-level parameters
- Allows you to test for main effects and interactions within and between levels

The Level 1 Regression Equation
- Predicts the value of your DV from the values of your L1 IVs (this example uses 2)
- The equation has the general form Yij = B0j + B1j*X1ij + B2j*X2ij + rij
- "i" refers to the person number and "j" refers to the group number
- Since the coefficients B0, B1, and B2 change from group to group, they have variability that we can try to explain
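The Level 1 equation above can be sketched numerically. This is a minimal simulation for one group j, with all coefficient values invented for illustration:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical L1 coefficients for one group j (values invented for illustration)
B0j, B1j, B2j = 2.0, 0.5, -0.3

# L1 predictors and residuals for n persons i within group j
n = 5
X1 = rng.normal(size=n)              # X1ij
X2 = rng.normal(size=n)              # X2ij
r = rng.normal(scale=0.1, size=n)    # person-level residual rij

# Level 1 equation: Yij = B0j + B1j*X1ij + B2j*X2ij + rij
Y = B0j + B1j * X1 + B2j * X2 + r
```

In a real HLM analysis the B coefficients are not fixed numbers like this: each group j gets its own B0j, B1j, and B2j, which is what the Level 2 equations then model.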

Level 2 Equations
- Predict the values of the L1 parameters using the values of your L2 IVs (this example uses 1)
- Sample equations:
  B0j = G00 + G01*W1j + u0j
  B1j = G10 + G11*W1j + u1j
  B2j = G20 + G21*W1j + u2j
- You will have a separate equation for each L1 parameter

Combined Model
- We can substitute the L2 equations into the L1 equation to see the combined model:
  Yij = G00 + G01*W1j + u0j + (G10 + G11*W1j + u1j)*X1ij + (G20 + G21*W1j + u2j)*X2ij + rij
- This cannot be estimated using normal regression
- HLM estimates the random factors in the model with maximum likelihood (MLE) and the fixed factors with least squares (LSE)
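The substitution can be verified numerically: computing Yij in two steps (L2 equations first, then the L1 equation) gives the same value as the single combined equation. All G, W, and X values below are invented for illustration:

```python
import numpy as np

rng = np.random.default_rng(1)

# Invented fixed effects and one L2 predictor value for group j
G00, G01, G10, G11, G20, G21 = 1.0, 0.4, 0.8, -0.2, 0.3, 0.1
W1 = 2.0                                      # W1j for this group
u0, u1, u2 = rng.normal(scale=0.05, size=3)   # group-level random effects

# Level 2: build the group's L1 coefficients
B0 = G00 + G01 * W1 + u0
B1 = G10 + G11 * W1 + u1
B2 = G20 + G21 * W1 + u2

# Level 1 for one person in the group
X1, X2, r = 0.7, -1.2, 0.02
Y_two_step = B0 + B1 * X1 + B2 * X2 + r

# Combined model: the same prediction written as one equation
Y_combined = (G00 + G01 * W1 + u0
              + (G10 + G11 * W1 + u1) * X1
              + (G20 + G21 * W1 + u2) * X2
              + r)

assert np.isclose(Y_two_step, Y_combined)
```

The two forms are algebraically identical; the combined form just makes explicit why ordinary regression cannot be used, since the error term (u0 + u1*X1 + u2*X2 + r) is no longer a single independent residual.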

Centering
- The L1 regression equation: Yij = B0j + B1j*X1ij + B2j*X2ij + rij
- B0j tells us the value of Yij when X1ij = 0 and X2ij = 0
- The interpretation of B0j therefore depends on the scale of X1ij and X2ij
- "Centering" refers to subtracting a value from an X to make the 0 point meaningful

Centering (continued)
- If you center the Xs on their group mean (GPM), then B0j represents the group mean on Yij
- If you center the Xs on the grand mean (GRM), then B0j represents the group mean on Yij adjusted for the group's average value on the Xs
- You can also center an X on a meaningful fixed value
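The two centering choices can be sketched on a toy data set (numbers invented) with persons nested in two groups:

```python
import numpy as np

# Toy data: X scores for persons nested in two groups (invented numbers)
X = np.array([1.0, 2.0, 3.0, 10.0, 11.0, 12.0])
group = np.array([0, 0, 0, 1, 1, 1])

# Grand-mean centering (GRM): subtract the overall mean from every score
X_grm = X - X.mean()

# Group-mean centering (GPM): subtract each person's own group mean
group_means = np.array([X[group == g].mean() for g in np.unique(group)])
X_gpm = X - group_means[group]
```

Note the difference: after GPM centering, every group's X mean is exactly 0, so all between-group variation in X is removed; after GRM centering, the groups still differ in their average X, which is why B0j then reflects a group mean adjusted for the group's standing on the Xs.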

Estimating the Model
- After you specify the L1 and L2 equations, you need to estimate your parameters
- We can examine the within- and between-group variability of the L1 parameters to estimate the reliability of the analysis
- We examine the estimates of the L2 parameters to test theoretical factors

Interpreting Level 2 Intercept Parameters
- The L2 intercept equation: B0j = G00 + G01*W1j + u0j
- G00 is the average intercept across groups
- If the Xs are GPM centered, G01 is the relationship between W1 and the group mean (the main effect of W1)
- If the Xs are GRM centered, G01 is the relationship between W1 and the adjusted group mean
- u0j is the unaccounted-for group effect

Interpreting Level 2 Slope Parameters
- The L2 slope equation: B1j = G10 + G11*W1j + u1j
- G10 is the average slope (the main effect of X)
- G11 is the relationship between W1 and the slope (the interaction between X and W)
- u1j is the unaccounted-for group effect
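The interpretation of G10 and G11 can be illustrated with a simulation in the spirit of the L2 slope equation (this two-step "fit each group, then regress the slopes on W" procedure is only a didactic sketch, not how HLM actually estimates the model; all parameter values are invented):

```python
import numpy as np

rng = np.random.default_rng(2)
G10, G11 = 0.8, -0.2         # invented "true" average slope and X-by-W interaction

J, n = 200, 50               # number of groups, persons per group
W = rng.normal(size=J)       # L2 predictor W1j, one value per group
slopes = np.empty(J)

for j in range(J):
    B1j = G10 + G11 * W[j] + rng.normal(scale=0.05)    # L2 slope equation
    X = rng.normal(size=n)
    Y = 1.0 + B1j * X + rng.normal(scale=0.1, size=n)  # L1 equation
    slopes[j] = np.polyfit(X, Y, 1)[0]                 # per-group OLS slope

# Regress the group slopes on W: intercept ~ G10, coefficient ~ G11
G11_hat, G10_hat = np.polyfit(W, slopes, 1)
```

Because the slope B1j depends on the group's W value, the effect of X on Y differs across groups; that is exactly what a significant G11 (a cross-level interaction) means.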

Building an HLM Model
- Start by fitting a random coefficient model: all L1 variables are included, and the L2 equations have only an intercept and an error term
- Examine the L2 output for each parameter
- If there is no random effect, then the parameter does not vary by group
- If there is no random effect and no intercept, then the parameter is not needed in the model

Building an HLM Model (continued)
- Build the full intercepts- and slopes-as-outcomes model
- Use L2 predictor variables to explain variability in the parameters that have group effects
- Remove L2 predictors from equations where they are unable to explain a significant amount of variability