Linear models: Predictions


Linear models OUTLINE:
- Predictions
- Contrasts (differences between predictions)
- Confidence intervals of predictions and contrasts
- Matrix notation
- Exercises, then theory, then back to the exercises

What are linear models?

All linear models can be written in the form $Y = X\beta + \varepsilon$; the key point is that the parameters enter the model linearly.

[Figure: growth rate plotted against temperature; panels show a single fitted line, then parallel and separate lines for males and females, illustrating increasingly complex linear models.]

Non-linear models often have linear components: a transformation of the "super parameters" can be subject to linear constraints (e.g. occupancy models).
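As a minimal sketch (simulated data; the variable names growth, temperature and sex are illustrative assumptions, not from the slides), the same lm() machinery covers all three situations in the figure:

```r
set.seed(1)
dat <- data.frame(
  temperature = runif(40, 5, 30),
  sex         = rep(c("female", "male"), each = 20)
)
dat$growth <- 0.2 + 0.05 * dat$temperature +
  0.3 * (dat$sex == "male") + rnorm(40, sd = 0.1)

m1 <- lm(growth ~ temperature, data = dat)         # one common line
m2 <- lm(growth ~ temperature + sex, data = dat)   # parallel lines for the sexes
m3 <- lm(growth ~ temperature * sex, data = dat)   # separate slopes per sex
coef(m3)                                           # all parameters enter linearly
```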

Some basic notation

[Figure: scatter plot of Y against X, with one observation labelled (x_i, y_i).]

Obs. (i)   X    Y
1          x1   y1
2          x2   y2
3          x3   y3
4          x4   y4
5          x5   y5

Exercises! Matrix multiplication:
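The exercise matrices from the slide are not reproduced here, but as a quick reminder, matrix multiplication in R uses the %*% operator; the matrices below are arbitrary examples:

```r
A <- matrix(1:6, nrow = 2)        # a 2 x 3 matrix, filled column by column
b <- c(1, 0, 2)                   # a length-3 vector, treated as a column

A %*% b                           # matrix product: a 2 x 1 result
t(A) %*% A                        # 3 x 3 matrix of column cross-products
```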

All linear models can be written as $Y = X\beta + \varepsilon$.

Example: a growth experiment on tomato plants.
Response variable (Y): 'Height' at the end of the experiment.
Predictor variables: 'Light' (three levels: low, medium, high) and 'Soil' type (A or B).

Data (two plants per Light x Soil combination):

Light   Soil   Height
Low     A      y1, y2
Low     B      y3, y4
Med     A      y5, y6
Med     B      y7, y8
High    A      y9, y10
High    B      y11, y12

The design matrix X is given by model.matrix( y ~ light + soil + light:soil ); its columns (intercept, Light_med, Light_high, Soil_B, Light_med & Soil_B, ...) hold 0/1 indicators for each observation.
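A minimal sketch of how this design matrix could be built in R; the data frame below only reproduces the factor structure of the table, and the responses y1,...,y12 are not needed to construct X:

```r
tomato <- data.frame(
  light = factor(rep(c("low", "med", "high"), each = 4),
                 levels = c("low", "med", "high")),
  soil  = factor(rep(c("A", "A", "B", "B"), times = 3))
)

# Design matrix with main effects and the light-by-soil interaction
# (treatment/dummy coding, as used by model.matrix by default)
X <- model.matrix(~ light + soil + light:soil, data = tomato)
X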

model.matrix( y ~ light + soil + light:soil )

[Figure: the Light x Soil data table and design matrix columns (intercept, Light_med, Light_high, Soil_B, Light_med & Soil_B) alongside a plot of Height (Y) for Soil A and Soil B at low, medium, and high light intensity, under the model with the light-by-soil interaction.]

model.matrix( y ~ light + soil )

[Figure: the same data table and design matrix columns alongside a plot of Height (Y) for Soil A and Soil B at low, medium, and high light intensity, now under the additive model without the light-by-soil interaction.]
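For comparison, a sketch of the additive design matrix, which drops the interaction columns (reusing the hypothetical tomato data frame from the sketch above):

```r
X_add <- model.matrix(~ light + soil, data = tomato)
colnames(X_add)   # "(Intercept)" "lightmed" "lighthigh" "soilB"
```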


How do we get standard errors of the predictions?

For the tomato data and the interaction model:

X = model.matrix(y ~ light + soil + light:soil)
beta = coef(lm(y ~ light + soil + light:soil))

Fitted predictions: $\hat{Y} = X\hat{\beta}$.

[Figure: the Light x Soil data table and design matrix columns alongside a plot of the fitted Height (Y) at low, medium, and high light intensity.]
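A minimal sketch, continuing the hypothetical tomato data frame and adding placeholder heights, showing that multiplying the design matrix by the estimated coefficients reproduces the fitted values from lm():

```r
set.seed(2)
tomato$height <- rnorm(12, mean = 20, sd = 2)   # placeholder response values

fit  <- lm(height ~ light + soil + light:soil, data = tomato)
X    <- model.matrix(fit)
beta <- coef(fit)

yhat <- X %*% beta                                   # fitted predictions, Y-hat = X beta-hat
all.equal(as.numeric(yhat), unname(fitted(fit)))     # should be TRUE
```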

How do we get standard errors of the predictions?

Two parameters: $\operatorname{Var}(a_1\hat\beta_1 + a_2\hat\beta_2) = a_1^2\operatorname{Var}(\hat\beta_1) + a_2^2\operatorname{Var}(\hat\beta_2) + 2a_1a_2\operatorname{Cov}(\hat\beta_1,\hat\beta_2)$

Three parameters: the same idea, now with three variance terms and all three pairwise covariance terms.

Any number of parameters: for a linear combination $x^\top\hat\beta$, $\operatorname{Var}(x^\top\hat\beta) = x^\top \Sigma\, x$, where $\Sigma = \operatorname{Cov}(\hat\beta)$ is the variance-covariance matrix of the estimated coefficients.

Approximation for non-linear functions (delta method): for $g(\hat\beta)$, replace the vector $x$ with the gradient $\nabla g(\hat\beta)$, giving $\operatorname{Var}\!\big(g(\hat\beta)\big) \approx \nabla g(\hat\beta)^\top\, \Sigma\, \nabla g(\hat\beta)$.

Variance-covariance matrix of the fitted predictions: $\operatorname{Cov}(X\hat\beta) = X \Sigma X^\top$.

Variances of the fitted predictions: the diagonal of $X \Sigma X^\top$; their square roots are the standard errors.
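A minimal sketch of how these formulas translate into R, continuing the fit and X objects from the previous sketch; predict(..., se.fit = TRUE) is used only as a cross-check:

```r
Sigma   <- vcov(fit)                  # variance-covariance matrix of the coefficients
V_pred  <- X %*% Sigma %*% t(X)       # Cov(X beta-hat) = X Sigma X'
se_pred <- sqrt(diag(V_pred))         # standard errors of the fitted predictions

# Cross-check against R's built-in computation
all.equal(unname(se_pred), unname(predict(fit, se.fit = TRUE)$se.fit))
```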