Continuous Outcome, Dependent Variable (Y-Axis) Child’s Height

Presentation transcript:

Variables and Plots

  Variable         Role                         Type         Plot
  Child's Height   Outcome, dependent (Y-axis)  Continuous   Histogram
  Parents' Height  Predictor (X-axis)           Continuous   Scatter
  Gender           Predictor                    Categorical  Boxplot

Regression model: Linear Regression

Correlation

Correlation Matrix

Analytics & History: The First "Regression Line"
http://galton.org/cgi-bin/searchImages/search/pearson/vol3a/pages/vol3a_0019.htm

Describing a Straight Line
The regression line is Y = b0 + b1*X.
b1: the regression coefficient for the predictor; the gradient (slope) of the regression line, reflecting the direction and strength of the relationship.
b0: the intercept (the value of Y when X = 0); the point at which the regression line crosses the Y-axis (ordinate).
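The intercept and slope of a fitted line can be read off with coef(). A minimal sketch on simulated data (the variable names and numbers here are illustrative, not from the slides):

```r
# b0 and b1 of a fitted regression line, recovered with coef()
set.seed(3)
x <- rnorm(50)
y <- 2 + 0.5 * x + rnorm(50, sd = 0.1)   # true intercept 2, true slope 0.5

b <- coef(lm(y ~ x))
b[["(Intercept)"]]  # b0: value of y when x = 0 (close to 2)
b[["x"]]            # b1: gradient (slope) of the line (close to 0.5)
```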

Which line fits the best?

Sum of Squares
Total sum of squares (SST)
Model sum of squares (SSM)
Residual sum of squares (SSR)
These feed the F statistic and R2.

Sum of Squares
SST: total variability (variability between the scores and the mean).
SSR: residual/error variability (variability between the regression model and the actual data).
SSM: model variability (the difference in variability between the model and the mean).

Testing the Model: ANOVA
SST: total variance in the data. SSM: improvement due to the model. SSR: error in the model. If the model results in better prediction than using the mean, we expect SSM to be much greater than SSR.
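The SST = SSM + SSR decomposition, and the F ratio built from it, can be checked directly in R. A sketch on simulated data (the names mimic the heights example, but the numbers are made up):

```r
# Decomposing total variability into model + residual parts
set.seed(1)
father <- rnorm(100, mean = 69, sd = 2.5)               # hypothetical fathers' heights
childHeight <- 40 + 0.4 * father + rnorm(100, sd = 3)   # hypothetical children's heights

model <- lm(childHeight ~ father)

SST <- sum((childHeight - mean(childHeight))^2)  # total variability around the mean
SSR <- sum(residuals(model)^2)                   # variability left over after the model
SSM <- SST - SSR                                 # improvement due to the model

# F compares model variance to residual variance (df = 1 and n - 2 here)
F_stat <- (SSM / 1) / (SSR / (100 - 2))
all.equal(F_stat, unname(summary(model)$fstatistic["value"]))  # TRUE
```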

Linear Model - Regression
The lm() function fits a linear model ("lm" stands for "linear model"):
model <- lm(outcome ~ predictor(s), data = dataFrame, na.action = action)
model.1 <- lm(childHeight ~ father, data = heights)

Correlation

Model 1

Testing the Model: R2
R2 is the proportion of variance accounted for by the regression model. For bivariate regression it equals the Pearson correlation coefficient squared.
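That R2 equals r squared for a single predictor can be verified directly. A sketch on simulated data (the heights frame here is made up, standing in for the slides' data):

```r
# Check that R^2 equals r^2 in simple regression
set.seed(42)
heights <- data.frame(father = rnorm(200, mean = 69, sd = 2.5))
heights$childHeight <- 40 + 0.4 * heights$father + rnorm(200, sd = 3)

model.1 <- lm(childHeight ~ father, data = heights)
r <- cor(heights$father, heights$childHeight)

all.equal(summary(model.1)$r.squared, r^2)  # TRUE
```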

Residuals

Prediction
predict(model.1) returns the model's predicted (fitted) value for each observation:
heights$model1 <- predict(model.1)
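predict() also accepts a newdata argument for cases the model has not seen. A sketch with simulated data (numbers illustrative, names following the slides):

```r
# Fitted values for the observed data, plus a prediction for a new case
set.seed(7)
heights <- data.frame(father = rnorm(150, mean = 69, sd = 2.5))
heights$childHeight <- 40 + 0.4 * heights$father + rnorm(150, sd = 3)
model.1 <- lm(childHeight ~ father, data = heights)

heights$model1 <- predict(model.1)                    # fitted value for every row
predict(model.1, newdata = data.frame(father = 72))   # prediction for a new father
```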

Compare Models

  Model  Predictors        Intercept  Father  Mom    midparentHeight  R2
  1      father            40.1       0.385                           0.070
  2      mother            46.6               0.314                   0.0395
  12     father + mother   22.6       0.36    0.29                    0.105
  3      midparentHeight   22.63                     0.637            0.102
  4      midparentHeight   22.64                     0.538            0.1033

r: 0.27 (model 1), 0.20 (model 2), 0.32 (model 12); squaring gives 0.073, 0.04, and 0.102, matching each model's R2.

Box Plot http://web.anglia.ac.uk/numbers/graphsCharts.html

Descriptive Stats: Box Plot

Regression: Child's Height ~ Gender
model.5 <- lm(childHeight ~ gender, data = heights)

Linear Regression Comparison

  Model  Predictors                Intercept  Father  Mom    midparentHeight  Gender  R2
  1      father                    40.1       0.385                                   0.070
  2      mother                    46.6               0.314                           0.0395
  12     father + mother           22.6       0.36    0.29                            0.105
  3      midparentHeight           22.63                     0.637                    0.102
  4      midparentHeight           22.64                     0.538                    0.1033
  5      gender                    64.1                                       5.13    0.5137
  6      midparentHeight + gender                             0.687                   0.632
  7      father + mother + gender  16.5       0.39    0.31                    5.21    0.634

r: 0.27, 0.20, 0.32, and 0.717 for models 1, 2, 12, and 5; in each case r2 matches the model's R2.

Model Specification & Prediction
Outcome = (Model) + Error
Height = 16.5 + 0.39*father + 0.31*mother + 5.21*gender + error
Gender coding: Male = 1, Female = 0
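The fitted equation can be applied by hand. A sketch using the coefficients reported on the comparison slide (mother coefficient 0.31 per the table; gender coded male = 1, female = 0):

```r
# Hand-applying the final model's equation
b0 <- 16.5; b_father <- 0.39; b_mother <- 0.31; b_gender <- 5.21

predict_height <- function(father, mother, gender) {
  b0 + b_father * father + b_mother * mother + b_gender * gender
}

predict_height(father = 70, mother = 64, gender = 1)  # 68.85 inches
```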