G89.2228 Lecture 10a: Revisited example (Okazaki's inferences from a survey), inferences on correlation, correlation power and effect size


Slide 1: G89.2228 Lecture 10a
- Revisited example: Okazaki's inferences from a survey
- Inferences on correlation
- Correlation: power and effect size
- Regression: expected Y given X
- Inference on regression
- Return to example

Slide 2: Example: Okazaki's inferences from a survey
- Does self-construal account for the relation of adverse functioning with Asian status?
- Survey of 348 students
- Self-reported Interdependence was correlated .53 with self-reported Fear of Negative Evaluation
- Illustrative plot (simulated) of r = .53

Slide 3: Review of correlation definitions
- In a population with variables X and Y, the correlation is ρ_XY = Cov(X, Y) / (σ_X σ_Y).
- If we have a sample from the population, we can calculate the product-moment estimate: r_XY = Σ(X_i − X̄)(Y_i − Ȳ) / [(N − 1) S_X S_Y]
- To estimate the population value, the (X, Y) pairs should be representative of the population.
- The sampling distribution of r_XY is not simple: the standard error of r actually depends on knowing ρ itself.
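The product-moment estimate above can be computed directly from paired data. A minimal sketch using only the standard library (the function name `pearson_r` and the toy data are illustrative, not from the lecture); note that dividing the cross-product sum by the square root of the two sums of squares is algebraically the same as the (N − 1) S_X S_Y form:

```python
# Sketch: sample product-moment correlation r_XY from paired (X, Y) data.
import math

def pearson_r(xs, ys):
    """Sample product-moment correlation of the paired lists xs, ys."""
    n = len(xs)
    mx = sum(xs) / n
    my = sum(ys) / n
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))  # cross-products
    sxx = sum((x - mx) ** 2 for x in xs)                    # SS for X
    syy = sum((y - my) ** 2 for y in ys)                    # SS for Y
    return sxy / math.sqrt(sxx * syy)

# Perfectly linear data should give r = 1; a perfectly decreasing line, r = -1.
print(pearson_r([1, 2, 3, 4], [2, 4, 6, 8]))  # -> 1.0
print(pearson_r([1, 2, 3], [3, 2, 1]))        # -> -1.0
```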

Slide 4: Inferences on correlation
- Testing H0: ρ = 0 when either X or Y is normally distributed:
  - A statistic that can be justified from a regression approach is t = r √(N − 2) / √(1 − r²), on N − 2 degrees of freedom.
  - We usually do not compute a standard error for r, because it depends on ρ itself.
- For other inferences on one or more correlations, we use Fisher's so-called z transformation: z′ = (1/2) ln[(1 + r) / (1 − r)]
- The standard error of z′ is 1 / √(N − 3).
- Howell shows how confidence intervals, and comparisons of correlations from independent samples, can be computed using z′.

Slide 5: Example: Okazaki's correlation
- Test of H0: ρ = 0, with r = .53 and N = 348: t = .53 √346 / √(1 − .53²) = 11.63. The null hypothesis is rejected.
- Confidence interval for ρ:
  - Compute z′ = (1/2) ln(1.53/.47) = .590, with standard error 1/√345 = .0538
  - Compute the confidence interval on the z′ scale: .590 ± (1.96)(.0538) = (.485, .696)
  - Transform back using r = (e^{2z′} − 1) / (e^{2z′} + 1), giving (.45, .60)
- Note that the resulting confidence interval is asymmetric around r = .53.
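The t-test and Fisher-z confidence interval for this example can be reproduced in a few lines. A sketch with the slide's values r = .53 and N = 348 (the inverse transformation (e^{2z} − 1)/(e^{2z} + 1) is just tanh(z)):

```python
# Sketch: t-test of H0: rho = 0 and Fisher-z CI for Okazaki's r = .53, N = 348.
import math

def fisher_z(r):
    """Fisher's z transformation of a correlation."""
    return 0.5 * math.log((1 + r) / (1 - r))

r, n = 0.53, 348
t = r * math.sqrt(n - 2) / math.sqrt(1 - r**2)  # ~11.63 on 346 df
z = fisher_z(r)                                  # ~0.590
se = 1 / math.sqrt(n - 3)                        # ~0.0538
lo = math.tanh(z - 1.96 * se)                    # back-transform CI endpoints
hi = math.tanh(z + 1.96 * se)
print(round(t, 2), round(lo, 2), round(hi, 2))   # -> 11.63 0.45 0.6
```

The back-transformed interval (.45, .60) is visibly asymmetric around .53, which is the point of using the z′ scale.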

Slide 6: Correlation: power and effect size
- Cohen's rule of thumb for correlation effect sizes (both for r itself and for differences in Fisher's z transformation) is: small = .1, medium = .3, large = .5
- Example (Okazaki, continued): N = 348 gives 97% power to detect ρ = .20 with a two-tailed test, α = .05. If ρ = .10, this N would give only 47% power.
- The Power and Precision program and Howell's approximate method give similar results.
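The slide's power figures came from the Power and Precision program; a Fisher-z approximation (Howell's approximate method) lands close to, but not exactly on, the quoted 97% and 47%. A hedged sketch:

```python
# Sketch: approximate power for a two-tailed test of H0: rho = 0,
# using Fisher's z (an approximation; the slide's exact figures were 97%/47%).
import math
from statistics import NormalDist

nd = NormalDist()

def power_corr(rho, n, alpha=0.05):
    """Approximate power via z' ~ Normal(fisher_z(rho), 1/(n-3))."""
    z_rho = 0.5 * math.log((1 + rho) / (1 - rho))  # Fisher's z of the true rho
    z_crit = nd.inv_cdf(1 - alpha / 2)             # 1.96 for alpha = .05
    return nd.cdf(abs(z_rho) * math.sqrt(n - 3) - z_crit)

print(round(power_corr(0.20, 348), 2))  # ~0.96 (slide: 97%)
print(round(power_corr(0.10, 348), 2))  # ~0.46 (slide: 47%)
```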

Slide 7: Regression: expected Y given X
- When Y and X are correlated, the expected value of Y varies with X: E(Y|X) is not constant for different choices of X.
- We could chop up the plot of Y and X and compute separate means of Y for different value ranges of X.
- Often this set of conditional expectations of Y given X can be described by a linear model: E(Y|X) = a* + b*X
- Instead of estimating many means of Y|X, we estimate a* and b*, the Y-intercept and the slope of the line.
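The "chop up the plot" idea can be simulated directly. A sketch with assumed, illustrative data (Y = 0.5 X + noise): binning X to the nearest integer and averaging Y within each bin shows the conditional means rising roughly 0.5 per unit of X, i.e., tracing out the linear E(Y|X):

```python
# Sketch (simulated data): conditional means of Y within unit-wide bins of X.
import random
from collections import defaultdict

random.seed(1)
xs = [random.gauss(0, 1) for _ in range(2000)]
ys = [0.5 * x + random.gauss(0, 1) for x in xs]   # true line: E(Y|X) = 0.5 X

bins = defaultdict(list)
for x, y in zip(xs, ys):
    bins[round(x)].append(y)      # bin label = X rounded to nearest integer

for k in sorted(bins):
    print(k, round(sum(bins[k]) / len(bins[k]), 2))   # bin mean of Y
```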

Slide 8: Regression coefficients as parameters
- If Y and X are known to have a bivariate normal distribution, then the relation between them is known to be linear. The conditional distribution of Y given X is expressed with parameters a* and b*.
- a* and b* may also derive meaning from structural models: Y is assumed to be caused by X. This assumption cannot be tested, but the strength of the causal path under the model can be assessed.
- In some cases, we do not assume that a* and b* have any deep meaning, or that the true relation between Y and X is exactly linear. Instead, linear regression is used as an approximate predictive model.

Slide 9: Estimating regression statistics
- b* and a* can be estimated using ordinary least squares methods. The resulting estimates are: b = Σ(X_i − X̄)(Y_i − Ȳ) / Σ(X_i − X̄)² = r_YX (S_Y / S_X) and a = Ȳ − b X̄
- They minimize the sum of squared residuals, Σ(Y_i − Ŷ_i)², where Ŷ_i = a + b X_i is the predicted value of Y_i.
- The slope of Y regressed on X is not generally the same as the slope of X regressed on Y.
- The constant a* is the expected value of Y when X = 0.
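The two estimating formulas can be sketched as a small function (the name `ols` and the toy data are illustrative, not from the lecture):

```python
# Sketch: ordinary least squares estimates b and a from paired data.
def ols(xs, ys):
    """Return (a, b) minimizing the sum of squared residuals."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    b = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
         / sum((x - mx) ** 2 for x in xs))
    a = my - b * mx        # the fitted line passes through (X-bar, Y-bar)
    return a, b

a, b = ols([0, 1, 2, 3], [1, 3, 5, 7])   # exactly linear data: Y = 1 + 2X
print(a, b)  # -> 1.0 2.0
```

Swapping the roles of the lists illustrates the slide's asymmetry point: the slope of X on Y equals 1/b only when r = ±1, as in this noiseless toy example.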

Slide 10: Inference on regression
- The regression model is Y = a* + b*X + e, where S_YX is the standard deviation of the residuals e.
- The estimates a and b will have normal distributions because of the central limit theorem.
- The standard error of b, s_b = S_YX / [S_X √(N − 1)], is based on N − 2 degrees of freedom.

Slide 11: Inference on regression (continued)
- To test H0: b* = 0, construct a t-test: t = b / s_b, on N − 2 degrees of freedom.
- To construct a 95% CI around the regression parameter, compute b ± t_{.975, N−2} (s_b).
- The t-test will be identical to that for the correlation.
- The CI will be about b*, not ρ, and hence won't correspond to the one for the correlation (calculated using Fisher's z transformation).

Slide 12: Okazaki: predicting Fear of Negative Evaluation from Interdependence
- From the data in her Table 2, we compute:
  - Mean of Interdependence = 4.49; Var(Interdependence) = .65, so S_X = .808
  - Mean of FNE = 38.52; Var(FNE) = 104.08, so S_Y = 10.2
- Compute b and a: b = r_YX (S_Y / S_X) = (.53)(10.2/.81) = 6.69; a = Ȳ − b X̄ = 38.52 − (6.69)(4.49) = 8.48, so Ŷ = 8.48 + 6.69 X + e
- Compute the standard error: S_b = S_YX / [S_X √(N − 1)] = .575
- Test statistic and CI: t(N − 2) = b / S_b = 6.69/.575 = 11.6; CI: b ± (1.96)(S_b) => (5.56, 7.82)
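The slide's arithmetic can be reproduced from the summary statistics alone. A sketch in exact arithmetic (the slide rounded S_Y to 10.2 and S_X to .81 before multiplying, so the last digits differ slightly; S_YX is taken as S_Y √(1 − r²), consistent with the S_b formula on slide 10):

```python
# Sketch: reproducing slide 12's regression numbers from summary statistics.
import math

n, r = 348, 0.53
mean_x, var_x = 4.49, 0.65       # Interdependence
mean_y, var_y = 38.52, 104.08    # Fear of Negative Evaluation

s_x, s_y = math.sqrt(var_x), math.sqrt(var_y)
b = r * s_y / s_x                        # slope, ~6.71 (slide: 6.69)
a = mean_y - b * mean_x                  # intercept, ~8.41
s_yx = s_y * math.sqrt(1 - r**2)         # residual SD
s_b = s_yx / (s_x * math.sqrt(n - 1))    # SE of slope, ~0.576 (slide: .575)
t = b / s_b                              # ~11.6, matching the correlation t-test
print(round(b, 2), round(a, 2), round(s_b, 3), round(t, 1))
```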