Midterm Review Ch 7-8. Requests for Help by Chapter.


Midterm Review Ch 7-8

Requests for Help by Chapter

Chapter 7
- Describe the characteristics of the relationship between two variables.
- Discuss the null and research hypotheses for correlation.
- Discuss using the r table for determining significance.

Chapter 8
- Discuss the steps in making predictions from raw data values.
- Discuss the situations when you cannot use regression.
- Discuss the inappropriateness of predicting outside of the sample range.
- Discuss the null hypothesis in regression.
- Discuss alpha levels and critical values with respect to statistical significance.
- Discuss residual error when predicting from regression.

Describe the characteristics of the relationship between two variables.

Describe the characteristics of the relationship between two variables
- Three dimensions characterize the relationship between two variables: linearity, direction, and strength.

Linearity
- The relationship is either linear or curvilinear.
- In a linear relationship, as scores on one variable increase, scores on the other variable either generally increase or generally decrease.
- In a curvilinear relationship, as scores on one variable increase, scores on the other variable move first in one direction, then in another.

Direction
- The direction of a relationship is either positive or negative.
- In a positive relationship, as scores on one variable increase, scores on the other variable also increase. So the best-fitting line rises from left to right on a graph and therefore has a positive slope.
- In a negative relationship, as scores on one variable increase, scores on the other variable decrease. So the best-fitting line falls from left to right on a graph and therefore has a negative slope.

Strength
- The strength of a correlation indicates how predictable one variable is from another.
- In a strong relationship, t_X and t_Y scores are consistently similar or dissimilar, so you are able to accurately predict one score from another.
- In a weak relationship, t_X and t_Y scores are inconsistent in their similarity or dissimilarity, so you are only able to somewhat predict one score from another.
- In an independent relationship, there is no consistency in the relationship between t_X and t_Y scores, so it is impossible to predict one score from another.
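The three dimensions above can be illustrated with a short computation of Pearson r. This is a generic sketch, not the chapter's own example: the function name and the sample data are mine.

```python
import math

def pearson_r(xs, ys):
    """Pearson r: the covariation of X and Y scaled by their variability."""
    n = len(xs)
    mx = sum(xs) / n
    my = sum(ys) / n
    ss_x = sum((x - mx) ** 2 for x in xs)
    ss_y = sum((y - my) ** 2 for y in ys)
    sp = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    return sp / math.sqrt(ss_x * ss_y)

# A linear, rising relationship gives a strong positive r:
print(pearson_r([1, 2, 3, 4, 5], [2, 4, 5, 4, 6]))   # about 0.85
# A linear, falling relationship gives a strong negative r:
print(pearson_r([1, 2, 3, 4, 5], [10, 8, 7, 8, 4]))  # about -0.87
```

The sign of r reports direction, and its distance from zero reports strength; linearity still has to be checked by inspecting the scatter plot, since r only measures the linear part of the relationship.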

Discuss alpha levels and critical values with respect to statistical significance. Discuss the null and research hypotheses for correlation. Discuss the null hypothesis in regression. Discuss using the r table for determining significance.

Alpha levels and significance
- Scientists are a careful bunch. They are very careful not to make a Type 1 error.
- A Type 1 error is mistakenly saying that the relationship you found in your random sample exists in the population as a whole, when it does not.
- To be careful, scientists will only say there is a relationship if the probability of a Type 1 error is very low (5 in 100 or lower).

Alpha levels and significance (continued)
- These probabilities are called alpha levels.
- The typical alpha levels are p ≤ .05 and p ≤ .01, that is, 5 out of 100 or 1 out of 100.
- A sample r that is far enough from zero to occur 5 or fewer times in 100 when rho actually equals zero is called significant.

The Null Hypothesis
- The null hypothesis (H0) states that a non-zero correlation in a sample between two variables is the result of random sampling fluctuation.
- Therefore, there is no underlying relationship in the population as a whole.
- In mathematical terms, rho = 0.000.

The Alternative Hypothesis
- The alternative hypothesis is the opposite of the null hypothesis.
- It states that there is an underlying relationship in the population as a whole, and that it is reflected by a non-zero correlation in your random sample.

Rejecting the Null Hypothesis
- The purpose of research is to reject the null hypothesis.
- We reject the null hypothesis when the correlation is significant.
- The correlation is significant when the probability that the result is due to sampling error is less than the .05 or .01 alpha level.

Using the r Table to Determine Significance
- First, calculate r.
- Then, determine the degrees of freedom, df = n_p - 2 (the number of pairs of scores minus 2).
- Look in the r table to see if r falls outside the 95% confidence interval (CI.95) in Column 2 of the r table. If it does, r is significant.
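The table lookup can also be mimicked in code, since each critical r in the table is just a critical t rescaled: r_crit = t / sqrt(t² + df). This is a sketch under assumptions: the function names are mine, and the 2.228 below is the standard two-tailed .05 critical t for df = 10, taken from a t table rather than from this chapter.

```python
import math

def critical_r(t_crit, df):
    """Critical r from a two-tailed critical t: r = t / sqrt(t^2 + df)."""
    return t_crit / math.sqrt(t_crit ** 2 + df)

def is_significant(r, r_crit):
    """r is significant when it falls outside the interval -r_crit .. +r_crit."""
    return abs(r) > r_crit

# 2.228 is the two-tailed .05 critical t for df = 10 (from a standard t table)
r_crit = critical_r(2.228, 10)
print(round(r_crit, 3))               # 0.576, matching the df = 10 row of an r table
print(is_significant(0.62, r_crit))   # True: reject H0
print(is_significant(-0.40, r_crit))  # False: retain H0, assume rho = 0
```

Note that the sign of r does not matter for significance; only its distance from zero does, which is why the table's interval is symmetric around zero.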

[r table, not reproduced: each row lists a df value in Column 1 and, in Column 2, the interval of non-significant r values for that df. Find the row for your degrees of freedom; an r inside the interval is non-significant, an r outside it is significant.]

Significant Correlation
- If r is non-significant, we continue to accept the null hypothesis and say that rho = 0.000 in the population.
- If r is significant, we reject the null hypothesis at the .05 or .01 alpha level. We then assume that rho is best estimated by r, the correlation found in the random sample.

Discuss the situations when you cannot use regression. Discuss the inappropriateness of predicting outside of the sample range.

Use Regression Carefully
- When we have the entire population and compute rho:
  - We know all of the values of X and Y.
  - We know the direction and strength of the relationship between the X and Y variables.
- Therefore, we can safely use the regression equation to predict Y from X.
- Even when rho is exactly zero, the regression equation is still right: it tells us to predict that everyone will score at the mean of Y.

Samples and Regression
- When we have only a random sample from a population, we can predict only when:
  - r is significant; otherwise we assume that rho is 0,
  - the relationship between the variables is linear; otherwise it is inappropriate to use correlation at all,
  - the X score is within the range of X scores in the sample, because for values outside that range you do not know whether the linear relationship holds.

Discuss the steps in making predictions from raw data values.

Describe the steps in making predictions from raw data
Scientists are interested in taking the score on one variable and then predicting the score on another variable. Before predicting, you must check three conditions:
1. There is a linear relationship between the two variables.
2. The correlation coefficient is significant.
3. The score you are predicting from is within the original range of scores.
If these conditions are met, you can use the regression equation to predict:
4. Convert the predicting score to a t score.
5. Plug the t score and the correlation value into the regression equation.
6. Solve the regression equation for the predicted t score.
7. Convert the predicted t score into the predicted raw score.
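The steps above can be sketched in code. This is a minimal illustration rather than the chapter's worked example: the function name and sample data are mine, and the use of n - 1 in the standard deviation is an assumption.

```python
import math

def predict_y(x, xs, ys, r):
    """Predict a raw Y from a raw X via t (standardized) scores: t_Y' = r * t_X.
    Assumes r has already been found significant for this sample."""
    if not (min(xs) <= x <= max(xs)):
        # Condition 3: never predict outside the sample's range of X scores
        raise ValueError("x is outside the sample range of X scores")
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    sx = math.sqrt(sum((v - mx) ** 2 for v in xs) / (n - 1))  # sample SD (n - 1 assumed)
    sy = math.sqrt(sum((v - my) ** 2 for v in ys) / (n - 1))
    t_x = (x - mx) / sx        # step 4: predictor score -> t score
    t_y_pred = r * t_x         # steps 5-6: regression equation in t scores
    return my + t_y_pred * sy  # step 7: predicted t score -> raw score

xs, ys = [1, 2, 3, 4, 5], [2, 4, 5, 4, 6]  # illustrative data
print(round(predict_y(3, xs, ys, 0.85), 2))  # 4.2: at the mean of X, predict the mean of Y
print(round(predict_y(5, xs, ys, 0.85), 2))  # 5.79
```

The first call shows the slide's point about rho = 0 in miniature: when t_X is zero, the prediction collapses to the mean of Y no matter what r is.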

Discuss residual error when predicting from regression.

- The average squared error when we predict from the mean is the variance, also called the mean square error.
- The average squared error when we predict from the regression equation is called the residual mean square.

Residual Square Error
- A significant correlation will always yield a better prediction than the mean.
- Therefore, the residual mean square is always better, that is, smaller, than the variance.

Steps in calculating Residual Square Error
- To calculate the variance: take the deviations of Y from the mean of Y, square them, add them up, and divide by the degrees of freedom.
- To calculate the residual mean square: take the deviations of each Y from its predicted value, square them, add them up, and divide by the degrees of freedom.

Short way to do that
- r² equals the proportion of error that is eliminated when you use the regression equation rather than the mean as your prediction.
- So the amount of error you get rid of equals the original sum of squares for Y times r², that is, r²SS_Y.
- So the remaining error, SS_RESID, equals the error you get by using the mean as your predictor (SS_Y) minus the amount you get rid of by using the regression equation (r²SS_Y).

SS_RESID = SS_Y - r²SS_Y. To get the average squared error when using the regression equation, divide by the residual degrees of freedom: MS_RESID = SS_RESID / (n_p - 2). The standard error of the estimate is simply the square root of MS_RESID.
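The shortcut can be checked numerically. The data and the r value below are illustrative, and the function name is mine.

```python
import math

def residual_error(ys, r):
    """Shortcut: SS_RESID = SS_Y - r^2 * SS_Y, MS_RESID = SS_RESID / (n_p - 2),
    and the standard error of the estimate is sqrt(MS_RESID)."""
    n_p = len(ys)
    my = sum(ys) / n_p
    ss_y = sum((y - my) ** 2 for y in ys)  # error when predicting from the mean
    ss_resid = ss_y - r ** 2 * ss_y        # error left after using regression
    ms_resid = ss_resid / (n_p - 2)        # average squared residual error
    return ss_y, ss_resid, ms_resid, math.sqrt(ms_resid)

ss_y, ss_resid, ms_resid, see = residual_error([2, 4, 5, 4, 6], 0.85)
print(round(ss_y, 3), round(ss_resid, 3))  # 8.8 2.442
```

With r = 0.85, r² is about 0.72, so regression eliminates roughly 72% of the mean-based error, and SS_RESID is correspondingly smaller than SS_Y, as the slides promise for any significant correlation.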