CHAPTER 13 ANOVA.


Relationship of Statistical Tests Does this Diagram Make Sense to You?

ANALYSIS OF VARIANCE
ANOVA tests for differences among two or more population means.
Notation: σ² = s² = MS = variance; MS = Mean Square (the mean of the squared deviations).
Example of ANOVA research: the effect of temperature on recall.

Statistics: Standard Deviations and Variances
Population: σ² = SS/N,  σ = √(SS/N)
Sample: s² = SS/(n − 1) = SS/df,  s = √(SS/df)
MS = SS/df
Example scores X: 1, 2, 4, 5

Effects of Temperature (IV) on Recall (DV)
One IV (temperature) with three levels (50°, 70°, 90°) and one DV (recall).

FACTORIAL ANOVA

MS_bet = SS_bet / df_bet
MS_with = SS_with / df_with

ANOVA
SS_bet = Σ(T²/n) − G²/N
SS_with = ΣSS (the sum of the SS inside each treatment group)
SS_total = SS_bet + SS_with
df_bet = K − 1
df_with = N − K
df_total = df_bet + df_with
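These formulas translate almost line for line into code. A minimal sketch in plain Python (the function name `anova_f` is ours, not the textbook's); each group is a list of raw scores:

```python
def anova_f(groups):
    """Return (SS_bet, SS_with, F) for a one-way independent-measures ANOVA."""
    N = sum(len(g) for g in groups)       # total number of scores
    G = sum(sum(g) for g in groups)       # grand total
    K = len(groups)                       # number of treatments

    # SS_bet = sum over groups of T^2/n, minus G^2/N
    ss_bet = sum(sum(g) ** 2 / len(g) for g in groups) - G ** 2 / N
    # SS_with = sum of each group's own SS (computational formula)
    ss_with = sum(sum(x * x for x in g) - sum(g) ** 2 / len(g) for g in groups)

    ms_bet = ss_bet / (K - 1)             # df_bet = K - 1
    ms_with = ss_with / (N - K)           # df_with = N - K
    return ss_bet, ss_with, ms_bet / ms_with
```

Because SS_total = SS_bet + SS_with, computing SS_total separately from the raw scores gives a built-in arithmetic check.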

Post Hoc Tests (Post Tests)
Post hoc tests are additional hypothesis tests done after an ANOVA to determine exactly which mean differences are significant and which are not.

Post Hoc Tests (Post Tests)
We use post hoc tests when (1) we reject the null hypothesis and (2) there are three or more groups. Common post hoc tests: Tukey's, Scheffé, Bonferroni, Duncan, LSD, etc.
Tukey's Honestly Significant Difference (HSD) test: HSD = q·√(MS_with/n), where q is the Studentized range statistic.
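A sketch of the HSD computation. The q value must come from a Studentized-range table; the 4.05 below is the tabled q for k = 4 treatments and df = 16 at α = .05, matching the viewing-distance example later in the chapter:

```python
import math

def hsd(q, ms_within, n):
    """Tukey's Honestly Significant Difference for equal group sizes n:
    two treatment means that differ by more than HSD are significantly
    different."""
    return q * math.sqrt(ms_within / n)

# Viewing-distance example: MS_within = 2.0, n = 5 per group,
# tabled q(4, 16) at alpha = .05 is 4.05.
cutoff = hsd(4.05, 2.0, 5)   # about 2.56
```

With treatment means of 1, 2, 5, and 4, any pair of means farther apart than this cutoff (for example, M3 − M1 = 4) is significantly different.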

Coefficient of Determination
If r = 0.80, then r² = 0.64. This means 64% of the variability in the Y scores can be predicted from the relationship with X. Conversely, r = √r².

Measuring Effect Size for ANOVA
Effect size for ANOVA is measured by the coefficient of determination, r² = η² (eta squared).

Problems
The data on the next slide were obtained from an independent-measures experiment designed to examine people's satisfaction with different viewing distances for a 60-inch high-definition television. Four viewing distances were evaluated (9, 12, 15, and 18 feet), with a separate group of participants tested at each distance. Each individual watched a 30-minute television program from a specific distance and then completed a brief questionnaire measuring their satisfaction with the experience.

Problems
One question asked them to rate the viewing distance on a scale from 1 (Very Bad: definitely need to move closer or farther away) to 7 (Excellent: perfect viewing distance). The purpose of the ANOVA is to determine whether there are any significant differences among the four viewing distances that were tested. Before we begin the hypothesis test, note that we have already computed several summary statistics for the data on the next slide. Specifically, the treatment totals (T) and SS values are shown for the entire set of data.

Problems

9 feet   12 feet   15 feet   18 feet
  3         4         7         6        N = 20
  0         3         6         3        G = 60
  2         1         5         4        ΣX² = 262
  0         1         4         3        K = 4
  0         1         3         4
T1 = 5    T2 = 10   T3 = 25   T4 = 20
SS1 = 8   SS2 = 8   SS3 = 10  SS4 = 6
M1 = 1    M2 = 2    M3 = 5    M4 = 4
n1 = 5    n2 = 5    n3 = 5    n4 = 5

Problems
Having these summary values simplifies the computations in the hypothesis test, and we suggest that you always compute these summary statistics before you begin an ANOVA. We will set alpha at α = .05.
Step 1)
H0: µ1 = µ2 = µ3 = µ4 (There is no treatment effect.)
H1: At least one of the treatment means is different.

Step 2
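Step 2 can be worked through directly from the summary values on the data slide (T, SS, and n per group; G = 60, N = 20, K = 4). A short computation in plain Python:

```python
# Viewing-distance ANOVA from the slide's summary statistics.
T = [5, 10, 25, 20]     # treatment totals
SS = [8, 8, 10, 6]      # SS within each group
n, G, N, K = 5, 60, 20, 4

ss_between = sum(t ** 2 / n for t in T) - G ** 2 / N      # 230 - 180 = 50
ss_within = sum(SS)                                       # 32
df_between, df_within = K - 1, N - K                      # 3 and 16
ms_between = ss_between / df_between                      # 16.67
ms_within = ss_within / df_within                         # 2.00
F = ms_between / ms_within                                # 8.33
```

With df = (3, 16), the tabled critical value at α = .05 is about 3.24, so F = 8.33 leads to rejecting H0: viewing distance has a significant effect on satisfaction.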

Problems
A human factors psychologist studied three computer keyboard designs. Three samples of individuals were given material to type on a particular keyboard, and the number of errors committed by each participant was recorded. The data are on the next slide. Set alpha at α = .01.

Problems

Keyboard A   Keyboard B   Keyboard C
    0            6            6         N = 15
    4            8            5         G = 60
    0            5            9         ΣX² = 356
    1            4            4
    0            2            6
 T1 = 5       T2 = 25      T3 = 30
 SS1 = 12     SS2 = 20     SS3 = 14
 M1 = 1       M2 = 5       M3 = 6

Is there a significant difference among the three computer keyboard designs?

Problems
Step 1)
H0: µ1 = µ2 = µ3 (There are no differences among the computer keyboard designs.)
H1: At least one of the computer keyboard designs is different.

Step 2
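Step 2 for the keyboard data follows the same pattern, using the summary values from the slide (T and SS per group; G = 60, N = 15, K = 3, n = 5 per group):

```python
# Keyboard ANOVA from the slide's summary statistics.
T = [5, 25, 30]
SS = [12, 20, 14]
n, G, N, K = 5, 60, 15, 3

ss_between = sum(t ** 2 / n for t in T) - G ** 2 / N      # 310 - 240 = 70
ss_within = sum(SS)                                       # 46
df_between, df_within = K - 1, N - K                      # 2 and 12
F = (ss_between / df_between) / (ss_within / df_within)   # 35 / 3.83, about 9.13
```

The tabled critical value F(2, 12) at α = .01 is about 6.93, so F ≈ 9.13 leads to rejecting H0: the keyboard designs differ significantly in error rates.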

CHAPTER 14

Chapter 15 Correlation & Regression

What is Correlation?
Correlation measures the strength and the direction of the relationship between two or more variables. A correlation has three components:
1. The strength of the coefficient
2. The direction of the relationship
3. The form of the relationship
1. The strength of the coefficient is indicated by its absolute value. The closer the value is to 1.0, either positive or negative, the stronger and more linear the relationship; the closer it is to 0, the weaker the relationship.

Correlation
2. The direction of the relationship is indicated by the sign of the correlation coefficient. A positive coefficient indicates that as one variable (X) increases, so does the other (Y). A negative coefficient indicates that as one variable (X) increases, the other variable (Y) decreases.
3. The form of the relationship is linear. In correlation, variables are not identified as independent or dependent because the researcher is measuring the one relationship that is mutually shared between the two variables. As a result, causality should not be inferred from a correlation.

Correlation
Remember, the correlation coefficient can only measure a linear relationship. A zero correlation indicates no linear relationship; however, it does not indicate no relationship at all. A coefficient of zero rules out a linear relationship, but a curvilinear relationship could still exist. The scatterplots on the slide illustrate this point.

Correlation Is Based on a Statistic Called Covariance
Variance and covariance are used to measure the quality of an item in a test; reliability and validity measure the quality of the entire test.
σ² = SS/N is used for one set of data.
Variance is the degree of variability of the scores around the mean.

The Correlational Method: SS, Standard Deviations, and Variances
Population: σ² = SS/N,  σ = √(SS/N)
Sample: s² = SS/(n − 1) = SS/df,  s = √(SS/df)
SS = Σx² − (Σx)²/N (computational formula)
SS = Σ(x − μ)² (definitional formula: the sum of squared deviations from the mean)
Example scores X: 1, 2, 4, 5

Variance
Population: σ² = SS/N;  Sample: s² = SS/(n − 1) = SS/df
SS = Σx² − (Σx)²/N  or  SS = Σ(x − μ)² (sum of squared deviations from the mean)
Example scores X: 1, 2, 4, 5

Covariance
Correlation is based on a statistic called covariance: Cov_xy (or S_xy) = SP/(N − 1).
Correlation: r = SP/√(SSx·SSy)
Covariance is a number that reflects the degree to which two variables vary together.
Original data:
X: 1, 2, 4, 5
Y: 3, 6, 4, 7

Covariance: Cov_xy = SP/(N − 1)
Two ways to calculate SP:
SP = Σxy − (Σx·Σy)/N (computational formula)
SP = Σ(x − μx)(y − μy) (definitional formula)
SP requires two sets of data; SS requires only one set of data.
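As a check on these formulas, here is a short computation on the chapter's X, Y data in plain Python, using the computational formulas for SP and SS:

```python
X = [1, 2, 4, 5]
Y = [3, 6, 4, 7]
N = len(X)

# SP by the computational formula: SP = sum(xy) - (sum(x) * sum(y)) / N
SP = sum(x * y for x, y in zip(X, Y)) - sum(X) * sum(Y) / N   # 66 - 60 = 6
cov_xy = SP / (N - 1)                                         # covariance = 2.0

SSx = sum(x * x for x in X) - sum(X) ** 2 / N                 # 46 - 36 = 10
SSy = sum(y * y for y in Y) - sum(Y) ** 2 / N                 # 110 - 100 = 10
r = SP / (SSx * SSy) ** 0.5                                   # 6 / 10 = 0.6
```

The same data reappear in the Pearson and regression slides later, so these values (SP = 6, SSx = SSy = 10, r = 0.6) can be carried forward.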

The Correlational Method
Correlation is the degree to which events or characteristics vary together. It measures the strength of a relationship but does not imply cause and effect. The people chosen for a study are its subjects or participants, collectively called a sample; the sample must be representative.

The Correlational Method
Correlational data can be graphed and a “line of best fit” can be drawn. Types of correlation:
1. Pearson correlation
2. Spearman correlation
3. Point-biserial correlation
4. Partial correlation

Types of Correlation
In correlational research we use continuous variables (interval or ratio scale) for:
1. The Pearson correlation (for linear relationships).
If it is difficult to measure a variable on an interval or ratio scale, then we use

Types of Correlation
2. Spearman Correlation
The Spearman correlation uses ordinal or rank-ordered data and measures the consistency of a relationship (a monotonic relationship). Ex. A teacher may feel confident about rank-ordering students' leadership abilities but would find it difficult to measure leadership on some other scale.

Monotonic Transformation
Rank-ordered numbers use an ordinal scale, for example 1, 2, 3, 4 or 2, 4, 6, 8. The Spearman correlation can be used to measure the degree of monotonic relationship between two variables.

Example of monotonic data:
X    Y
22   87
25   102
19   10
6    5

Types of Correlation
3. The Point-Biserial Correlation
We can use both continuous and discrete variables (data) in the point-biserial correlation (it can substitute for the independent-measures t test).

3. The Point-Biserial Correlation
The point-biserial correlation measures the relationship between two variables when one variable consists of regular numerical scores (non-dichotomous) but the second variable has only two values (dichotomous). We can also calculate this correlation from a t test:
r² = t²/(t² + df), the coefficient of determination, which measures effect size (comparable in purpose to Cohen's d)
r = √r²
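The conversion from t to r² is a one-line formula; a minimal sketch, where the t = 2.0 and df = 16 below are hypothetical numbers chosen only for illustration:

```python
# r^2 from a t statistic, per the slide: r^2 = t^2 / (t^2 + df)
def r_squared_from_t(t, df):
    return t ** 2 / (t ** 2 + df)

r_sq = r_squared_from_t(2.0, 16)   # hypothetical t = 2.0 with df = 16 -> 0.2
r = r_sq ** 0.5                    # r = sqrt(r^2)
```

Here 20% of the variance in the numerical scores would be accounted for by group membership.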

4. A Partial Correlation
In special situations we can use a partial correlation, which measures the relationship between two variables while controlling the influence of a third variable by holding it constant. Ex. the correlation between the number of churches and crime (the third variable is population).

The Correlational Method Correlational data can be graphed and a “line of best fit” can be drawn

Positive correlation: variables change in the same direction

Positive Correlation

Negative correlation: variables change in the opposite direction

Negative Correlation

No Correlation
Unrelated: no consistent relationship

No Correlation

The Correlational Method
The magnitude (strength) of a correlation is also important.
High magnitude = variables vary closely together; points fall close to the line of best fit.
Low magnitude = variables do not vary as closely together; points are loosely scattered around the line of best fit.

The Correlational Method Direction and magnitude of a correlation are often calculated statistically Called the “correlation coefficient,” symbolized by the letter “r” Sign (+ or -) indicates direction Number (from 0.00 to 1.00) indicates magnitude 0.00 = no consistent relationship +1.00 = perfect positive correlation -1.00 = perfect negative correlation Most correlations found in psychological research fall far short of “perfect”

The Correlational Method Correlations can be trusted based on statistical probability “Statistical significance” means that the finding is unlikely to have occurred by chance By convention, if there is less than a 5% probability that findings are due to chance (p < 0.05), results are considered “significant” and thought to reflect the larger population Generally, confidence increases with the size of the sample and the magnitude of the correlation

The Correlational Method Advantages of correlational studies: Have high external validity Can generalize findings Can repeat (replicate) studies on other samples Difficulties with correlational studies: Lack internal validity Results describe but do not explain a relationship

External & Internal Validity
External validity addresses the ability to generalize your study to other people and other situations.
Internal validity addresses the "true" causes of the outcomes that you observed in your study. Strong internal validity means that you not only have reliable measures of your independent (predictor) and dependent (criterion) variables but also a strong justification that causally links your independent variables (IV) to your dependent variables (DV).

The Correlational Method
Pearson: r = SP/√(SSx·SSy)
Original data:
X: 1, 2, 4, 5
Y: 3, 6, 4, 7
SP requires two sets of data; SS requires only one set of data. df = n − 2.

The Correlational Method
Spearman: r = SP/√(SSx·SSy), computed on the ranks
Original data → Ranks
X  Y      X  Y
1  3      1  1
2  6      2  3
4  4      3  2
5  7      4  4
SP requires two sets of data; SS requires only one set of data.
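The Spearman correlation is just the Pearson formula applied to ranks. A sketch in plain Python on the slide's data, assuming no tied scores (ties need averaged ranks, which this simple ranking skips):

```python
def ranks(values):
    """Rank each score (1 = smallest); assumes no ties, as in the slide."""
    order = sorted(values)
    return [order.index(v) + 1 for v in values]

def pearson(X, Y):
    N = len(X)
    SP = sum(x * y for x, y in zip(X, Y)) - sum(X) * sum(Y) / N
    SSx = sum(x * x for x in X) - sum(X) ** 2 / N
    SSy = sum(y * y for y in Y) - sum(Y) ** 2 / N
    return SP / (SSx * SSy) ** 0.5

X = [1, 2, 4, 5]
Y = [3, 6, 4, 7]
r_s = pearson(ranks(X), ranks(Y))   # Pearson formula applied to ranks: 0.8
```

Note that the Spearman value (0.8) is higher than the Pearson value on the raw scores (0.6): the ranks agree more consistently than the raw scores do.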

Percentage of Variance Accounted for by the Treatment
The percentage of variance accounted for by the treatment (similar in purpose to Cohen's d) is known as ω² (omega squared); it is also called the coefficient of determination (next page).

Coefficient of Determination
If r = 0.80, then r² = 0.64. This means 64% of the variability in the Y scores can be predicted from the relationship with X. Conversely, r = √r².

Problems
Test the hypothesis for the following n = 4 pairs of scores for a correlation, α = .01. r = SP/√(SSx·SSy)
Original data:
X: 1, 2, 4, 5
Y: 3, 6, 4, 7

Problems
Step 1)
H0: ρ = 0 (There is no population correlation.)
H1: ρ ≠ 0 (There is a real correlation.)
ρ (rho) stands for the population correlation. We will set alpha at α = .01.

STEP 2
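Step 2 can be sketched by computing r and converting it to a t statistic with the standard test for H0: ρ = 0, t = r·√(df/(1 − r²)) with df = n − 2 (the critical value comes from a t table):

```python
import math

X = [1, 2, 4, 5]
Y = [3, 6, 4, 7]
n = len(X)

SP = sum(x * y for x, y in zip(X, Y)) - sum(X) * sum(Y) / n   # 6
SSx = sum(x * x for x in X) - sum(X) ** 2 / n                 # 10
SSy = sum(y * y for y in Y) - sum(Y) ** 2 / n                 # 10
r = SP / (SSx * SSy) ** 0.5                                   # 0.6

df = n - 2                                # df = 2
t = r * math.sqrt(df / (1 - r ** 2))      # about 1.06
```

With df = 2, the two-tailed critical value at α = .01 is t = 9.925, so |t| ≈ 1.06 falls far short: fail to reject H0. This tiny sample does not establish a real correlation.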

Problems
Test the hypothesis for the following set of n = 5 pairs of scores for a positive correlation, α = .05.
Original data:
X: 0, 10, 4, 8, 8
Y: 2, 6, 2, 4, 6

Problems
Step 1)
H0: ρ ≤ 0 (The population correlation is not positive.)
H1: ρ > 0 (The population correlation is positive.)
We will set alpha at α = .05.
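Step 2 for this one-tailed problem follows the same computational formulas; a sketch in plain Python:

```python
X = [0, 10, 4, 8, 8]
Y = [2, 6, 2, 4, 6]
n = len(X)

SP = sum(x * y for x, y in zip(X, Y)) - sum(X) * sum(Y) / n   # 148 - 120 = 28
SSx = sum(x * x for x in X) - sum(X) ** 2 / n                 # 244 - 180 = 64
SSy = sum(y * y for y in Y) - sum(Y) ** 2 / n                 # 96 - 80 = 16
r = SP / (SSx * SSy) ** 0.5                                   # 28 / 32 = 0.875
```

With df = n − 2 = 3, the tabled one-tailed critical value of r at α = .05 is about .805, so r = .875 exceeds it: reject H0 and conclude there is a significant positive correlation.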

Three Levels of Analysis for Prediction/Validity
INPUTS → PROCESSES → OUTCOMES
Ex. Stress (INPUT) is an unpleasant psychological state (PROCESS) that occurs in response to environmental pressures (e.g., job demands) and can lead to withdrawal (OUTCOME).

Bi-Variate Regression Analysis
Bi-variate regression analysis extends correlation and attempts to measure the extent to which a predictor variable (X) can be used to make a prediction about a criterion measure (Y). Bi-variate regression uses a linear model to predict the criterion measure. The formula for the predicted score is: Y' = a + bX

Bivariate Regression
The components of the line of best fit (Y' = a + bX) are:
the Y-intercept (a), also called the constant
the slope (b), which multiplies the predictor variable (X)

Bivariate Regression
The Y-intercept is the average value of Y when X is zero. The Y-intercept is also called the constant, because this is the amount of Y that is present when the influence of X is null (0). The slope is the average change in Y for a one-unit change in X. Thus, the slope represents the direction and intensity of the line.

Regression and Prediction
Regression line: Y = bX + a, with an error term e for each point's deviation from the line.

Bivariate Regression
Line of best fit: Y' = 2.635 + .204X
With this equation a predicted score may be made for any value of X within the range of the data. Here a = 2.635 is the Y-intercept and b = .204 is the slope.
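The slide's a = 2.635 and b = .204 come from a data set not shown here, but the least-squares formulas themselves (b = SP/SSx, a = M_Y − b·M_X) can be demonstrated on the chapter's small X, Y example:

```python
X = [1, 2, 4, 5]
Y = [3, 6, 4, 7]
n = len(X)

SP = sum(x * y for x, y in zip(X, Y)) - sum(X) * sum(Y) / n   # 6
SSx = sum(x * x for x in X) - sum(X) ** 2 / n                 # 10
b = SP / SSx                       # slope = 0.6
a = sum(Y) / n - b * sum(X) / n    # intercept = 5 - 0.6 * 3 = 3.2

def predict(x):
    """Predicted score Y' = a + bX."""
    return a + b * x
```

For these data the line of best fit is Y' = 3.2 + 0.6X; for example, predict(4) gives 5.6, which can be compared against the observed Y = 4 for X = 4.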

Multiple Regression Analysis
Multiple regression analysis is an extension of bi-variate regression in which several predictor variables are used to predict one criterion measure (Y). In general, this method is considered advantageous, since an outcome measure can seldom be accurately explained by one predictor variable. Ex. 3 aspects of personality (OCPD, Narcissistic, Histrionic) predicting Depression.
Y' = a + b1X1 + b2X2 + b3X3
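A minimal sketch of fitting Y' = a + b1X1 + b2X2 + b3X3 by least squares with NumPy. The numbers below are made up for illustration only; they are not the personality data from the example:

```python
import numpy as np

# Hypothetical scores: three predictor columns (X1, X2, X3), one criterion Y.
X = np.array([[1.0, 2.0, 0.0],
              [2.0, 1.0, 1.0],
              [3.0, 4.0, 1.0],
              [4.0, 3.0, 2.0],
              [5.0, 6.0, 2.0]])
Y = np.array([3.0, 4.0, 8.0, 9.0, 13.0])

# Add a column of 1s so the first fitted coefficient is the constant a.
design = np.column_stack([np.ones(len(X)), X])
coeffs, *_ = np.linalg.lstsq(design, Y, rcond=None)
a, b1, b2, b3 = coeffs             # intercept and the three slopes
Y_hat = design @ coeffs            # predicted scores Y'
```

Because the model includes a constant, the residuals Y − Y' sum to zero, which is a handy sanity check on any least-squares fit.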

Path Analysis
Path analysis is an extension of regression. In path analysis the researcher examines the ability of more than one predictor variable (X) to explain or predict multiple dependent variables (Y). Ex. 2 aspects of personality (OCPD, Narcissistic) predicting both Depression and Anxiety. (The slide's path diagram shows X1 and X2 each pointing to Y1 and Y2, with an error term E on each outcome.)

Relationship of Statistical Tests Does this diagram make sense to you?