
1 Hypothesis testing Intermediate Food Security Analysis Training Rome, July 2010

2 Hypothesis testing
Hypothesis testing involves:
1. defining research questions, and
2. assessing whether changes in an independent variable are associated with changes in the dependent variable by conducting a statistical test.
Dependent and independent variables:
- Dependent variables are the outcome variables.
- Independent variables are the predictive/explanatory variables.

3 Examples…
- Research question: Is the educational level of the mother related to birthweight? What are the dependent and independent variables?
- Research question: Is access to roads related to the educational level of mothers? And now?

4 Test statistics
To test hypotheses, we rely on test statistics. Test statistics are simply the result of a particular statistical test. The most common include:
1. T-tests calculate t-statistics
2. ANOVAs calculate F-statistics
3. Correlations calculate the Pearson correlation coefficient (r)

5 Significant test statistic
- Is the relationship observed by chance, or because there actually is a relationship between the variables?
- This probability is referred to as a p-value and is expressed as a decimal (e.g. p = 0.05).
- If the probability of obtaining the value of our test statistic by chance is less than 5%, we generally accept the experimental hypothesis as true: there is an effect in the population.
- Ex: if p = 0.1, what does this mean? Do we accept the experimental hypothesis?
- This probability is also referred to as the significance level (Sig.).
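As a minimal illustration of this decision rule (the p-values below are hypothetical and not taken from any of the data used later), a short Python sketch:

```python
# Minimal sketch of the 5% decision rule; the p-values are hypothetical.
alpha = 0.05                      # conventional significance threshold

for p in (0.03, 0.10):
    if p < alpha:
        print(f"p = {p}: significant at the 5% level -> accept the experimental hypothesis")
    else:
        print(f"p = {p}: not significant at the 5% level -> do not accept it")
```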

6 Hypothesis testing Part 1: Continuous variables

7 Topics to be covered in this presentation
- T-test
- One-way analysis of variance (ANOVA)
- Correlation
By the end of this session, participants should be able to run and interpret each of these tests in SPSS.

8 Hypothesis testing…
WFP tests a variety of hypotheses. Some of the most common include:
1. Looking at differences between groups of people (comparisons of means). Ex: Are different livelihood groups likely to have different levels of food consumption?
2. Looking at the relationship between two variables. Ex: Is asset wealth associated with food consumption?

9 How to assess differences in two means statistically: T-tests

10 T-test
A test using the t-statistic that establishes whether two means differ significantly.
Independent means t-test: used when there are two experimental conditions and different participants were used in each condition.
Dependent (paired) means t-test: used when there are two experimental conditions and the same participants took part in both conditions of the experiment.

11 T-test: assumptions
The independent t-test works well if:
- the variable is continuous
- the groups to compare are composed of different people
- within each group, the variable's values are normally distributed
- the two groups have a similar level of homogeneity (variance).

12 Normal distribution
Normal distributions are perfectly symmetrical around the mean (for a standardized variable such as a z-score, the mean is zero). Values close to the mean have a higher frequency; values very far from the mean are less likely to occur (lower frequency).

13 Variance
Variance measures how similar cases are on a specific variable (level of homogeneity):
V = (sum of all the squared distances from the mean) / N
Variance is low → cases are very close to the mean of the distribution (and to each other); the group of cases is therefore homogeneous (on this variable).
Variance is high → cases tend to be very far from the mean (and different from each other); the group of cases is therefore heterogeneous (on this variable).
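As an illustrative sketch (not part of the original slides), the same formula computed in Python on a small set of made-up values:

```python
# Sketch: variance as the mean squared distance from the mean,
# computed on hypothetical values of a single variable.
values = [10, 12, 11, 30, 9]
mean = sum(values) / len(values)
variance = sum((x - mean) ** 2 for x in values) / len(values)
print(f"mean = {mean}, variance = {variance:.1f}")   # one far-off case inflates the variance
```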

14 Homogeneity of variance
The t-test works well if the two groups have the same homogeneity (variance) on the variable. If one group is very homogeneous and the other is not, the standard t-test is not reliable.

15 The independent t-test
The independent t-test compares two means when those means come from different groups of people.

16 T-test formulas
Quite simply, the t-test formula is a ratio:
(difference between the two means) / (variability, or dispersion, of the scores)
Statistically, this formula is:
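The formula itself did not survive the transcript. As a hedged reconstruction, the standard equal-variance (pooled) form of the independent-samples t-statistic can be sketched in Python; the two groups below are hypothetical:

```python
# Sketch of the pooled independent-samples t statistic:
# (difference between the two means) / (standard error of that difference).
from math import sqrt

def pooled_t(group1, group2):
    n1, n2 = len(group1), len(group2)
    m1, m2 = sum(group1) / n1, sum(group2) / n2
    v1 = sum((x - m1) ** 2 for x in group1) / (n1 - 1)   # sample variances
    v2 = sum((x - m2) ** 2 for x in group2) / (n2 - 1)
    pooled_var = ((n1 - 1) * v1 + (n2 - 1) * v2) / (n1 + n2 - 2)
    se_diff = sqrt(pooled_var * (1 / n1 + 1 / n2))       # variability of the difference
    return (m1 - m2) / se_diff

print(pooled_t([2.1, 2.4, 2.0, 2.6], [1.5, 1.7, 1.4, 1.8]))   # hypothetical scores
```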

17 Example t-test
Difference in weight-for-age z-scores between males and females in Kenya: t = 5.56

18 To conduct an independent t-test in SPSS
1. Click on the "Analyze" drop-down menu
2. Click on "Compare Means"
3. Click on "Independent-Samples T-Test…"
4. Move the independent and dependent variables into the proper boxes
5. Click "OK"

19 T-test: SPSS procedure
- Drag the variables into the proper boxes
- Define the values (groups) of the independent variable

20 One note of caution about independent t-tests
It is important to ensure that the assumption of homogeneity of variance is met. To do so, look at the column labelled Levene's Test for Equality of Variances:
- If the Sig. value is less than .05, the assumption of homogeneity of variance has been broken and you should read the row labelled "Equal variances not assumed".
- If the Sig. value of Levene's test is bigger than .05, you should read the row labelled "Equal variances assumed".

21 T-test: SPSS output
Look at Levene's test:
- If the Sig. value of the test is less than .05, the groups have different variances: read the row "Equal variances not assumed".
- If the Sig. value of the test is bigger than .05, read the row labelled "Equal variances assumed".
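The original slides show only SPSS menus and output. As a rough Python (scipy) equivalent, with hypothetical weight-for-age z-scores rather than the Kenya data, the Levene check and the matching t-test would look like this:

```python
# Sketch: Levene's test first, then the corresponding independent t-test,
# mirroring the "Equal variances (not) assumed" rows of the SPSS output.
from scipy import stats

males = [-0.5, -1.2, 0.3, -0.8, -1.5, 0.1]       # hypothetical WAZ scores
females = [-0.2, -0.9, 0.5, -0.4, -1.1, 0.4]     # hypothetical WAZ scores

lev_stat, lev_p = stats.levene(males, females)
equal_var = lev_p > 0.05                          # Levene not significant -> variances comparable

t_stat, p_value = stats.ttest_ind(males, females, equal_var=equal_var)
print(f"Levene p = {lev_p:.3f}; t = {t_stat:.2f}, p = {p_value:.3f}")
```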

22 What to do if we want to statistically compare differences among three (or more) means? Analysis of variance (ANOVA)

23 Analysis of Variance (ANOVA)
- The ANOVA test tells us whether there are any differences among the means, but not how (or which) means differ.
- ANOVAs are similar to t-tests; in fact, an ANOVA conducted to compare two means will give the same answer as a t-test.

24 Calculating an ANOVA
ANOVA formulas: calculating an ANOVA by hand is complicated, and knowing the formulas is not necessary. Instead, we will rely on SPSS to calculate ANOVAs.

25 Example of a one-way ANOVA
Research question: Do mean child malnutrition (GAM) rates differ according to the mother's educational level (none, primary, or secondary/higher)?

26 To calculate a one-way ANOVA in SPSS
In SPSS, one-way ANOVAs are run using the following steps:
1. Click on the "Analyze" drop-down menu
2. Click on "Compare Means"
3. Click on "One-Way ANOVA…"
4. Move the independent and dependent variables into the proper boxes
5. Click "OK"

27 ANOVA: SPSS procedure
1. Analyze → Compare Means → One-Way ANOVA
2. Drag the independent and dependent variables into the proper boxes
3. Ask for the descriptive statistics
4. Click OK

28 ANOVA: SPSS output
Along with the mean for each group, the ANOVA produces the F-statistic. It tells us whether there are differences between the means; it does not tell us which means are different. Look at the F value and at the Sig. level.
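As an illustrative Python (scipy) counterpart to the SPSS steps, with made-up values for the three education groups rather than real GAM data:

```python
# Sketch: one-way ANOVA across three groups; a small p-value means
# at least one group mean differs, but not which one.
from scipy import stats

none = [12.0, 14.5, 13.2, 15.1]        # hypothetical values, "no education" group
primary = [10.8, 11.5, 12.2, 10.1]     # hypothetical, "primary" group
secondary = [8.9, 9.5, 8.1, 9.9]       # hypothetical, "secondary or higher" group

f_stat, p_value = stats.f_oneway(none, primary, secondary)
print(f"F = {f_stat:.2f}, p = {p_value:.4f}")
```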

29 Determining where differences exist
In addition to determining that differences exist among the means, you may want to know which means differ. One type of test for comparing means:
- Post hoc tests are run after the experiment has been conducted (when you don't have a specific hypothesis about which groups differ).

30 ANOVA post hoc tests
Once you have determined that differences exist among the means, post hoc range tests and pairwise multiple comparisons can determine which means differ. Tukey's post hoc test is among the most popular and is adequate for our purposes, so we will focus on this test.

31 To calculate Tukey's test in SPSS
In SPSS, Tukey's post hoc tests are run using the following steps:
1. Click on the "Analyze" drop-down menu
2. Click on "Compare Means"
3. Click on "One-Way ANOVA…"
4. Move the independent and dependent variables into the proper boxes
5. Click on "Post Hoc…"
6. Check the box beside "Tukey"
7. Click "Continue"
8. Click "OK"

32 Determining where differences exist in SPSS
Once you have determined that differences exist among the means, you may want to know which means differ. Different types of tests exist for pairwise multiple comparisons.

33 Common post hoc tests
Each test is characterized by different adjustments and works well under specific circumstances:
- Same variance & same size across the groups → Tukey test
- Same variance, sizes slightly different → Gabriel's test
- Same variance, sizes very different → Hochberg's GT2
- Different variances → Games-Howell
There are lots of different post hoc tests, characterized by different adjustments/settings of the error rate for each test and for multiple comparisons. If interested, feel free to investigate more and try different tests – the SPSS help might provide some good hints!

34 Pairwise comparisons: SPSS output
Once you have decided which post hoc test is appropriate:
- Look at the column "Mean Difference" to see the difference between each pair.
- Look at the column Sig.: if the value is less than .05, the means of the two groups in the pair are significantly different.
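The slides rely on the SPSS Post Hoc dialog; a comparable sketch in Python uses statsmodels' pairwise_tukeyhsd, reusing the hypothetical education groups from the ANOVA sketch above:

```python
# Sketch: Tukey post hoc pairwise comparisons after a significant ANOVA.
# The printed table lists the mean difference and a reject flag for each pair.
from statsmodels.stats.multicomp import pairwise_tukeyhsd

values = [12.0, 14.5, 13.2, 15.1,      # hypothetical "none" group
          10.8, 11.5, 12.2, 10.1,      # hypothetical "primary" group
          8.9, 9.5, 8.1, 9.9]          # hypothetical "secondary+" group
groups = ["none"] * 4 + ["primary"] * 4 + ["secondary"] * 4

print(pairwise_tukeyhsd(values, groups, alpha=0.05))
```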

35 Now, what if we would like to measure how well two variables are associated with one another? Correlations

36 Correlations
- T-tests and ANOVAs measure differences between means.
- Correlations measure the strength of the linear relationship between two variables.
- Pearson correlation coefficients (r) are the test statistics used to statistically measure correlations.

37 Types of correlations
- Positive correlation: two variables are positively correlated if increases (or decreases) in one variable result in increases (or decreases) in the other variable.
- Negative correlation: two variables are negatively correlated if one increases (or decreases) while the other decreases (or increases).
- No correlation: two variables are not correlated if there is no linear relationship between them.
The coefficient ranges from -1 (strong negative correlation) through 0 (no correlation) to +1 (strong positive correlation).

38 Illustrating types of correlations
- Perfect positive correlation: test statistic = 1
- Positive correlation: test statistic > 0 and < 1
- Perfect negative correlation: test statistic = -1
- Negative correlation: test statistic < 0 and > -1

39 Example from the Kenya data
Correlation between children's weight and height. Is this a positive or negative correlation? In what range would the test statistic fall?

40 To calculate a Pearson's correlation coefficient in SPSS
In SPSS, correlations are run using the following steps:
1. Click on the "Analyze" drop-down menu
2. Click on "Correlate"
3. Click on "Bivariate…"
4. Move the variables whose correlation you want to assess into the box on the right
5. Click "OK"

41 Example in SPSS
Using SPSS, we get Pearson's correlation r = 0.932.
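As a hedged Python (scipy) sketch of the same calculation, using small made-up weight and height values rather than the Kenya dataset:

```python
# Sketch: Pearson correlation coefficient between two continuous variables.
from scipy.stats import pearsonr

height_cm = [65, 70, 74, 80, 85, 90]           # hypothetical child heights
weight_kg = [7.0, 8.2, 9.0, 10.1, 11.0, 12.3]  # hypothetical child weights

r, p_value = pearsonr(height_cm, weight_kg)
print(f"r = {r:.3f}, p = {p_value:.4f}")       # r close to +1 -> strong positive correlation
```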

42 1. Let's refresh briefly: what does a correlation of 0.932 mean? 2. What does *** mean?

43 Summary
Check out p. 171 of the CFSVA manual for an overview of the tests.

