Leedy and Ormrod Ch. 11 Gray Ch. 14

Data Analysis (Leedy and Ormrod Ch. 11; Gray Ch. 14)

The difference between Parametric and Non-Parametric Statistics
Parametric statistics:
- appropriate for interval/ratio data
- generalizable to a population
- assume normal distributions
Non-parametric statistics:
- used with nominal/ordinal data
- not generalizable to a population
- do not assume normal distributions
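In practice, the normality assumption can be checked before choosing between the two families. A minimal sketch in Python using SciPy's Shapiro-Wilk test; the sample values are made up for illustration:

```python
# Minimal sketch: checking the normality assumption with a Shapiro-Wilk test.
# The sample values are hypothetical; load your own data in practice.
from scipy import stats

sample = [4.1, 5.2, 4.8, 5.5, 4.9, 5.1, 4.7, 5.3, 5.0, 4.6]

stat, p = stats.shapiro(sample)
if p > 0.05:
    print(f"W = {stat:.3f}, p = {p:.3f}: no evidence against normality; parametric tests are reasonable")
else:
    print(f"W = {stat:.3f}, p = {p:.3f}: departure from normality; consider non-parametric tests")
```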

Contingency (Cross-Tabs) Analysis and Chi-Square or Gamma/Tau-b
Non-parametric statistics (no assumption of normal distributions).
Assumptions:
- nominal or ordinal (categorical) data
- any type of distribution
The hypothesis test: the null hypothesis is that the two (or more) samples come from the same distribution.

Contingency (cont.)
Conducting the analysis:
a. Calculate percentages within the categories of the IV and compare across the categories of the DV. Are there differences in the outcomes?
b. For nominal data: the chi-square statistic asks whether the relationship (the differences above) is real; Phi, Cramer's V, etc. indicate how strong the relationship is.
c. For ordinal data: the t-test for gamma or tau-b asks whether the relationship is real; gamma and tau-b indicate its strength and direction.
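As an illustration of step b, here is a hedged sketch in Python (SciPy) of a chi-square test on a 2x2 crosstab, with Cramer's V for strength; the counts are invented:

```python
# Sketch: chi-square test of independence on a 2x2 crosstab, plus Cramer's V.
# The counts are hypothetical (e.g., group membership by yes/no response).
import numpy as np
from scipy import stats

observed = np.array([[30, 10],   # row 1: group A -- yes, no
                     [20, 40]])  # row 2: group B -- yes, no

chi2, p, dof, expected = stats.chi2_contingency(observed)

# Cramer's V: strength of association, from 0 (none) to 1 (perfect)
n = observed.sum()
k = min(observed.shape)          # the smaller of rows/columns
cramers_v = np.sqrt(chi2 / (n * (k - 1)))

print(f"chi2({dof}) = {chi2:.2f}, p = {p:.4f}, Cramer's V = {cramers_v:.2f}")
```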

T-Tests (parametric) for Means and Proportions
The t-test is used to determine whether samples have different means. Essentially, the t-test is the ratio of the sample mean difference to the standard error of that difference.
The t-test makes some important assumptions:
- interval/ratio level data
- one or two levels of one or two variables
- normal distributions
- (relatively) equal variances; use Levene's test to determine whether the variances are equal

T-tests (cont.)
a. The one-sample t-test: tests a sample mean against a known population mean. The null hypothesis is that the sample mean equals the population mean.
b. The independent-samples t-test: tests whether the mean of one sample differs from the mean of another sample. The null hypothesis is that the mean of sample 1 equals the mean of sample 2.
Note: with independent t-tests, you must pay attention to the standard error of the samples. There are two ways to estimate it, and SPSS uses Levene's test to choose between them:
- Equal variances: the two samples are relatively equal in size and spread.
- Unequal variances: the two samples differ substantially in variance.
c. The paired-samples t-test (dependent or related samples): tests whether two related measures within the overall sample (e.g., the same cases measured twice) differ on the same dependent variable. The null hypothesis is that the mean of (var1 - var2) equals 0.
Overall, you will be looking for the t-value and its corresponding p-value. Depending on your alpha level, you will reject or fail to reject the null hypothesis based on these numbers.
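A minimal sketch of the three t-tests in Python using SciPy, with Levene's test used to pick the equal- or unequal-variance form of the independent test; all data below are hypothetical:

```python
# Sketch: one-sample, independent-samples, and paired t-tests (SciPy).
# The scores are invented for illustration.
from scipy import stats

group1 = [23, 25, 28, 30, 26, 27, 24, 29]
group2 = [31, 33, 29, 35, 32, 30, 34, 28]

# a. One-sample t-test: sample mean vs. a known population mean (here, 25)
t, p = stats.ttest_1samp(group1, popmean=25)
print(f"one-sample:  t = {t:.2f}, p = {p:.4f}")

# b. Independent-samples t-test; Levene's test decides the variance assumption
_, p_levene = stats.levene(group1, group2)
t, p = stats.ttest_ind(group1, group2, equal_var=(p_levene > 0.05))
print(f"independent: t = {t:.2f}, p = {p:.4f}")

# c. Paired t-test: the same cases measured twice (e.g., pre/post)
t, p = stats.ttest_rel(group1, group2)
print(f"paired:      t = {t:.2f}, p = {p:.4f}")
```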

ANOVA (parametric)
Analysis of variance, or ANOVA, tests the difference in means among 3 or more samples.
One-way ANOVA assumptions:
- one independent variable -- categorical, with two+ levels
- dependent variable -- interval or ratio level
Two-way ANOVA assumptions:
- two or more independent variables -- categorical, with two+ levels
- dependent variable -- interval or ratio level
- the analysis will include main effects and an interaction term
ANOVA tests the ratio (F) of the between-groups mean square to the within-groups mean square. Given the degrees of freedom, the F score shows whether there is a difference in the means among the groups.

ANOVA (cont.)
One-way ANOVA provides an F-ratio and its corresponding p-value. If the difference between the between-groups mean square and the within-groups mean square is large enough, the null hypothesis is rejected, indicating that there is a difference in mean scores among the groups. However, the F-ratio does not tell you where those differences are; post-hoc comparisons such as the Tukey-b, Bonferroni, or Scheffé tests do that.
Two-way ANOVA provides an overall F-ratio and p-value as well as F-ratios and p-values for each main effect and the interaction term. When the interaction is significant, the interaction means (in SPSS, request these under Options in GLM) should also be interpreted.
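A brief sketch of a one-way ANOVA in Python, using SciPy for the F-test and statsmodels for a Tukey HSD post-hoc comparison; the scores and group labels are invented:

```python
# Sketch: one-way ANOVA followed by a Tukey HSD post-hoc comparison.
# The scores and group labels are hypothetical.
from scipy import stats
from statsmodels.stats.multicomp import pairwise_tukeyhsd

a = [18, 20, 19, 22, 21]
b = [25, 27, 24, 26, 28]
c = [19, 21, 20, 23, 22]

# F-ratio: between-groups mean square / within-groups mean square
f, p = stats.f_oneway(a, b, c)
print(f"F = {f:.2f}, p = {p:.4f}")

# If the overall F is significant, post-hoc tests locate the differences
scores = a + b + c
groups = ["a"] * 5 + ["b"] * 5 + ["c"] * 5
print(pairwise_tukeyhsd(scores, groups, alpha=0.05))
```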

Correlation (parametric)
Used to test the presence, strength, and direction of a linear relationship between variables. Correlation is a numerical expression of the relationship between two variables -- a 'measure of association', because the correlation coefficient gives the degree of the relationship between the variables. Correlation does not imply causation!
Typically you need interval- or ratio-level data. However, you can run a correlation with ordinal data that has 5 or more categories.

Correlation (cont.)
a. The relationships: essentially, there are four types of relationship: (1) positive, (2) negative, (3) curvilinear, and (4) no relationship.
b. The hypotheses and tests: the correlation statistic (Pearson's r) tests the null hypothesis that there is no relationship between the variables.
c. The correlation coefficient: Pearson's r is the numeric value of the relationship between the variables and ranges between -1 and +1. If no relationship exists, the correlation coefficient equals 0. Pearson's r provides (1) an estimate of the strength of the relationship and (2) an estimate of its direction: a coefficient between -1 and 0 is a negative (inverse) relationship; between 0 and +1 is a positive relationship; exactly 0 is no relationship. The closer the coefficient lies to -1 or +1, the stronger the relationship.

Correlation (cont.)
d. Coefficient of determination: related to the correlation coefficient is the coefficient of determination, which gives the proportion of variance shared by the two variables (x and y). To calculate it, you square the r value. For example, an r of .90 yields a coefficient of determination of .81, meaning the variables share 81 percent of their variance.
e. Partial correlation: when you need to 'control' for the effect of other variables, you can use partial correlation.
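A small sketch of Pearson's r and the coefficient of determination in Python using SciPy; the paired measurements are hypothetical:

```python
# Sketch: Pearson's r and the coefficient of determination (r squared).
# x and y are hypothetical paired measurements.
from scipy import stats

x = [1.0, 2.1, 2.9, 4.2, 5.1, 6.0, 7.2, 8.1]
y = [2.1, 3.9, 6.2, 8.0, 9.9, 12.1, 14.2, 15.8]

r, p = stats.pearsonr(x, y)
print(f"r = {r:.3f}, p = {p:.4f}")   # strength, direction, and significance
print(f"r^2 = {r**2:.3f}")           # proportion of variance the variables share
```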

Simple (Bivariate) and Multiple (Multivariate) Regression
Regression is used to model, calculate, and predict the pattern of a linear relationship among two or more variables. There are two types of regression -- simple and multiple.
a. Assumptions:
- Variables should be approximately normally distributed; if not, recode and use non-parametric measures.
- Dependent variable: at least interval level (ordinal can be used if it is a summated scale).
- Independent variables: should be interval level; nominal variables can be used if they are binary 'dummy' variables (0, 1).
- Independent variables should not be strongly related to one another (see multicollinearity below).

Regression (cont.)
b. Tests:
- Overall: the null hypothesis is that the regression (estimated) line predicts the dependent variable no better than the mean line.
- Coefficients (slope "b", etc.): the null hypothesis is that the estimated coefficient equals 0.
c. Statistics:
- Overall: R-squared, F-test
- Coefficients: t-tests
d. Limitations:
- only addresses linear patterns
- multicollinearity may be a problem
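A hedged sketch of a multiple regression in Python using statsmodels; the data frame, column names, and the 0/1 'female' dummy are all invented for illustration:

```python
# Sketch: multiple OLS regression with statsmodels.
# The data and column names are hypothetical; "female" is a 0/1 dummy variable.
import pandas as pd
import statsmodels.api as sm

df = pd.DataFrame({
    "income":    [31, 45, 28, 52, 39, 47, 33, 41, 55, 36],
    "education": [12, 16, 10, 18, 14, 16, 12, 15, 19, 13],
    "female":    [0, 1, 0, 1, 1, 0, 0, 1, 1, 0],
})

X = sm.add_constant(df[["education", "female"]])  # add the intercept term
model = sm.OLS(df["income"], X).fit()

# summary() reports R-squared and the overall F-test (the "mean line" null),
# plus a t-test for each coefficient (null: the coefficient equals 0)
print(model.summary())
```

Multicollinearity among the predictors can be screened with variance inflation factors (statsmodels.stats.outliers_influence.variance_inflation_factor).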

Useful Sources
Agresti and Finlay, Statistical Methods for the Social Sciences
Tabachnick and Fidell (2001), Using Multivariate Statistics
For writing up the above statistics: "From Numbers to Words"
Links at the bottom of the Soc302 webpage