1 Chapter 12 Quantitative Data Analysis: Hypothesis Testing © 2009 John Wiley & Sons Ltd.

Type I Errors, Type II Errors and Statistical Power
• Type I error (α): the probability of rejecting the null hypothesis when it is actually true.
• Type II error (β): the probability of failing to reject the null hypothesis given that the alternative hypothesis is actually true.
• Statistical power (1 − β): the probability of correctly rejecting the null hypothesis.
2 © 2009 John Wiley & Sons Ltd.
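[Editorial note, not from the slides: a quick way to see how α, effect size and sample size drive statistical power is a small Python sketch with statsmodels; the effect size, group size and alpha below are arbitrary assumptions for illustration.]

    from statsmodels.stats.power import TTestIndPower

    # Hypothetical inputs: medium effect (Cohen's d = 0.5),
    # 50 cases per group, two-sided alpha = 0.05.
    power = TTestIndPower().power(effect_size=0.5, nobs1=50, alpha=0.05)
    print(f"Statistical power (1 - beta): {power:.3f}")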

Choosing the Appropriate Statistical Technique 3 © 2009 John Wiley & Sons Ltd.

Testing Hypotheses on a Single Mean
• One sample t-test: statistical technique that is used to test the hypothesis that the mean of the population from which a sample is drawn is equal to a comparison standard.
4 © 2009 John Wiley & Sons Ltd.
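[Editorial note: the slides run this test in SPSS; a minimal Python sketch with scipy is shown below. The sample values and the comparison standard of 3.5 are invented for illustration.]

    import numpy as np
    from scipy import stats

    # Hypothetical ratings sampled from the population of interest.
    sample = np.array([3.2, 3.8, 4.1, 3.5, 3.9, 4.4, 3.1, 3.7])

    # H0: the population mean equals the comparison standard of 3.5.
    t_stat, p_value = stats.ttest_1samp(sample, popmean=3.5)
    print(f"t = {t_stat:.3f}, p = {p_value:.3f}")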

Testing Hypotheses about Two Related Means
• Paired samples t-test: examines differences in the same group before and after a treatment.
• The Wilcoxon signed-rank test: a non-parametric test for examining significant differences between two related samples or repeated measurements on a single sample. Used as an alternative to the paired samples t-test when the population cannot be assumed to be normally distributed.
5 © 2009 John Wiley & Sons Ltd.
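[Editorial note: both tests named on this slide can be sketched in Python with scipy; the before/after scores below are invented for illustration.]

    import numpy as np
    from scipy import stats

    # Hypothetical scores for the same group before and after a treatment.
    before = np.array([12, 15, 11, 14, 13, 16, 12, 15])
    after  = np.array([14, 17, 12, 15, 15, 18, 13, 16])

    # Paired samples t-test (assumes the differences are roughly normal).
    t_stat, p_t = stats.ttest_rel(before, after)

    # Wilcoxon signed-rank test: non-parametric alternative.
    w_stat, p_w = stats.wilcoxon(before, after)

    print(f"paired t: t = {t_stat:.3f}, p = {p_t:.3f}")
    print(f"Wilcoxon: W = {w_stat:.3f}, p = {p_w:.3f}")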

Testing Hypotheses about Two Related Means - 2
• McNemar's test: non-parametric method used on nominal data. It assesses the significance of the difference between two dependent samples when the variable of interest is dichotomous. It is used primarily in before-after studies to test for an experimental effect.
6 © 2009 John Wiley & Sons Ltd.
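[Editorial note: McNemar's test on a 2×2 table of paired dichotomous outcomes can be sketched with statsmodels; the before/after counts below are hypothetical.]

    from statsmodels.stats.contingency_tables import mcnemar

    # Hypothetical before/after cross-tabulation of a yes/no outcome:
    # rows = before (yes, no), columns = after (yes, no).
    table = [[30, 10],
             [25, 35]]

    result = mcnemar(table, exact=True)  # exact binomial version
    print(f"statistic = {result.statistic}, p = {result.pvalue:.3f}")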

Testing Hypotheses about Two Unrelated Means
• Independent samples t-test: used to see whether there are significant differences between the means of two groups on the variable of interest.
7 © 2009 John Wiley & Sons Ltd.
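[Editorial note: a sketch of the independent samples t-test with scipy; the group data are invented, and equal variances are assumed (set equal_var=False for Welch's version).]

    import numpy as np
    from scipy import stats

    # Hypothetical scores for two unrelated groups.
    group_a = np.array([5.1, 4.8, 5.6, 5.0, 4.9, 5.3])
    group_b = np.array([4.2, 4.5, 4.0, 4.7, 4.3, 4.4])

    t_stat, p_value = stats.ttest_ind(group_a, group_b, equal_var=True)
    print(f"t = {t_stat:.3f}, p = {p_value:.3f}")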

Testing Hypotheses about Several Means
• ANalysis Of VAriance (ANOVA) helps to examine the significant mean differences among more than two groups on an interval or ratio-scaled dependent variable.
8 © 2009 John Wiley & Sons Ltd.
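[Editorial note: a one-way ANOVA across three groups, sketched with scipy; the data are made up for illustration.]

    import numpy as np
    from scipy import stats

    # Hypothetical interval-scaled scores for three independent groups.
    low    = np.array([2.1, 2.5, 1.9, 2.3, 2.2])
    medium = np.array([3.0, 3.4, 2.8, 3.1, 3.3])
    high   = np.array([4.2, 3.9, 4.5, 4.1, 4.0])

    f_stat, p_value = stats.f_oneway(low, medium, high)
    print(f"F = {f_stat:.3f}, p = {p_value:.3f}")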

Regression Analysis
• Simple regression analysis is used in a situation where one metric independent variable is hypothesized to affect one metric dependent variable.
9 © 2009 John Wiley & Sons Ltd.

Scatter plot 10 © 2009 John Wiley & Sons Ltd.

Simple Linear Regression 11
[Figure: regression line of Y on X, with intercept β0 and slope β1]
Yi = β0 + β1 Xi + εi
© 2009 John Wiley & Sons Ltd.

Ordinary Least Squares Estimation 12
[Figure: observed value Yi, fitted value Ŷi and residual ei = Yi − Ŷi at Xi; OLS chooses the line that minimizes the sum of the squared residuals]
© 2009 John Wiley & Sons Ltd.
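[Editorial note: the same least squares fit can be reproduced outside SPSS; a minimal sketch with statsmodels, using made-up data and the slides' illustrative variables (physical attractiveness predicting likelihood to date).]

    import numpy as np
    import statsmodels.api as sm

    # Hypothetical data: X = physical attractiveness, y = likelihood to date.
    X = np.array([2, 3, 4, 5, 6, 7, 8], dtype=float)
    y = np.array([1.8, 2.9, 3.2, 4.4, 4.9, 6.1, 6.8])

    # Add the intercept (b0) and estimate b0, b1 by least squares.
    model = sm.OLS(y, sm.add_constant(X)).fit()
    print(model.params)     # b0, b1
    print(model.summary())  # output comparable to the SPSS tables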

SPSS: Analyze → Regression → Linear 13 © 2009 John Wiley & Sons Ltd.

SPSS cont’d 14 © 2009 John Wiley & Sons Ltd.

Model validation
1. Face validity: signs and magnitudes make sense
2. Statistical validity:
   – Model fit: R²
   – Model significance: F-test
   – Parameter significance: t-test
   – Strength of effects: beta coefficients
   – Discussion of multicollinearity: correlation matrix
3. Predictive validity: how well the model predicts
   – Out-of-sample forecast errors
15 © 2009 John Wiley & Sons Ltd.
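[Editorial note: each statistical-validity check in the list can be read off a fitted model; a sketch with statsmodels and pandas, assuming a data frame with the hypothetical columns attract, intellig and date_lik.]

    import pandas as pd
    import statsmodels.formula.api as smf

    # Hypothetical data for illustration only.
    df = pd.DataFrame({
        "attract":  [2, 3, 4, 5, 6, 7, 8],
        "intellig": [4, 3, 5, 6, 5, 7, 8],
        "date_lik": [1.8, 2.9, 3.2, 4.4, 4.9, 6.1, 6.8],
    })

    fit = smf.ols("date_lik ~ attract + intellig", data=df).fit()

    print("R-squared:", fit.rsquared)          # model fit
    print("F, p:", fit.fvalue, fit.f_pvalue)   # model significance
    print(fit.tvalues, fit.pvalues)            # parameter significance
    # Unstandardized coefficients; standardize the variables first
    # (z-scores) to read them as beta coefficients.
    print(fit.params)
    print(df[["attract", "intellig"]].corr())  # multicollinearity check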

SPSS 16 © 2009 John Wiley & Sons Ltd.

Measure of Overall Fit: R²
• R² measures the proportion of the variation in y that is explained by the variation in x.
• R² = (total variation − unexplained variation) / total variation
• R² takes on any value between zero and one:
   – R² = 1: perfect match between the line and the data points.
   – R² = 0: there is no linear relationship between x and y.
17 © 2009 John Wiley & Sons Ltd.
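[Editorial note: the slide's formula can be verified numerically from the two sums of squares; a tiny sketch with numpy, data invented.]

    import numpy as np

    x = np.array([2, 3, 4, 5, 6, 7, 8], dtype=float)
    y = np.array([1.8, 2.9, 3.2, 4.4, 4.9, 6.1, 6.8])

    # Least-squares line and its fitted values.
    b1, b0 = np.polyfit(x, y, deg=1)
    y_hat = b0 + b1 * x

    total_variation       = np.sum((y - y.mean()) ** 2)
    unexplained_variation = np.sum((y - y_hat) ** 2)

    r_squared = (total_variation - unexplained_variation) / total_variation
    print(f"R^2 = {r_squared:.3f}")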

SPSS 18
[SPSS output: R = r(Likelihood to Date, Physical Attractiveness)]
© 2009 John Wiley & Sons Ltd.

Model Significance
• H0: β0 = β1 = ... = βm = 0 (all parameters are zero)
  H1: not H0
19 © 2009 John Wiley & Sons Ltd.

Model Significance
• H0: β0 = β1 = ... = βm = 0 (all parameters are zero)
  H1: not H0
• Test statistic (k = # of variables excl. intercept):
  F = (SSReg / k) / (SSe / (n − 1 − k)) ~ F(k, n − 1 − k)
  SSReg = explained variation by regression
  SSe = unexplained variation by regression
20 © 2009 John Wiley & Sons Ltd.
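[Editorial note: the F statistic and its p-value can be computed directly from these quantities; a sketch with scipy, using made-up sums of squares.]

    from scipy import stats

    # Hypothetical quantities: explained and unexplained variation,
    # n observations and k predictors (excluding the intercept).
    ss_reg, ss_e = 120.0, 30.0
    n, k = 50, 2

    F = (ss_reg / k) / (ss_e / (n - 1 - k))
    p = stats.f.sf(F, k, n - 1 - k)   # upper-tail probability
    print(f"F = {F:.2f}, p = {p:.4f}")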

SPSS 21 © 2009 John Wiley & Sons Ltd.

Parameter significance
• Testing that a specific parameter is significant (i.e., βj ≠ 0)
• H0: βj = 0
  H1: βj ≠ 0
• Test statistic: t = bj / SEj ~ t(n − k − 1), with
  bj = the estimated coefficient for βj
  SEj = the standard error of bj
22 © 2009 John Wiley & Sons Ltd.
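[Editorial note: the same test can be reproduced from a coefficient and its standard error; a sketch with scipy, numbers invented.]

    from scipy import stats

    # Hypothetical estimate, standard error, sample size and predictors.
    b_j, se_j = 0.45, 0.18
    n, k = 50, 2

    t = b_j / se_j
    p = 2 * stats.t.sf(abs(t), df=n - k - 1)   # two-sided p-value
    print(f"t = {t:.2f}, p = {p:.4f}")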

SPSS cont’d 23 © 2009 John Wiley & Sons Ltd.

Conceptual Model 24
[Diagram: Physical Attractiveness → (+) Likelihood to Date]
© 2009 John Wiley & Sons Ltd.

Multiple Regression Analysis
• We use more than one (metric or non-metric) independent variable to explain variance in a (metric) dependent variable.
25 © 2009 John Wiley & Sons Ltd.

Conceptual Model 26
[Diagram: Physical Attractiveness → (+) Likelihood to Date; Perceived Intelligence → (+) Likelihood to Date]
© 2009 John Wiley & Sons Ltd.

© 2009 John Wiley & Sons Ltd. 27

Conceptual Model 28
[Diagram: Physical Attractiveness and Perceived Intelligence → Likelihood to Date, with Gender as a moderating variable]
© 2009 John Wiley & Sons Ltd.

Moderators
• A moderator is a qualitative (e.g., gender, race, class) or quantitative (e.g., level of reward) variable that affects the direction and/or strength of the relation between the dependent and independent variable.
• Analytical representation:
  Y = β0 + β1X1 + β2X2 + β3X1X2
  with Y = DV, X1 = IV, X2 = moderator
29 © 2009 John Wiley & Sons Ltd.
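[Editorial note: the analytical representation above corresponds to a regression with an interaction term; a minimal sketch with the statsmodels formula API, where the data frame and variable names (attract, gender, date_lik) are hypothetical.]

    import pandas as pd
    import statsmodels.formula.api as smf

    # Hypothetical data: date_lik (DV), attract (IV), gender (moderator, 0/1).
    df = pd.DataFrame({
        "date_lik": [2, 3, 4, 5, 3, 4, 6, 7],
        "attract":  [1, 2, 3, 4, 1, 2, 3, 4],
        "gender":   [0, 0, 0, 0, 1, 1, 1, 1],
    })

    # 'attract * gender' expands to attract + gender + attract:gender,
    # i.e. Y = b0 + b1*X1 + b2*X2 + b3*X1*X2.
    fit = smf.ols("date_lik ~ attract * gender", data=df).fit()
    print(fit.params)
    print(fit.pvalues["attract:gender"])  # significance of the interaction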

Moderators
© 2009 John Wiley & Sons Ltd.

Moderators
[SPSS output: the interaction has a significant effect on the dep. var.]
© 2009 John Wiley & Sons Ltd.

Conceptual Model 32
[Diagram: Physical Attractiveness, Perceived Intelligence and Communality of Interests → Likelihood to Date, with Gender as a moderator and Perceived Fit as a mediating variable]
© 2009 John Wiley & Sons Ltd.

Mediating/intervening variable
• Accounts for the relation between the independent and dependent variable
• Analytical representation:
  1. Y = β0 + β1X  => β1 is significant
  2. M = β2 + β3X  => β3 is significant
  3. Y = β4 + β5X + β6M  => β5 is not significant, β6 is significant
  with Y = DV, X = IV, M = mediator
33 © 2009 John Wiley & Sons Ltd.
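[Editorial note: the three regressions of this mediation check can be sketched with statsmodels; the data frame and column names (attract as IV, perc_fit as mediator, date_lik as DV) are hypothetical.]

    import pandas as pd
    import statsmodels.formula.api as smf

    # Hypothetical data: X = attract, M = perc_fit, Y = date_lik.
    df = pd.DataFrame({
        "attract":  [1, 2, 3, 4, 5, 6, 7, 8],
        "perc_fit": [1.2, 2.1, 2.9, 4.2, 4.8, 6.1, 6.9, 8.2],
        "date_lik": [1.0, 2.3, 3.1, 4.0, 5.2, 5.9, 7.1, 7.8],
    })

    step1 = smf.ols("date_lik ~ attract", data=df).fit()             # is b1 significant?
    step2 = smf.ols("perc_fit ~ attract", data=df).fit()             # is b3 significant?
    step3 = smf.ols("date_lik ~ attract + perc_fit", data=df).fit()  # b5 vs b6

    for name, fit in [("step 1", step1), ("step 2", step2), ("step 3", step3)]:
        print(name, fit.pvalues.round(3).to_dict())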

Step 1 34 © 2009 John Wiley & Sons Ltd.

Step 1 cont'd 35
[SPSS output: significant effect on dep. var.]
© 2009 John Wiley & Sons Ltd.

Step 2 36 © 2009 John Wiley & Sons Ltd.

Step 2 cont'd 37
[SPSS output: significant effect on mediator]
© 2009 John Wiley & Sons Ltd.

Step 3 38 © 2009 John Wiley & Sons Ltd.

Step 3 cont'd 39
[SPSS output: significant effect of mediator on dep. var.; insignificant effect of indep. var. on dep. var.]
© 2009 John Wiley & Sons Ltd.