Significance Testing 10/15/2013
Readings
– Chapter 3: Proposing Explanations, Framing Hypotheses, and Making Comparisons (Pollock, pp. 58-76)
– Chapter 5: Making Controlled Comparisons (Pollock)
– Chapter 4: Making Comparisons (Pollock Workbook)
Homework and Exams
– Homework 3 due today
– Exam 2 on Thursday
OPPORTUNITIES TO DISCUSS COURSE CONTENT
Office Hours for the Week
– Wednesday 10-12
– Thursday 8-12
– And by appointment
Course Learning Objectives
1. Students will learn the basics of research design and be able to critically analyze the advantages and disadvantages of different types of design.
2. Students will achieve competency in conducting statistical data analysis using the SPSS software program.
Bivariate Data Analysis: Cross-Tabulations and Compare Means
Running a Test
Select and open a dataset in SPSS, then run either:
– A cross-tab with column percentages (two categorical variables)
– A compare means test (one categorical and one continuous variable)
What Are Cross-Tabs?
A simple and effective way to measure the relationship between two variables. They are also called contingency tables, because they let us see whether the value of one variable is "contingent" upon the value of another.
When to Use Compare Means
A way to compare a ratio or interval variable across the categories of a nominal or ordinal variable:
– One ordinal variable vs. a ratio or interval variable
– One nominal variable vs. a ratio or interval variable
The output shows the mean of the continuous variable within each category.
Running Cross-Tabs
Select Analyze – Descriptive Statistics – Crosstabs
Running Cross-Tabs
– The dependent variable usually goes in the rows.
– The independent variable usually goes in the columns.
– We have to work with the measures that are available.
Let's Add Some Percentages
Click Cells, then under Cell Display select column percentages.
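The slides above describe the SPSS menus; for readers working outside SPSS, here is a minimal sketch of the same cross-tab with column percentages in Python (pandas). The variable names (gun_law, region) and values are hypothetical placeholders, not from the course dataset.

```python
import pandas as pd

# Hypothetical data: two categorical variables.
df = pd.DataFrame({
    "region":  ["South", "South", "West", "West", "Northeast", "Northeast"],
    "gun_law": ["Permissive", "Permissive", "Restrictive", "Permissive",
                "Restrictive", "Restrictive"],
})

# Dependent variable in the rows, independent variable in the columns,
# with percentages computed within each column (normalize="columns").
table = pd.crosstab(df["gun_law"], df["region"], normalize="columns") * 100
print(table.round(1))
```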
In SPSS
Open States.SAV, then select Analyze – Compare Means – Means
Where the Stuff Goes
– Your categorical variable goes in the Independent List.
– Your continuous variable goes in the Dependent List.
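Outside SPSS, a rough equivalent of Compare Means is a grouped mean. The sketch below assumes a hypothetical data frame with a categorical column (region) and a continuous column (turnout); it is not the States.SAV workflow itself, just the same idea in Python.

```python
import pandas as pd

# Hypothetical state-level data: one categorical and one continuous variable.
states = pd.DataFrame({
    "region":  ["South", "South", "West", "West", "Northeast"],
    "turnout": [48.2, 51.0, 60.5, 58.3, 62.1],  # invented values
})

# The categorical variable defines the groups; the continuous variable is averaged.
print(states.groupby("region")["turnout"].mean())
```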
Hypothesis Testing
Why Hypothesis Testing?
– To determine whether a relationship exists between two variables and did not arise by chance (statistical significance).
– To measure the strength of the relationship between an independent and a dependent variable (association).
What Is Statistical Significance?
The ability to say that an observed relationship is not happening by chance.
– It is not causality.
– It does not mean the finding is important or that it has any real-world application (beware of large samples).
– Practical significance is often more important.
Determining Statistical Significance
Establishing parameters, or "confidence intervals": are we confident that our relationship is not happening by chance? We want to be rigorous, so we usually use the 95% confidence level (does anyone remember why?).
How Do We Establish Confidence?
By setting a "p-value" threshold, also called the alpha level: the amount of error we are willing to accept and still say a relationship exists.
P-Values and Alpha Levels
– p < .05 (95% confidence level): less than a 5% chance that we are wrong.
– p < .01 (99% confidence level): a 1% chance of being wrong.
– p < .001 (99.9% confidence level): a 1 in 1,000 chance of being wrong.
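To make the decision rule concrete, here is a hedged sketch in Python (scipy) that computes a p-value for a chi-square test of independence on a made-up 2x2 table and compares it to the conventional .05 alpha level. The counts are invented for illustration.

```python
from scipy.stats import chi2_contingency

# Hypothetical contingency table: rows = outcome categories, columns = groups.
observed = [[30, 10],
            [20, 40]]

chi2, p_value, dof, expected = chi2_contingency(observed)

alpha = 0.05  # the conventional 95% confidence level
print(f"p = {p_value:.4f}")
if p_value < alpha:
    print("Less than a 5% chance of being wrong: call the relationship significant.")
else:
    print("Cannot rule out chance at the 95% level.")
```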
Problems with the Alpha Level (p-Value)
– Setting it too high (e.g., .10)
– Setting it too low (e.g., .001)
We have to keep our concepts and our units of analysis in mind.
You should always use the 95% confidence level (p < .05) unless there is a good reason not to.
STATING HYPOTHESES
Testing a Hypothesis
Before we can test a hypothesis, we have to state it:
– The null hypothesis: there is no relationship between my independent and dependent variable.
– The alternate hypothesis: the relationship we are testing for.
In significance testing, we are trying to show that the null hypothesis is false.
About the Null
The Alternate Hypothesis
– Also called the research hypothesis
– State it clearly
– State an expected direction
After testing, the null is either:
– True: there is no relationship between the groups, the alternate hypothesis is false, and nothing is going on (except by chance)!
– False: there is a relationship, the alternate hypothesis is correct, and something is going on (statistically)!
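A small illustration of the reject/fail-to-reject decision, using an independent-samples t-test on two hypothetical groups in Python (scipy). This is a sketch of the logic only, not a result from the course data.

```python
from scipy.stats import ttest_ind

# Hypothetical scores for two groups.
group_a = [52, 55, 60, 58, 54, 57]
group_b = [61, 63, 59, 66, 64, 62]

t_stat, p_value = ttest_ind(group_a, group_b)

if p_value < 0.05:
    print(f"p = {p_value:.3f}: reject the null; something is going on (statistically).")
else:
    print(f"p = {p_value:.3f}: fail to reject the null; nothing but chance.")
```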
It seems pretty obvious whether or not you have a statistically significant relationship, but we can often goof things up.
DECISION TYPES AND ERRORS
Keep or Reject the Null?
Errors and Decisions
A Type I Error
The incorrect or mistaken rejection of a true null hypothesis (a false alarm).
A Type II Error
Accepting the null hypothesis when it should have been rejected (denial).
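One way to see why a Type I error is a "false alarm" is to simulate a true null hypothesis: draw both groups from the same population and count how often a test at alpha = .05 rejects anyway. The sketch below (Python, numpy and scipy) is illustrative only; roughly 5% of the trials should reject purely by chance.

```python
import numpy as np
from scipy.stats import ttest_ind

rng = np.random.default_rng(0)
alpha, trials, false_alarms = 0.05, 2000, 0

for _ in range(trials):
    a = rng.normal(0, 1, 30)  # both groups come from the same population,
    b = rng.normal(0, 1, 30)  # so the null hypothesis is true by construction
    if ttest_ind(a, b).pvalue < alpha:
        false_alarms += 1     # rejecting a true null: a Type I error

print(f"Type I error rate: {false_alarms / trials:.3f} (alpha = {alpha})")
```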
Type I and II (Climate Change)
You do not want to make either error