Group Experimental Design

Validity

Validity is the degree to which the data are both meaningful and adequately represent the construct under study.
- Content validity is the extent to which the measured variables represent all the possible measures of the content or skills.
- In surveys, do the questions represent all the possible questions that could be asked?
- With respect to performance, does the particular measure (e.g., attendance or computation) represent all possible components of the construct (e.g., parent participation or mathematics)?

Relationship of Reliability to Validity

You cannot have validity without reliability.
- A poorly worded survey question cannot be valid because it is not reliable: it does not consistently assess participants' attitudes, since different participants will interpret it differently.
- A well-worded question can be reliable yet still not valid: it may be unrelated, or only tangentially related, to the construct under study.
- Similarly, poorly worded behavioral definitions will not yield reliable data.

You can have reliability without validity.
- Well-written definitions will yield reliable data.
- But such data still may not meaningfully or completely represent the target construct; for example, attendance alone does not adequately represent parent participation.

Table 6.4, p. 172

Form of validity             What is it for?   How do we use it?   What else do I need to know?
Content validity
Criterion-related validity
Construct validity

Unit of Analysis

The unit of analysis is the level (student, family, school, community, organization) at which the data need to be collected and analyzed to answer the questions asked. We must consider not only the questions themselves, but also the moderating and mediating variables implicit in them.

Sampling

- The sample should be representative of a given population.
- There needs to be a defined target population.

Probabilistic vs. Non-probabilistic Sampling

Quantitative sampling:
- Probability sampling
  - Simple random
  - Stratified
  - Multi-stage cluster
- Non-probability sampling
  - Convenience
  - Snowball
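A minimal sketch, assuming a hypothetical pandas DataFrame of students with a school column as the stratum, of how simple random and stratified sampling differ in practice:

```python
import numpy as np
import pandas as pd

# Hypothetical population: 1,000 students spread unevenly across three schools.
rng = np.random.default_rng(42)
students = pd.DataFrame({
    "student_id": range(1000),
    "school": rng.choice(["A", "B", "C"], size=1000, p=[0.5, 0.3, 0.2]),
})

# Simple random sample: every student has an equal chance of selection.
srs = students.sample(n=100, random_state=42)

# Stratified sample: draw 10% from each school so the sample mirrors
# the population's school composition.
stratified = (
    students.groupby("school", group_keys=False)
    .apply(lambda g: g.sample(frac=0.10, random_state=42))
)

print(srs["school"].value_counts(normalize=True))
print(stratified["school"].value_counts(normalize=True))
```

Stratified sampling guarantees each school is represented in proportion to its size; a simple random sample only does so on average.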

Experimental Design

The purpose of an intervention (treatment or experimental) design is:
- To determine the effect of the intervention (the independent or manipulated variables) on the outcomes (the dependent variables)
- To show causality: A caused X to occur

Experimental Design

- To control the effects of all other potentially influencing variables, except for the particular intervention
- To show that it was the intervention alone that caused the outcome
- Potential extraneous ("other") influencing variables:
  - The participants
  - The treatments
  - The procedures
- To reduce threats to internal validity
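A minimal sketch of random assignment, the mechanism an experimental design relies on to spread extraneous participant differences evenly across conditions; the participant labels below are hypothetical:

```python
import random

# Hypothetical participant labels.
participants = [f"P{i:02d}" for i in range(1, 21)]

random.seed(7)                  # reproducible shuffle for the example
random.shuffle(participants)    # randomize the order of participants

half = len(participants) // 2
treatment_group = participants[:half]   # receive the intervention
control_group = participants[half:]     # do not receive the intervention

print("Treatment:", treatment_group)
print("Control:  ", control_group)
```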

What Is Internal Validity?

Threats to internal validity: for each threat, ask what it is, what you could do about it, and what else you need to know.

Participant threats:
- History
- Maturation
- Regression
- Selection
- Mortality
- Interactions with selection

Treatment threats:
- Diffusion of treatments
- Compensatory equalization
- Compensatory rivalry
- Resentful demoralization

Procedure threats:
- Testing
- Instrumentation

External Validity: Ensuring That Findings Apply to Other Situations

- Interaction of selection and treatment
- Interaction of setting and treatment
- Interaction of history and treatment

What is common across these designs? What is not? What are the challenges with each?
- True experimental
- Post-test-only experimental
- Quasi-experimental, pre-post
- Quasi-experimental, post-test only
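One way to see the difference between post-test-only and pre-post designs is in how the data are analyzed. The sketch below uses simulated scores (illustration only): a post-test-only design compares post-test scores directly, while a pre-post design can compare gain scores, which accounts for where each group started.

```python
import numpy as np
from scipy import stats

# Simulated scores for illustration only.
rng = np.random.default_rng(0)
pre_t = rng.normal(50, 10, 30)            # treatment group pretest
pre_c = rng.normal(50, 10, 30)            # control group pretest
post_t = pre_t + rng.normal(8, 5, 30)     # treatment group gains ~8 points
post_c = pre_c + rng.normal(2, 5, 30)     # control group gains ~2 points

# Post-test-only design: independent-samples t test on post scores.
t_post, p_post = stats.ttest_ind(post_t, post_c)

# Pre-post design: independent-samples t test on gain scores (post - pre).
t_gain, p_gain = stats.ttest_ind(post_t - pre_t, post_c - pre_c)

print(f"Post-test only: t = {t_post:.2f}, p = {p_post:.3f}")
print(f"Gain scores:    t = {t_gain:.2f}, p = {p_gain:.3f}")
```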

Between Groups Design

Factorial Design (2 x 3)

- Example factors: Technology (PPT vs. Clicker) crossed with Risk Level (Typical, At-risk, Special Ed); Instruction type is another possible factor.
- Designs can have multiple levels, e.g., a 3 x 3 x 4 design; we often call these grouping variables in data sets.

Main Effects vs. Interaction Effects

- Main effect: the overall influence of one independent variable.
- Interaction effect: the influence of one independent variable depends on (covaries with) the level of the other independent variable.
- Look at p. 318: what does it mean to have an interaction effect?
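A minimal sketch of a 2 x 3 factorial analysis using statsmodels, showing where the two main effects and the interaction appear in the ANOVA table; the factor names (tech, risk) and the simulated scores are illustrative, not taken from the text:

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf
from statsmodels.stats.anova import anova_lm

# Simulated 2 x 3 data set: Technology (PPT, Clicker) x Risk Level.
rng = np.random.default_rng(1)
tech = np.repeat(["PPT", "Clicker"], 90)
risk = np.tile(np.repeat(["Typical", "At-risk", "SpecialEd"], 30), 2)
score = (
    60
    + np.where(tech == "Clicker", 5, 0)                          # main effect of tech
    + np.where(risk == "At-risk", -8, 0)                         # main effect of risk
    + np.where((tech == "Clicker") & (risk == "At-risk"), 6, 0)  # interaction
    + rng.normal(0, 5, 180)
)
data = pd.DataFrame({"tech": tech, "risk": risk, "score": score})

# C(tech) and C(risk) rows are the main effects; C(tech):C(risk) is the interaction.
model = smf.ols("score ~ C(tech) * C(risk)", data=data).fit()
print(anova_lm(model, typ=2))
```

A significant C(tech):C(risk) row would mean the effect of the technology condition differs across risk levels, i.e., an interaction effect.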

Repeated Measures Design

The same participants are measured at multiple time points: Time 1, Time 2, Time 3.
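A minimal sketch of how repeated-measures data at Time 1-3 might be arranged in long format and analyzed with statsmodels' AnovaRM; the subjects and scores are simulated for illustration:

```python
import numpy as np
import pandas as pd
from statsmodels.stats.anova import AnovaRM

# Simulated long-format data: one row per subject per time point.
rng = np.random.default_rng(2)
rows = []
for subject in range(20):
    baseline = rng.normal(50, 8)
    for time, gain in zip(["Time1", "Time2", "Time3"], [0, 4, 7]):
        rows.append({"subject": subject,
                     "time": time,
                     "score": baseline + gain + rng.normal(0, 3)})
data = pd.DataFrame(rows)

# "time" is the within-subjects factor; each subject contributes three scores.
result = AnovaRM(data, depvar="score", subject="subject", within=["time"]).fit()
print(result)
```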