Issues in Evaluating Educational Research


Evaluation of Research Validity
Validity: Can the findings and conclusions be trusted?
Four steps to analyzing a research study:
1. What is the research question (or questions)?
2. Does the research design match the research question(s)?
3. How was the study conducted?
4. Are there rival explanations for the results?

Evaluation of Research Validity
Step 1: Find the research question(s).
Review: What are descriptive questions? What are experimental questions?

Evaluation of Research Validity
Step 2: Confirm that the design matches the question(s).
Two key features:
- Assignment of study participants
- Manipulation of an independent variable

Example 1
Question: Does increasing the amount of professional development teachers receive increase student achievement?
- What type of question is this: causal or descriptive?
- What type of research design is needed: experimental or descriptive? If descriptive, what type?
- Who are the participants?
- What are the independent and dependent variables?
(p. 43)

Example 1 continued
Study participants are randomly assigned to two or more comparison groups that receive different amounts of professional development. Is this a true experimental or quasi-experimental design? Why?
Study participants are assigned to professional development groups based on some existing characteristic, such as grade level or years of teaching experience. Is this a true experimental or quasi-experimental design?
(p. 43)
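A minimal sketch contrasting the two assignment strategies discussed above; the teacher names, grade levels, and group labels are invented for illustration, not taken from the presentation.

```python
import random

# Hypothetical teachers; names and grade levels are invented.
teachers = [
    {"name": "T01", "grade": 3}, {"name": "T02", "grade": 4},
    {"name": "T03", "grade": 3}, {"name": "T04", "grade": 5},
    {"name": "T05", "grade": 4}, {"name": "T06", "grade": 5},
]

# True experimental design: random assignment gives every teacher the same
# chance of landing in either professional-development condition.
random.shuffle(teachers)
half = len(teachers) // 2
experimental_groups = {"more_pd": teachers[:half], "less_pd": teachers[half:]}

# Quasi-experimental design: assignment follows an existing characteristic
# (grade level), so the groups may differ systematically from the start.
quasi_groups = {"more_pd": [t for t in teachers if t["grade"] <= 4],
                "less_pd": [t for t in teachers if t["grade"] > 4]}

print(experimental_groups)
print(quasi_groups)
```

The difference matters because only the first strategy lets chance, rather than a pre-existing characteristic, determine group membership.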

Example 2
Question: Does teacher professional development have a positive association with student achievement?
- What type of question is this: causal or descriptive?
- What type of research design is needed: experimental or descriptive? If descriptive, what type?
- Who are the participants?
- What are the independent and dependent variables?
(p. 43)

Evaluation of Research Validity
Step 3: How was the study conducted?
There should be enough information that the study could be replicated; without this information it is difficult to judge validity.
Four key areas:
- Participants
- Treatment
- Data collection
- Data analysis

Participants
The report should describe:
- The number of participants
- The characteristics of participants – not only persons but schools and districts as well
Look for characteristics that might influence results:
- Student characteristics (e.g., grade level, gender, socioeconomic status)
- Teacher (classroom) characteristics (e.g., experience, grade level, class size)
- School characteristics (e.g., number of students, teachers, or paraprofessionals; location; grade levels)
- District characteristics (e.g., number of schools, number of students, location)

Participants continued
Participant selection:
- How were participants selected? Selection is often nonrandom in education research.
- When selection is nonrandom, conclusions can only be made about the sample of participants in the study.
- Without random assignment, selection bias is a concern. For example, if a researcher selects teachers to participate in one of two types of professional development based on school location, the results could be influenced by the schools' characteristics rather than by the professional development itself.

Treatment
The treatment is the program, policy, or practice being studied; in experimental research it is the independent variable.
Operational definitions must be provided.
- Example: professional development is a class in literacy instruction that teachers attend after school, two times per week for two hours.
Construct validity: the treatment is defined in a way that is a valid example or representation of the construct being studied.
- Example: a definition of professional development that included going out to lunch two times per week would not have construct validity.

Treatment continued
In addition to a valid definition, treatments must be implemented consistently.
Treatment fidelity should be reported:
- Was the treatment carried out as planned?
- Were there any events during the treatment that may have influenced the results (e.g., a conference on reading literacy held during a reading literacy professional development study)?

Data Collection
Two key factors in data collection validity:
- The data collection instrument
- The data collection procedures
Are the data collection instruments valid and reliable?
- Validity: the instrument measures what it is supposed to measure.
- Reliability: repeated measures in a short time produce similar results.
How an instrument is used also matters: a valid algebra test, for example, may not be a valid measure of a person's ability to teach algebra.
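One common way to check the reliability point above (repeated measures producing similar results) is a test-retest correlation. A minimal sketch with invented scores; the variable names and data are illustrative only.

```python
import numpy as np

# Invented scores from the same students on two administrations of the
# same instrument, given a short time apart.
first_administration = np.array([78, 85, 62, 91, 70, 88, 74, 80])
second_administration = np.array([80, 83, 65, 89, 72, 90, 71, 82])

# Test-retest reliability: the correlation between the two administrations.
# Values near 1 suggest the instrument gives consistent results.
reliability = np.corrcoef(first_administration, second_administration)[0, 1]
print(f"Test-retest reliability: {reliability:.2f}")
```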

Data Collection continued
Data collection procedures: how and when data are collected.
- Procedures used to collect data can influence results, so they must be carefully designed and described in the report. For example, giving survey respondents anonymity (or not) may affect how they respond.
- Day and time of data collection also matter: data collected the day before spring break may not produce valid results.
- In experimental research, comparison and control groups must use the same data collection procedures.

Data Analysis
First determine whether the data collected are quantitative or qualitative.
Review: What are quantitative data? Qualitative data?
Quantitative data analysis:
- Are statistics used to analyze the quantitative data?
- The statistics are analyzed and discussed in the research report.
Qualitative data analysis:
- Information is organized into categories.
- Categories are coded.
- Codes and coding procedures are explained.
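As a hedged illustration of the quantitative-analysis point, the sketch below compares posttest scores for a treatment group and a comparison group with an independent-samples t-test; the scores and group labels are invented, not from any study cited here.

```python
from scipy import stats

# Invented posttest scores for students whose teachers did or did not
# receive the professional development being studied.
treatment_scores = [72, 78, 81, 69, 85, 77, 80, 74]
comparison_scores = [70, 71, 76, 68, 74, 73, 75, 69]

# Independent-samples t-test: is the difference between group means larger
# than we would expect from chance alone?
t_stat, p_value = stats.ttest_ind(treatment_scores, comparison_scores)
print(f"t = {t_stat:.2f}, p = {p_value:.3f}")
```

A reader evaluating a report would check that the statistic used matches the design and that the results are actually discussed, not just tabulated.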

Evaluation of Research Validity
Step 4: Detect rival explanations for the results (threats to validity).
Conclusions are presented at the end of the report. When judging the conclusions, think about possible rival explanations for the results.
It is the researcher's job to rule out possible rival explanations and to explain how or why they do not apply to the study.

Rival explanations continued
Quantitative research: it is especially important to rule out rival explanations when a treatment appears to work.
Several common threats in quantitative studies of the effectiveness of an intervention in experimental research:
- Selection bias – how subjects are assigned to treatment or control groups
- Sample attrition – when more subjects leave the treatment group than the control group, or vice versa
- Treatment diffusion – when participants in different comparison groups operate in the same environment, so the treatment can spread between groups (p. 49)
- History effects – outside events that occur during long research studies
- Practice effects – repeated measurement of the same individuals (e.g., pre- and posttests)
- Regression toward the mean – extremely high or extremely low pretest scores tend to move toward the mean on later measures (see the simulation sketch below)
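Regression toward the mean, the last threat in the list above, is easy to see in a small simulation: students selected for extreme pretest scores score closer to the mean on a posttest even when no treatment occurs. The numbers below are simulated, not real study data.

```python
import numpy as np

rng = np.random.default_rng(0)

# Each observed score is a stable "true ability" plus random noise,
# so an extreme observed score is partly luck.
true_ability = rng.normal(70, 10, size=1000)
pretest = true_ability + rng.normal(0, 8, size=1000)
posttest = true_ability + rng.normal(0, 8, size=1000)

# Select only the students with extremely low pretest scores.
low_scorers = pretest < np.percentile(pretest, 10)

print(f"Low scorers, pretest mean:  {pretest[low_scorers].mean():.1f}")
print(f"Low scorers, posttest mean: {posttest[low_scorers].mean():.1f}")
# The posttest mean drifts back toward the overall mean even though no
# treatment occurred, which can masquerade as a treatment effect.
```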

Rival Explanations continued
Qualitative research: the main validity concern is the credibility of the results.
- Use of verification methods, such as triangulation (p. 136)
Researcher and participant effects:
- Researcher bias – the researcher's expectations influence the study; one safeguard is to use blind data collectors.
- Participant reactivity – participants react a certain way based on the research context (p. 51); one safeguard is to design questions in a way that reduces reactivity.