Issues in Validity and Reliability Conducting Educational Research Chapter 4 Presented by: Vanessa Colón

Relevance for understanding Validity & Reliability: “My tip for any student wishing to engage in research is to pay very specific and careful attention to validity and reliability because if you do not, your research is much less meaningful.” -Lora Lee Canter

A great study design has both validity and reliability. When a study is found to have both, it adds to our knowledge base and to our work, and it informs the decisions of educators.

Qualitative Research: Validity (sometimes), Trustworthiness (always), Reliability (always)
Quantitative Research: Validity (always), Reliability (always)

Vocabulary Terms
Validity: The degree to which the conclusions drawn by the researcher come from the study results and are not by chance.
Trustworthiness: How a researcher convinces the reader that the findings are credible, appropriate, and fully developed.
Reliability: The degree to which a study can be repeated with similar results.

Validity
Three types of validity:
Internal: Did this study truly indicate that A, and not some other variable, caused (or did not cause) B?
External: How do the conclusions of this study apply elsewhere?
Construct: How does the researcher define and measure the construct of the study?

Internal Validity
Definition (p. 66): "the approximate validity with which we infer that a relationship between 2 variables is causal or absence of relationship implies absence of cause" (Cook & Campbell, 1979)
Internal Threats and Definitions:
History: Something occurs during the study that impacts the DV but is unrelated to the IV.
Maturation: The study occurs over a period of time in which participants get older; their incidental learning or experience affects the DV.
Measurement issues: The frequency and practice of assessment and/or the assessment devices used affect the DV.
Group differences: The experimental and control groups are not equivalent in terms of important variables or characteristics.
Alternative causes: Changes in the DV are the result of effects of the DV on the IV, or of rivalry between the control and treatment groups.

Validity
Three types of validity:
Internal: Did this study truly indicate that A, and not some other variable, caused (or did not cause) B?
External: How do the conclusions of this study apply elsewhere?
Construct: How does the researcher define and measure the construct of the study?

External Validity
Definition: "the extent to which an observed relationship among variables can be generalized beyond the conditions of the investigation to other populations, settings, and conclusions" (Rumrill & Cook, 2001)
External Threats and Definitions:
Sample characteristics: Affect the ability to generalize to other groups not included in the study; a sample that does not share the characteristics of the population (e.g., ethnicity, gender, SES, performance).
Treatment characteristics: How the experimenters and the intervention affect external validity (e.g., the treatment is so specific that it is hard to replicate).
Setting characteristics: All of the resources and situations used by the researcher to implement the intervention and collect data (e.g., computers, intervention program, lab setting).

Validity
Three types of validity:
Internal: Did this study truly indicate that A, and not some other variable, caused (or did not cause) B?
External: How do the conclusions of this study apply elsewhere?
Construct: How does the researcher define and measure the construct of the study?

Construct Validity
Definition: "the degree to which a researcher truly measures the construct of focus in the study" (Boudah, 2011)
Construct Threats and Definitions:
Inadequate description of the construct: The construct of focus is too general, and the components of the construct do not match the focus of the study (e.g., reading vs. fluency).
Inadequate measurement of the construct: Using a limited number of measures or methods to gather data about a construct (e.g., using only surveys, interviews, or questionnaires).
Inadequate attention to levels of the independent variable: Only one or two levels of a multilevel variable are implemented, or the tool used is not explained, which may have had an impact on the IV (e.g., a graphic organizer).

Reliability
The degree to which a study can be repeated with similar results.
Internal reliability: "The extent that data collection, analysis, and interpretations are consistent given the same conditions" (Wiersma, 2000).
External reliability: "The extent to which an independent researcher could replicate the study in other settings" (Boudah, 2011).

Internal Reliability
The reliability of the measure chosen for evaluating the DV impacts the reliability and validity of the study as a whole. Researchers can choose between the following:
a. Use a published and standardized measure
b. Create their own measure

Internal Reliability
Reliability has two components:
Reliability of an instrument
Reliability of observation

Reliability of an Instrument
Parallel forms: A person's score is similar when the person is given two forms of the same test.
Test-retest: The degree to which a person achieves a similar score on a test taken on two different days.
Split-half: The degree to which a person receives a similar score on one half of the test items compared to the other half.
Cronbach's alpha: A statistical formula used to determine reliability based on at least two parts of a test.
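To make Cronbach's alpha concrete, here is a minimal sketch (not from the chapter) of the standard formula, alpha = k/(k-1) * (1 - sum of item variances / variance of total scores), written in Python; the function name and sample data are hypothetical.

```python
import numpy as np

def cronbach_alpha(item_scores):
    """Cronbach's alpha for a respondents-by-items matrix of test scores."""
    items = np.asarray(item_scores, dtype=float)
    k = items.shape[1]                         # number of items
    item_vars = items.var(axis=0, ddof=1)      # variance of each item
    total_var = items.sum(axis=1).var(ddof=1)  # variance of respondents' total scores
    return (k / (k - 1)) * (1 - item_vars.sum() / total_var)

# Hypothetical data: 5 respondents answering a 4-item instrument on a 1-5 scale
scores = [[4, 5, 4, 5],
          [2, 3, 2, 3],
          [3, 3, 4, 3],
          [5, 4, 5, 5],
          [1, 2, 1, 2]]
print(round(cronbach_alpha(scores), 2))
```

Values at or around .8 or higher are usually read as acceptable internal consistency, which matches the reliability coefficient scale on the next slide.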

Reliability of an Instrument
The reliability of an instrument is measured by a reliability coefficient, which indicates the relationship between multiple administrations, multiple items, or other analyses of evaluation measures.
0 = no relationship; no reliability
.8 = adequate relationship; acceptable reliability
1 = perfect relationship; perfect reliability
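A common way to obtain such a coefficient for test-retest reliability is the Pearson correlation between two administrations of the same test. A small sketch with hypothetical scores (not from the chapter):

```python
import numpy as np

# Hypothetical scores for 6 students who took the same test on two different days
day_1 = [78, 85, 62, 90, 71, 88]
day_2 = [80, 83, 65, 92, 70, 85]

# The Pearson correlation between the two administrations serves as the
# test-retest reliability coefficient (0 = no reliability, 1 = perfect reliability)
coefficient = np.corrcoef(day_1, day_2)[0, 1]
print(round(coefficient, 2))  # a value of .8 or higher is generally treated as acceptable
```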

Internal Reliability
Reliability has two components:
Reliability of an instrument
Reliability of observation

Reliability of Observation
Researchers measure the reliability of observations by calculating the following:
Interobserver and interscorer agreement: The degree to which two independent observers or scorers record similar observational data about the same situation (e.g., TELPAS writing).
Intraobserver and intrascorer agreement: The degree to which an observer or scorer records similar data about the same observation or test on two different occasions (e.g., pre-test/post-test).
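The simplest agreement statistic is percent agreement, sketched below with hypothetical ratings; published studies often report chance-corrected indices such as Cohen's kappa instead.

```python
# Hypothetical item-by-item ratings from two independent observers
# (1 = behavior observed, 0 = behavior not observed)
observer_a = [1, 0, 1, 1, 0, 1, 1, 0, 1, 1]
observer_b = [1, 0, 1, 0, 0, 1, 1, 0, 1, 1]

# Percent agreement = number of matching ratings / total observations
agreements = sum(a == b for a, b in zip(observer_a, observer_b))
percent_agreement = agreements / len(observer_a)
print(f"{percent_agreement:.0%}")  # 90% here; 80% or higher is often treated as acceptable
```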

Reliability of Observation
A second consideration in internal reliability is treatment fidelity, or fidelity of implementation. Researchers must ensure that teachers who are implementing an intervention:
are trained in a similar manner
have similar backgrounds
are observed and provided feedback
are evaluated using a standard treatment fidelity checklist

External Reliability
"The extent to which an independent researcher could replicate the study in other settings" (Boudah, 2011).
Based on a thorough description of the following:
methods
complexity of the intervention
training
measurement
setting

Trustworthiness
How a researcher convinces the reader that the findings are credible, appropriate, and fully developed.
The credibility of qualitative inquiry depends on the following three elements:
1. Rigorous methods
2. The credibility of the researcher
3. A philosophical belief in the value of qualitative inquiry

Trustworthiness: Element 2, the credibility of the researcher
The researcher should make the following explicit in their study:
Professional training
Biases and experience in the situation
Understanding of the method of data analysis
How issues of entry and continued evaluation or observation were handled
The conceptual framework upon which the study was built

Trustworthiness: Element 3, a philosophical belief in the value of qualitative inquiry
A fundamental appreciation of naturalistic inquiry, qualitative methods, inductive analysis, purposeful sampling, and holistic thinking.
In qualitative research, it is important to know that you, as a researcher, have only one perspective on the situation; it is your job to design the study so that it includes the various perspectives of various participants.

Trustworthiness: Element 1, rigorous methods
Step 1: Truth value. How does a researcher establish that the description given is truthful?

Trustworthiness: Element 1, rigorous methods (continued)
Step 2: Building credibility for your research
Applicability: The researcher includes a detailed description of every aspect of the study.
Consistency: Other researchers could come to conclusions similar to those of the original researcher.
Confirmability: The researcher keeps detailed notes on each of these categories throughout the study.

Reflection:
Reread this chapter
Learn from the experiences of others
Continue to read more research studies