Ch 9 Internal and External Validity

Validity  The quality of the instruments used in the research study  Will the reader believe what they are readying  It’s not be the instrument that needs to be validated but how the instrument is to be used  Questions --- Is the data accurate? --- Is the data meaningful? --- Is the data creditable? --- Is the data useful? --- Is the data appropriate?

External validity  Can the researcher use what have been observed in the research to make global generalizations beyond the research itself?  Does the instrument measure what it is suppose to?  Is the research sample representative of the population?  Using evidence to support external validity  Does the study have sufficient controls to insure that the conclusions or inferences are truly warranted? 1. Face validity: looks like it measures a particular characteristic, concepts, or phenomena

External validity (continued)
2. Content validity: the instrument covers the expected outcomes. Are the results similar to what others familiar with the study's content might contend?
3. Criterion validity: the relationship of scores on an instrument to scores on other, similar instruments
4. Construct validity: the instrument measures characteristics, concepts, theories, or phenomena that cannot be directly observed but must be inferred, such as those identified in the review of literature
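Criterion validity (item 3 above) is usually quantified as the correlation between scores on the instrument and scores on an established, similar instrument. A minimal sketch of that check; the instrument names and scores here are invented for illustration:

```python
def pearson_r(xs, ys):
    """Pearson correlation between two paired score lists."""
    n = len(xs)
    mx = sum(xs) / n
    my = sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy)

# Hypothetical scores: a new scale vs. an established instrument,
# both administered to the same eight participants
new_scale = [12, 18, 25, 31, 40, 44, 52, 60]
established = [10, 20, 22, 35, 38, 47, 55, 58]

r = pearson_r(new_scale, established)
print(round(r, 2))  # a strong positive r is evidence of criterion validity
```

A high correlation does not by itself prove the new instrument is valid; it shows agreement with whatever the established instrument measures.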

Threats to Internal Validity  The extent to which the research design, instrument, and the data collected allow the researcher to draw accurate conclusions about the relationship within the data.  The research has not been affected by any interference with the research environment that would give the results of the research the appearance of something else other than what the research originally intended.  Threats to internal validity 1. Subject characteristics: too much or too little representation of a characteristic: gender, age, race, beliefs, attitudes, intelligence, etc.

Threats to Internal Validity (continued 1)
2. Mortality: participants dropping out of the study
3. Location: comfort, distance, resources; consistency is needed
4. Instrumentation: consistent use of the instrument
a. Instrument decay: the nature of the instrument causes a change in interpretation over time
b. Data collector characteristics: collectors may interpret things differently; consistency is needed

Threats to Internal Validity (continued 2)
c. Data collector bias: collectors may distort information; standardized procedures are needed, or collectors should be kept unaware of the study conditions
5. History: unexpected events occur: 9/11, the 921 earthquake, a death in the family, accidents, storms, etc.
6. Maturation: participants change simply through the passage of time, or too much time is taken to complete the instrument
7. Regression: extreme scorers tend to move back toward the mean on retesting, so a group selected for low scores (e.g., students given special-ed assistance) may improve regardless of any intervention

Threats to Internal Validity (continued 3)
8. Attitude: special treatment of participants, special favors, control vs. experimental group
9. Implementation: consistency of treatment among participants
10. Testing: differences between pre-test and post-test, practice effects
11. Sources of bias: the assessment, the administration of the assessment, the interpretation of the assessment

Threats to Internal Validity (continued 4)
a. Assessment bias: offensive questions; inaccurate and unfair penalties related to scores
b. Bias in administration: examiner, students, setting
c. Bias in interpretation: criterion-referenced, norm-referenced
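The regression threat (item 7) is easiest to see in a simulation: students selected precisely because of extreme low pretest scores improve on retest even when nothing is done, simply because measurement error does not repeat. A sketch with invented numbers (abilities, error sizes, and group sizes are all assumptions for illustration):

```python
import random

random.seed(42)

# True ability plus fresh measurement error on each testing occasion
abilities = [random.gauss(50, 10) for _ in range(1000)]
pretest = [a + random.gauss(0, 10) for a in abilities]
retest = [a + random.gauss(0, 10) for a in abilities]

# Select the 100 lowest pretest scorers (e.g., for remedial assistance)
cutoff = sorted(pretest)[100]
selected = [i for i, p in enumerate(pretest) if p < cutoff]

pre_mean = sum(pretest[i] for i in selected) / len(selected)
post_mean = sum(retest[i] for i in selected) / len(selected)

# With NO treatment at all, the selected group's retest mean
# regresses upward toward the overall mean
print(round(pre_mean, 1), round(post_mean, 1))
```

This is why an apparent gain in a group chosen for extreme scores cannot, on its own, be credited to the intervention.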

Ways to overcome threats to internal validity
 Prepare alternative strategies for the unexpected
 Be consistent in the treatment of and relationship with all participants
 Use an appropriate research design that helps the researcher gather, collect, and analyze the data in the most efficient manner
 Examining items for bias: panel review, empirical analysis. What is the preferred method to examine assessment bias?
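The "empirical analysis" option can be as simple as comparing per-item pass rates across groups and flagging items with unusually large gaps for panel review (a crude screen; formal studies use differential item functioning methods). All item names, pass rates, and the threshold below are invented:

```python
# Each row: item id, pass rate in group A, pass rate in group B (hypothetical)
item_stats = [
    ("item1", 0.80, 0.78),
    ("item2", 0.65, 0.63),
    ("item3", 0.72, 0.45),  # large gap -> candidate for panel review
    ("item4", 0.90, 0.88),
]

THRESHOLD = 0.15  # flag items whose pass-rate gap exceeds this

flagged = [item for item, a, b in item_stats if abs(a - b) > THRESHOLD]
print(flagged)  # items to send for panel review
```

A flagged item is not automatically biased; the gap is a prompt for human review of the item's wording and content.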

Triangulation  Compare three forms of information for similar results 1. Observation 2. Interviews 3. Data gathering 4. Focus groups

Is the test appropriate to the population?
 What is the composition of the test-taking population?
 To what extent can the assessment be administered without encumbrance to all members of the population?
 Is there a translated, adapted, or accommodated version of the test?
 Are there recommendations for alternative testing procedures?
 Has the planned accommodation been assessed in terms of its impact on the validity and reliability of test scores?
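Whether the test-taking sample matches the composition of the population can be checked with a chi-square goodness-of-fit test. A stdlib-only sketch; the counts and population proportions are invented, while 5.991 is the standard chi-square critical value for df = 2 at α = .05:

```python
# Hypothetical sample counts vs. known population proportions (3 groups)
observed = [120, 60, 20]              # sample counts by group
population_props = [0.55, 0.35, 0.10]

n = sum(observed)
expected = [p * n for p in population_props]  # 110, 70, 20

chi_sq = sum((o - e) ** 2 / e for o, e in zip(observed, expected))
CRITICAL = 5.991  # chi-square critical value, df = 2, alpha = .05

representative = chi_sq < CRITICAL
print(round(chi_sq, 2), representative)
```

If `chi_sq` exceeds the critical value, the sample's group composition differs from the population's more than chance would explain, which weakens claims to external validity.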

Conclusion  The more valid and reliable research instruments are the more likely one will draw the appropriate conclusions from the collected data and solve the research problem in a creditable fashion.