Elspeth Slayter, Ph.D. Assistant Professor, Salem State University Lecture notes on threats to validity, threats to trustworthiness and the optimization of rigor in social work research

- Administrative matters/check-in
- Tools for critical thinking in "research-y" language
- Qualitative work: threats to trustworthiness; optimization of rigor
- Quantitative work: threats to validity

Qualitative:
- Concerned with how people think and feel about the topics of concern to the research
- Gathers broader, more in-depth information from fewer respondents (micro-analysis)
- Open questions for greater depth and personal detail

Quantitative:
- Uses a structured survey instrument that asks all respondents the same questions in the same order to allow for statistical analysis
- Gathers a narrow amount of information from a large number of respondents (macro-analysis)
- Closed questions for quantification; can be coded and processed quickly

- Not the "opposite" of quantitative (QN) research – a different way to answer different questions
- Different underlying assumptions about how individual/group behavior is best studied
- Reflects a systematic approach to the conduct of research

- Rejects the "threats to validity" framing in favor of evaluative standards and strategies for rigor
- Flexibility of design – need for transparency
- As subjectivity is embraced – need to document
- Careful reporting on implementation of approaches
- Note-taking, auditing, memo-ing
- REFLEXIVITY – self-awareness in process

Threats to trustworthiness:
- Reactivity: distorting effects of the researcher's presence on respondent beliefs/behaviors
- Researcher biases: observations/interpretations clouded by the researcher's worldview
- Respondent biases: ranging from social desirability responses to withholding information

1. Member checking: empowerment of respondents; keeps you honest
2. Negative case analysis: testing and challenging interpretations to date
   - Fear management paper example
   - Searching for disconfirming evidence
3. Audit trail: replicability; accountable for interpretive decisions (see the logging sketch after this list)

4. Triangulation: theory, method, observer, data
5. Prolonged engagement: goal is saturation
6. Peer debriefing and support (PDS)
   - Keeping us honest in interpretation
   - "Emperor's new clothes" check
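One low-tech way to keep the audit trail mentioned in item 3 is to timestamp every interpretive decision as it is made. Below is a minimal sketch in Python; the file name, the entry fields, and the example entry are illustrative assumptions, not part of the lecture materials.

```python
# Minimal audit-trail sketch: append each interpretive decision to a log file
# so the analysis can be reconstructed and audited later. The file name and
# fields are illustrative assumptions, not a prescribed format.
import csv
from datetime import datetime

AUDIT_FILE = "audit_trail.csv"  # hypothetical log location

def log_decision(researcher, decision, rationale):
    """Record who made which interpretive decision, when, and why."""
    with open(AUDIT_FILE, "a", newline="") as f:
        writer = csv.writer(f)
        writer.writerow([datetime.now().isoformat(), researcher, decision, rationale])

# Example entry: collapsing two provisional codes into one theme
log_decision(
    researcher="ES",
    decision="Merged codes 'fear of relapse' and 'fear of judgment' into theme 'fear management'",
    rationale="Both codes describe coping with anticipated stigma; see accompanying memo.",
)
```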

- Interviews followed by memos
- Individual: listening to tapes, reading of transcripts, coding of memos and transcripts for themes (see the tallying sketch below)
- Team: comparison of each transcript, write-up of themes identified
- Member checking: review of transcripts and write-up
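To make the "coding of memos and transcripts for themes" and the team comparison step concrete, here is a minimal sketch that tallies how often each theme label appears in each transcript so team members can compare their coding side by side. The transcript IDs and theme labels are hypothetical placeholders, not data from any actual study.

```python
# Sketch of comparing theme codes across transcripts. Transcript IDs and
# theme labels below are hypothetical placeholders.
from collections import Counter

# Each analyst tags transcript segments with theme labels during coding.
coded_segments = {
    "transcript_01": ["stigma", "family support", "stigma", "services"],
    "transcript_02": ["services", "stigma", "self-advocacy"],
    "transcript_03": ["family support", "self-advocacy", "self-advocacy"],
}

# Tally themes per transcript so the team can compare interpretations.
for transcript_id, codes in coded_segments.items():
    print(transcript_id, dict(Counter(codes)))

# An overall tally shows which themes recur enough to write up.
overall = Counter(code for codes in coded_segments.values() for code in codes)
print("overall:", overall.most_common())
```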

1. How can this study be generalizable with a low n (sample size)?
2. Where are your hypotheses?
3. Why isn't this study more objective?
4. Will attrition be a problem with so few in the study?
5. Why can't you be more forthcoming about what your findings will look like in advance?
6. How is this different from journalism?

Tools for critical thinking, names

- Must specify type
- Refers to the accuracy and trustworthiness of:
  - instruments (surveys, variables)
  - data (and how it was gathered)
  - findings

1) The utility of the device or method that measures the issue at hand
2) The collective judgment of the research community that a concept, measure, and method are valid

- Are the results of the study applicable to settings other than the one in which the study was conducted?
- Can be addressed by:
  - using large population samples
  - using many control or comparison groups
  - replicating the study in a different setting or with a different population

Why?

- What are the challenges to external validity in your project?
- Who is your study generalizable to?

- Looking at how a concept was operationalized and deciding whether or not, "on the face of it," the measurement makes sense
- Generally based upon consensus by researchers
- Example: measuring the prevalence of child abuse in families through parent interviews

- Achieved when a measurement has the appropriate content for "getting at" all the issues present in a complex construct
- Think "inner mechanics"
- Examples: socioeconomic status, quality of life, client satisfaction

- Close fit between the construct being measured and the actual observations made
- Do your questions (and therefore variables) adequately "get at" the construct you are measuring?
- Examples: Does the IQ test measure intelligence? Does income alone measure SES? (See the sketch below.)
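One way to see the "income alone" question is to compare a single-indicator measure of SES with a simple composite. The sketch below is a deliberately oversimplified, hypothetical composite (made-up field names, scales, and equal weights); real SES indices are constructed and validated far more carefully.

```python
# Hypothetical illustration of the construct-validity point above:
# income alone vs. a simple composite SES score. Field names, scales,
# and equal weighting are assumptions for illustration only.

def ses_income_only(respondent):
    """Single-indicator measure: income alone."""
    return respondent["income"]

def ses_composite(respondent):
    """Toy composite: average of income, education, and occupational prestige,
    each rescaled to 0-1 so the parts contribute comparably."""
    income_part = min(respondent["income"] / 100_000, 1.0)       # capped at $100k
    education_part = respondent["education_years"] / 20           # 0-20 years of schooling
    occupation_part = respondent["occupation_prestige"] / 100     # 0-100 prestige scale
    return (income_part + education_part + occupation_part) / 3

respondent = {"income": 35_000, "education_years": 16, "occupation_prestige": 70}
print(ses_income_only(respondent))           # 35000 -> misses education and occupation entirely
print(round(ses_composite(respondent), 2))   # composite reflects more parts of the construct
```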

Content validity:
- Technical presence of all the theory-pieces of the measure in question
- Does it have all of the parts?
- Presence of the parts

Construct validity:
- Overall "vibe" about whether the measure really gets at what it intends to
- Do the parts make sense as a whole?
- Quality of the parts
- Representativeness

- Mostly an issue in longitudinal research
- Concerned with reducing errors in research design in several ways
- Must say which type of threat to internal validity

- History (something happens that affects the study's process)
- Maturation (something naturally happens along the way)
- Testing effect (people get used to the testing and questions)
- Measurement issues/instrumentation error (questions, inter-rater reliability)
- Regression to the mean (over time, people will score close to the average; see the simulation sketch below)
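Regression to the mean is easy to demonstrate with a quick simulation: select people with extreme scores on a noisy first measurement, and their retest average drifts back toward the group mean even with no intervention at all. A minimal sketch, with an arbitrary true mean, noise level, and selection cutoff:

```python
# Quick simulation of regression to the mean. The true-score mean, noise level,
# sample size, and selection cutoff are arbitrary illustrative choices.
import random

random.seed(1)
true_scores = [random.gauss(50, 10) for _ in range(10_000)]    # stable underlying trait
test1 = [t + random.gauss(0, 10) for t in true_scores]          # noisy first test
test2 = [t + random.gauss(0, 10) for t in true_scores]          # noisy retest, no intervention

# Select the people who scored highest on the first test...
selected = [i for i, score in enumerate(test1) if score > 75]

mean_test1 = sum(test1[i] for i in selected) / len(selected)
mean_test2 = sum(test2[i] for i in selected) / len(selected)
print(f"first test (selected group): {mean_test1:.1f}")   # well above the overall mean of 50
print(f"retest (same group):        {mean_test2:.1f}")    # closer to 50, without any treatment
```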

- Selection bias/differential selection of participants (can happen even with randomization)
- Attrition/mortality (subjects leave the study one way or another)
- Reactive effects of participants (behavioral response to being in the treatment/control group) – the "Hawthorne effect"
- Diffusion of treatment (control or comparison groups end up experiencing some of the treatment effect)
- Interaction effect (of any of the above)

- Be a critical consumer of research:
  - what the heck the terms mean!
  - what the authors don't say
- Comment on potential threats in the end-of-semester proposal (limitations):
  - quantitative – threats to validity
  - qualitative – threats to trustworthiness, optimization of rigor