Karin Hannes, Centre for Methodology of Educational Research, K.U.Leuven

 The more you appraise, the more it stifles creativity. The main criterion is relevance!
 The more you appraise, the lower the chance of ending up with flawed results. The main criterion is quality!

 That’s no longer the question… I appraise!
 The question is…

Which criteria am I going to use when evaluating methodological quality?

 MAKING SENSE OF THE MYRIAD OF CRITICAL APPRAISAL INSTRUMENTS

 Selection of appraisal instruments:
› Used in recently published qualitative evidence syntheses (QES)
› Available online and ready to use
› Broadly applicable to different qualitative research designs
› Developed and supported by an organisation, institute or consortium, or in a context other than individual academic interest

 Three instruments fit the inclusion criteria:
› Joanna Briggs Institute tool (JBI)
› Critical Appraisal Skills Programme tool (CASP)
› Evaluation Tool for Qualitative Studies (ETQS)
 To facilitate comparison:
› Criteria grouped under main headings
› Cross-comparison of the criteria (main headings)

Comparison of the three instruments (JBI, CASP, ETQS) on the main criterion headings:
Covered by all three instruments: screening questions / details of the study; data collection procedure; data analysis procedure; findings; impact of the investigator; ethics.
Marked NO for at least one instrument: theoretical framework; appropriateness of the design; context; believability; evaluation/outcome; value/implications for research.
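To make the cross-comparison concrete, here is a minimal sketch of the coverage check. Only cells recoverable from the slides are filled in (the text later notes that CASP lacks the theoretical framework and context criteria); the remaining NO cells of the original table are omitted rather than guessed.

```python
# Sketch of the cross-comparison: which criteria are covered by all three
# instruments? Coverage sets reflect only what is recoverable from the
# slides; ambiguous cells of the original table are left out, not guessed.
INSTRUMENTS = {"JBI", "CASP", "ETQS"}

coverage = {
    "Screening questions / details of study": {"JBI", "CASP", "ETQS"},
    "Theoretical framework":                  {"JBI", "ETQS"},  # CASP: NO
    "Data collection procedure":              {"JBI", "CASP", "ETQS"},
    "Data analysis procedure":                {"JBI", "CASP", "ETQS"},
    "Findings":                               {"JBI", "CASP", "ETQS"},
    "Context":                                {"JBI", "ETQS"},  # CASP: NO
    "Impact of the investigator":             {"JBI", "CASP", "ETQS"},
    "Ethics":                                 {"JBI", "CASP", "ETQS"},
}

shared = [c for c, covered in coverage.items() if covered == INSTRUMENTS]
gaps = {c: sorted(INSTRUMENTS - covered)
        for c, covered in coverage.items() if covered != INSTRUMENTS}

print("Covered by all three:", shared)
print("Gaps:", gaps)  # -> {'Theoretical framework': ['CASP'], 'Context': ['CASP']}
```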

All three instruments have focused on the accuracy of the audit trail, i.e. the quality of reporting.

Let’s change the focus from what has been evaluated in critical appraisal instruments to what should be the main focus in evaluating the methodological quality of qualitative research.

 Validity and researcher bias should be evaluated: some qualitative studies are more rigorous than others.
 The epistemological and ontological assumptions of quantitative and qualitative research are incompatible: it is inappropriate to use such measures.
Validity in quantitative research  instruments and procedures.
Validity in qualitative research  the kinds of understanding we have of the phenomena under study (the accounts identified by researchers).

What is validity?
 We need to know:
› whether the set of arguments or the conclusion derived from a study necessarily follows from the premises;
› whether it is well grounded in logic or truth;
› whether it accurately reflects the concepts and ideas it is intended to measure.
 Operationalisation using Maxwell’s framework.

 Maxwell’s deconstruction of the concept of validity (1992):
› Descriptive validity
› Interpretive validity
› Theoretical validity
› Generalisability (external validity)
› Evaluative validity

 Descriptive validity: the degree to which descriptive information such as events, subjects, setting, time and place is accurately reported (facts rather than interpretation).
 Evaluation techniques: methods and investigator triangulation  allows for cross-checking of observations.

 Interpretive validity: the degree to which participants’ viewpoints, thoughts, intentions and experiences are accurately understood and reported by the researcher.
 Evaluation techniques: display of citations and excerpts, use of multiple analysts (inter-rater agreement), self-reflection of the researcher, (member checking).
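Where multiple analysts code the same material, inter-rater agreement is commonly quantified with Cohen’s kappa, which corrects raw agreement for agreement expected by chance. A minimal sketch; the codes and coded excerpts below are hypothetical, not data from the study:

```python
from collections import Counter

def cohens_kappa(rater_a, rater_b):
    """Cohen's kappa for two raters assigning categorical codes to the same items."""
    assert len(rater_a) == len(rater_b)
    n = len(rater_a)
    # Observed agreement: proportion of items coded identically.
    p_o = sum(a == b for a, b in zip(rater_a, rater_b)) / n
    # Expected chance agreement, from each rater's marginal code frequencies.
    freq_a, freq_b = Counter(rater_a), Counter(rater_b)
    p_e = sum(freq_a[c] * freq_b[c] for c in freq_a.keys() | freq_b.keys()) / n**2
    return (p_o - p_e) / (1 - p_e)

# Two analysts coding the same ten interview excerpts (hypothetical labels).
a = ["coping", "stigma", "coping", "support", "stigma",
     "coping", "support", "stigma", "coping", "support"]
b = ["coping", "stigma", "support", "support", "stigma",
     "coping", "support", "coping", "coping", "support"]
print(f"kappa = {cohens_kappa(a, b):.2f}")  # 1.0 = perfect agreement, 0 = chance
```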

 Theoretical validity: the degree to which a theory or theoretical explanation informing or developed from a research study fits the data and is therefore credible and defensible.
 Evaluation techniques: persistent observation  stable patterns; deviant or disconfirming cases; multiple working hypotheses; theory triangulation and active search for deviant cases; pattern matching.
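One way to make the active search for deviant cases systematic is to state the working hypothesis as an explicit predicate over coded cases and list the cases that violate it. A sketch under hypothetical field names and data:

```python
# Sketch of a deviant-case search: given coded cases and a working
# hypothesis expressed as a predicate, list the cases that disconfirm it.
# All field names, values, and the hypothesis are hypothetical illustrations.
cases = [
    {"id": 1, "peer_support": True,  "reported_coping": "good"},
    {"id": 2, "peer_support": True,  "reported_coping": "good"},
    {"id": 3, "peer_support": True,  "reported_coping": "poor"},  # deviant
    {"id": 4, "peer_support": False, "reported_coping": "poor"},
]

# Working hypothesis: participants with peer support report good coping.
def consistent(case):
    return (not case["peer_support"]) or case["reported_coping"] == "good"

deviant = [c["id"] for c in cases if not consistent(c)]
print("Cases requiring theory revision:", deviant)  # -> [3]
```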

 Generalisability (external validity): the degree to which findings can be extended to persons, times or settings other than those directly studied.
 Evaluation techniques: demographics, contextual background information, thick description, replication logic.

 Evaluative validity: the degree to which a certain phenomenon under study is legitimate; the degree to which an evaluative critique is applied to the object of study.
 Evaluation techniques: application of an evaluative framework, ethics, ...

 The most commonly used instrument, CASP, is the least sensitive to aspects of validity (findings based on screening the main headings). It addresses neither interpretive nor theoretical validity, nor context as a criterion.
› The theoretical position and background of a researcher have a direct impact on the interpretation of the findings.
› Statements that have no clear link to excerpts are at risk of not being grounded in the data.
› Therefore, these should be LEAD CRITERIA in a critical appraisal instrument!

 This study is limited by:
› its focus on the main headings of the instruments. Some of the subheadings of CASP do address issues of, e.g., interpretive validity, and some issues are not addressed in the JBI tool, e.g. sampling procedures.
› its focus on validity as an evaluation criterion. Which other aspects are important to consider in evaluating the quality of an instrument?

 Are there fundamental differences between appraisal instruments?
 Do we need to give our choice of appraisal instrument some thought?
 Could it assist us in evaluating the (methodological) quality of a study?
 Does it help us to establish rigor in qualitative research?

 Checklists only capture what has been reported.
 I argue for the use of verification techniques for validity as a means of obtaining rigor.
 By evaluating validity at the end of a study (post hoc), rather than focusing on processes of verification during the study, we run the risk of missing serious threats to validity until it is too late to correct them.

 Basic qualitative researchers:
› should be motivated to adopt techniques that improve validity;
› should be guided in how to report qualitative research in order to facilitate critical appraisal.

 The development of a standard set of reporting criteria for qualitative research is virtually impossible!
 The development of a standard set of reporting criteria would facilitate critical appraisal.
We might need you to participate in a Delphi study exploring the potential feasibility, appropriateness, meaningfulness and effectiveness of reporting criteria.
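For context, Delphi studies typically have panellists rate candidate items over successive rounds and retain items that reach a consensus threshold. Below is a sketch of one round’s aggregation; the criteria, ratings, scale, and thresholds are illustrative assumptions, not the planned study’s design:

```python
from statistics import median

# One Delphi round: experts rate each candidate reporting criterion on a
# 1-9 scale; a criterion reaches consensus when the median is high and the
# ratings are tightly clustered (small interquartile range).
ratings = {
    "Describe sampling strategy":   [8, 9, 7, 8, 9, 8],
    "Report researcher standpoint": [5, 9, 3, 8, 6, 4],
}

def iqr(xs):
    """Interquartile range: median of upper half minus median of lower half."""
    xs = sorted(xs)
    half = len(xs) // 2
    return median(xs[-half:]) - median(xs[:half])

for criterion, scores in ratings.items():
    consensus = median(scores) >= 7 and iqr(scores) <= 2  # illustrative thresholds
    print(f"{criterion}: median={median(scores)}, IQR={iqr(scores)}, consensus={consensus}")
```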

 To validate is to investigate, to check, to question, and to theorize. All of these activities are integral components of qualitative inquiry that ensure rigor (Morse, 2002).
 The process of inquiry is where the real verification happens.