1
Quality Appraisal of Qualitative Studies
Alison Cooke, Midwife & NIHR Doctoral Research Fellow
ESN Presentation, 050214
2
Why appraise quality?
A systematic review is a scientific exercise which may influence healthcare provision.
The quality of a review will be influenced by the methods used to minimise error and bias.
There is a plethora of published work on how to appraise the quality of research.
Whilst there are some differences in how to appraise, there is a high level of agreement on the key components of quality appraisal.
3
What is metasynthesis?
“A systematic review of qualitative research” (Booth 2001)
“Seeks to develop and refine theories while retaining the uniqueness of individual studies” (Jensen 2004)
“The science of summing up” (Light & Pillemer 1984)
4
Types of qualitative synthesis
Meta-ethnography (Noblit & Hare)
Grounded Theory (Kearney / Eaves / Finfgeld)
Thematic Synthesis (Thomas & Harden)
Textual Narrative Synthesis (Lucas et al.)
Meta-study (Paterson et al.)
Meta-narrative (Greenhalgh et al.)
Critical Interpretive Synthesis (Dixon-Woods et al.)
Ecological Triangulation / Ecological Sentence Synthesis (Banning)
Framework Synthesis (Brunton et al. / Oliver et al.)
5
Noblit & Hare
Meta-ethnographic approach: reveals analogies between studies, making sense of what the collection of studies is saying.
Line-of-argument synthesis: the interpretive synthesis is concerned with inference – what can we say of the whole phenomenon based on selective studies of the parts?
Repeated comparison between studies reveals similarities and differences, and puts the findings into a new interpretive context.
6
Quality Appraisal
The best tool for assessment is often defined by the type of metasynthesis being conducted, e.g.:
a specific 10-point tool for Framework Synthesis
the DIAD tool for Ecological Triangulation
the JBI 10-point checklist
Over 100 tools and frameworks are available.
Are rigid checklists appropriate? What is the role of expert judgement?
There is insufficient evidence to inform judgement on the rigour or added value of the various approaches (Noyes et al., 2008).
7
Which tool to use?
Rigid yes/no checklist
Expert review
8
Advantages/Disadvantages
Rigid checklist
Too restrictive for qualitative appraisal
More suited to quantitative appraisal; more objective
Validated tools; reader aware of robustness of quality assessment process
Expert review
Experienced reviewer reads the paper as a whole and provides an assessment of quality
Not credible; reader not able to assess robustness of quality assessment process
Highly subjective
9
Optimum tool
Sits somewhere along the continuum between ‘rigid checklist’ and ‘expert opinion’
Best ‘fits’ the type of metasynthesis being conducted
Is validated, or has been used many times in the topic area
Provides an organised and systematic approach which can be replicated between reviewers, but also by the reader
10
Worked example
Quality assessment tool developed by Walsh & Downe (2006)
Grading tool developed by Downe et al. (2009), based on the work of Lincoln & Guba (1985)
11
Stages and essential criteria (with space for reviewer notes against each):
Scope and Purpose: Clear statement of, and rationale for, research questions/aims/purposes. Study thoroughly contextualised by existing literature.
Design: Method/design apparent, and consistent with research intent. Data collection strategy apparent and appropriate.
Sampling Strategy: Sample and sampling method appropriate.
Analysis: Analytic approach appropriate.
Interpretation: Context described and taken account of in interpretation. Clear audit trail given. Data used to support interpretation.
Reflexivity: Researcher reflexivity demonstrated.
Ethical Dimensions: Demonstration of sensitivity to ethical concerns.
Relevance and Transferability: Relevance and transferability evident.
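To make the tool concrete, the sketch below shows one hypothetical way a reviewer could record notes against these stages and criteria in Python. The stage and criterion wording comes from the slide above; the Appraisal class, its field names and the blank_criteria helper are illustrative assumptions, not part of the Walsh & Downe tool.

```python
# Illustrative sketch only: criteria wording mirrors the Walsh & Downe tool as
# listed above, but this record structure is a hypothetical aid, not part of
# the published tool.
from dataclasses import dataclass, field

CRITERIA = {
    "Scope and Purpose": [
        "Clear statement of, and rationale for, research questions/aims/purposes",
        "Study thoroughly contextualised by existing literature",
    ],
    "Design": [
        "Method/design apparent, and consistent with research intent",
        "Data collection strategy apparent and appropriate",
    ],
    "Sampling Strategy": ["Sample and sampling method appropriate"],
    "Analysis": ["Analytic approach appropriate"],
    "Interpretation": [
        "Context described and taken account of in interpretation",
        "Clear audit trail given",
        "Data used to support interpretation",
    ],
    "Reflexivity": ["Researcher reflexivity demonstrated"],
    "Ethical Dimensions": ["Demonstration of sensitivity to ethical concerns"],
    "Relevance and Transferability": ["Relevance and transferability evident"],
}

@dataclass
class Appraisal:
    """One reviewer's notes against each essential criterion of one study."""
    study: str
    reviewer: str
    notes: dict = field(default_factory=dict)  # criterion text -> free-text note

    def blank_criteria(self) -> list:
        """Criteria the reviewer has not yet commented on."""
        return [c for stage in CRITERIA.values() for c in stage
                if c not in self.notes]
```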
12
Grading system, created by Downe et al. based on the work of Lincoln & Guba:
A: No, or few, flaws. The study's credibility, transferability, dependability and confirmability are high.
B: Some flaws, unlikely to affect the credibility, transferability, dependability and/or confirmability of the study.
C: Some flaws that may affect the credibility, transferability, dependability and/or confirmability of the study.
D: Significant flaws that are very likely to affect the credibility, transferability, dependability and/or confirmability of the study.
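As a companion to the checklist sketch above, the A-D grades could be held alongside their definitions; the dictionary below quotes the slide's wording in shortened form, while the lookup helper is an illustrative assumption.

```python
# Grade definitions follow the Downe et al. grading slide above (shortened);
# the helper is only an illustrative sketch.
GRADES = {
    "A": "No, or few, flaws; credibility, transferability, dependability and confirmability are high",
    "B": "Some flaws, unlikely to affect credibility, transferability, dependability and/or confirmability",
    "C": "Some flaws that may affect credibility, transferability, dependability and/or confirmability",
    "D": "Significant flaws very likely to affect credibility, transferability, dependability and/or confirmability",
}

def grade_definition(grade: str) -> str:
    """Look up the meaning of an overall A-D grade."""
    if grade not in GRADES:
        raise ValueError(f"grade must be one of {sorted(GRADES)}, got {grade!r}")
    return GRADES[grade]
```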
13
Activity
Read the extract from Dobrzykowski et al., 2003.
Look at the explanatory notes for use of the relevant sections of the assessment tool.
Complete your quality assessment of this extract.
14
Completed assessment: 1
Design (method/design apparent, and consistent with research intent; data collection strategy apparent and appropriate): Grounded theory: rationale provided. Discussion of rationale for design choice. Setting appears appropriate. Face-to-face interview, telephone conversation, email (first author). Tape recorded and transcribed verbatim. 2-3 interviews; appears sufficient to capture complexity/diversity.
Sampling Strategy (sample and sampling method appropriate): n=53. Data and emergent categories dictated. Sample selection process and justification explained. Thickness of description likely.
15
Completed assessment: 2
Design (method/design apparent, and consistent with research intent; data collection strategy apparent and appropriate): Clear rationale given for the qualitative approach; grounded theory used in order to explore processes and symbolic meanings for the study group. In-depth interviews in person, by telephone, and follow-up interview by email.
Sampling Strategy (sample and sampling method appropriate): Yes, fully explained.
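Continuing the hypothetical sketch introduced after the checklist slide, the second reviewer's notes could be captured like this (the Appraisal class and the attribution of notes to particular criteria are illustrative assumptions; the study label comes from the activity slide):

```python
# Usage of the earlier illustrative Appraisal sketch, filled in with the
# second completed assessment above.
assessment_2 = Appraisal(study="Dobrzykowski et al., 2003", reviewer="Reviewer 2")
assessment_2.notes.update({
    "Method/design apparent, and consistent with research intent":
        "Clear rationale given for the qualitative approach; grounded theory used "
        "to explore processes and symbolic meanings for the study group.",
    "Data collection strategy apparent and appropriate":
        "In-depth interviews in person, by telephone, and follow-up interview by email.",
    "Sample and sampling method appropriate":
        "Yes, fully explained.",
})
print(assessment_2.blank_criteria())  # criteria still awaiting comment
```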
16
Consensus
All 3 reviewers gave a grade of A/B for this study (no or very few flaws; the study's credibility, transferability, dependability and confirmability are high).
It is important to hold a meeting after independent quality assessment, to reach consensus over grading and an agreed decision on the inclusion or exclusion of studies for synthesis.
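The consensus step can be illustrated with a small sketch: compare the grades awarded independently by each reviewer and flag the study for discussion when they differ. The function, its name and the example grades are assumptions for illustration, not part of the presented method.

```python
# Illustrative sketch of the consensus step described above.
def needs_consensus_meeting(grades: dict) -> bool:
    """True if independently awarded grades differ between reviewers."""
    return len(set(grades.values())) > 1

independent_grades = {"Reviewer 1": "A", "Reviewer 2": "B", "Reviewer 3": "B"}
if needs_consensus_meeting(independent_grades):
    print("Grades differ: discuss at a consensus meeting before deciding inclusion.")
```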
17
Flaws used to determine grading
“No evidence of a systematic approach to the literature review”
“Reflexivity unclear: no discussion of the researchers’ influence on participants etc.”
“Limitations of the study not discussed”
“No evidence of ethical approval”
“No evidence of member checking”
“Unlikely to capture thickness of data”
“?Bias: only one researcher involved in the data collection and analysis process”
18
Should studies be excluded?
Grade D = “Significant flaws that are very likely to affect the credibility, transferability, dependability and/or confirmability of the study”
In a metasynthesis, should we exclude any of the studies found in answer to a research question?
19
Argument to include all?
Edwards et al., 1998: advocate balancing an assessment of methodological quality against the weight of its message – the ‘signal to noise ratio’ – rather than excluding studies that fall below a certain quality threshold.
Booth, 2001: if research is rejected on the basis of design alone, there is a high risk of denying valuable insights that contribute to our interpretation of a phenomenon.
20
Challenges of qualitative evidence synthesis
Literature searching: poor indexing of qualitative studies in databases (use SPIDER! Cooke et al. 2012; see the sketch after this list).
Journal word counts may mean that the points being appraised are badly explained or not included.
Appraisal techniques are dominated by the quantitative paradigm.
Structured approaches may not provide greater consistency of judgements to include or exclude.
Subjective judgement.
Interpretive, rather than aggregating, intent.
Time consuming.
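As an illustration of the SPIDER point above, the sketch below assembles a simple Boolean search string from the five SPIDER facets (Sample, Phenomenon of Interest, Design, Evaluation, Research type; Cooke et al. 2012). The function and the example terms are hypothetical; a real strategy needs database-specific syntax, truncation and subject headings.

```python
# Hypothetical sketch: build a Boolean search string from the SPIDER facets
# (Cooke et al. 2012). Example terms are illustrative only.
def spider_query(facets: dict) -> str:
    """AND the facets together, OR-ing the synonyms within each facet."""
    groups = ["(" + " OR ".join(terms) + ")" for terms in facets.values() if terms]
    return " AND ".join(groups)

example = {
    "Sample": ["midwives", "student midwives"],
    "Phenomenon of Interest": ["quality appraisal", "critical appraisal"],
    "Design": ["interview*", "focus group*"],
    "Evaluation": ["experiences", "views", "perceptions"],
    "Research type": ["qualitative"],
}

print(spider_query(example))
# (midwives OR student midwives) AND (quality appraisal OR critical appraisal) AND ...
```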
21
In summary
There is no ‘right’ way of conducting quality assessment of qualitative research.
Having a structured approach can give the reader clarity about the decision process used.
Multiple reviewers are required to review papers independently, then meet to reach a consensus decision on inclusion/exclusion.
22
Thank you
Any questions?
Alison.Cooke@manchester.ac.uk
23
Reference List
Banning J. undated. Design and Implementation Assessment Device (DIAD) Version 0.3: A response from a qualitative perspective. Available from: http://mycahs.colostate.edu/James.H.Banning/PDFs/design%20and%20implementation%20assessment%20device.pdf [Accessed: 03/02/14]
Banning J. undated. Ecological Triangulation. Available from: http://www.mychhs.colostate.edu/James.H.Banning/PDFs/Ecological%20Triangualtion.pdf [Accessed: 03/02/14]
Banning J. undated. Ecological Sentence Synthesis. Available from: http://www.mychhs.colostate.edu/James.H.Banning/PDFs/Ecological%20Sentence%20Synthesis.pdf [Accessed: 03/02/14]
Booth A. 2001. Cochrane or cock-eyed? How should we conduct systematic reviews of qualitative research? Qualitative Evidence-based Practice Conference, Coventry University, May 14-16.
Brunton G, Oliver S, et al. 2006. A Synthesis of Research Addressing Children’s, Young People’s and Parents’ Views of Walking and Cycling for Transport. EPPI-Centre, London.
Cooke A, Smith D & Booth A. 2012. Beyond PICO: The SPIDER Tool for Qualitative Evidence Synthesis. Qualitative Health Research 22(10) p1435-43.
Dixon-Woods M, Cavers D, et al. 2006. Conducting a critical interpretive synthesis of the literature on access to healthcare by vulnerable groups. BMC Medical Research Methodology 6(35).
Downe S, Finlayson K, et al. 2009. ‘Weighing up and balancing out’: a meta-synthesis of barriers to antenatal care for marginalised women in high-income countries. BJOG 116 p518-529.
Eaves YD. 2001. A synthesis technique for grounded theory data analysis. Journal of Advanced Nursing 35 p654-663.
Edwards AG, Russell LT & Stott NCH. 1998. Signal versus noise in the evidence base for medicine: an alternative to hierarchies of evidence? Family Practice 15(4) p319-322.
24
Reference List
Finfgeld D. 1999. Courage as a process of pushing beyond the struggle. Qualitative Health Research 9 p803-814.
Greenhalgh T, Robert G, et al. 2005. Storylines of research in diffusion of innovation: a meta-narrative approach to systematic review. Social Science & Medicine 61 p417-430.
Jensen LA. 2004. Extending meta-analysis. Qualitative Health Research 14(10) p1346-1347.
Kearney MH. 2001. Enduring love: a grounded formal theory of women’s experience of domestic violence. Research in Nursing & Health 24 p270-282.
Light RJ & Pillemer DB. 1984. Summing Up: The Science of Reviewing Research. Harvard University Press, Cambridge, MA.
Lincoln YS & Guba EG. 1985. Naturalistic Inquiry. Sage Publications, California.
Lucas PJ, Arai L, et al. 2007. Worked examples of alternative methods for the synthesis of qualitative and quantitative research in systematic reviews. BMC Medical Research Methodology 7(4).
Noblit GW & Hare RD. 1988. Meta-Ethnography: Synthesizing Qualitative Studies (Qualitative Research Methods Series 11). Sage Publications, California.
Noyes J, Popay J, et al. 2008. Chapter 20: Qualitative research and Cochrane reviews. In: Higgins JPT & Green S (Eds) Cochrane Handbook for Systematic Reviews of Interventions. John Wiley & Sons, Chichester.
Oliver S, Rees R, et al. 2008. A multidimensional conceptual framework for analysing public involvement in health services research. Health Expectations 11 p72-84.
Paterson BL, Thorne SE, et al. 2001. Meta-Study of Qualitative Health Research: A Practical Guide to Meta-Analysis and Meta-Synthesis. Sage Publications, California.
Thomas J & Harden A. 2008. Methods for the thematic synthesis of qualitative research in systematic reviews. BMC Medical Research Methodology 8(45).
Walsh D & Downe S. 2006. Appraising the quality of qualitative research. Midwifery 22 p108-19.