Mixed methods in health services research: Pitfalls and pragmatism Robyn McDermott Mixed Methods Seminar JCU 16-17 October 2014.

What’s special about health services (and much public health) research?
– Interventions are complex
– Settings are complex
– Standard control groups may not be feasible, ethical or acceptable to services and/or communities
– “Contamination” is a problem
– Unmeasured bias/confounding
– Secular behaviour and policy change over time can be strong, sudden and unpredictable
– Context is very important but often poorly described
– Example: the Diabetes Care Project (DCP)

“Improving reporting quality” checklists
– CONSORT: RCTs, with updates for cluster RCTs
– TREND: Transparent Reporting of Evaluations with Non-randomised Designs (initially focused on HIV studies)
– PRISMA: Reporting systematic reviews of RCTs
– STROBE: Reporting of observational studies
– MOOSE: Reporting systematic reviews of observational studies

Complex interventions
– A review of RCTs reported over a decade found that less than 50% described the intervention in enough detail to enable replication (Glasziou, 2008)
– Even fewer had a theoretical framework or logic model
– Systematic reviews of complex interventions often find small effects, if any, or contradictory findings. This may be due to combining studies without taking account of the underlying theory for the intervention (eg Segal, 2012: early childhood interventions)

TREND has a 22-item checklist. Item 4: details of the interventions intended for each study condition, and how and when they were actually administered, specifically including:
– Content: what was given?
– Delivery method: how was the content given?
– Unit of delivery: how were the subjects grouped during delivery?
– Deliverer: who delivered the intervention?
– Setting: where was the intervention delivered?
– Exposure quantity and duration: how many sessions, episodes or events were intended to be delivered? How long were they intended to last?
– Time span: how long was it intended to take to deliver the intervention to each unit?
– Activities to increase compliance or adherence (e.g., incentives)
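To show how these fields might be captured in practice, here is a minimal sketch of Item 4 as a structured record; the class and all of its field values are invented for illustration and are not part of the TREND statement itself.

```python
# Illustrative only: a structured record for the TREND Item 4 fields.
# Field names paraphrase the checklist; the example values are invented.
from dataclasses import dataclass

@dataclass
class InterventionDescription:
    content: str               # what was given?
    delivery_method: str       # how was the content given?
    unit_of_delivery: str      # how were subjects grouped during delivery?
    deliverer: str             # who delivered the intervention?
    setting: str               # where was it delivered?
    intended_sessions: int     # exposure quantity
    session_duration_min: int  # exposure duration
    time_span_weeks: int       # intended time to deliver to each unit
    adherence_activities: str  # e.g. incentives, reminders

example = InterventionDescription(
    content="Structured diabetes self-management education",
    delivery_method="Face-to-face group sessions",
    unit_of_delivery="Groups of 8-10 patients",
    deliverer="Practice nurse",
    setting="Primary care clinic",
    intended_sessions=6,
    session_duration_min=90,
    time_span_weeks=12,
    adherence_activities="SMS reminders before each session",
)
```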

Suggestions for improvements to TREND and CONSORT (Armstrong et al, J Public Health, 2008)
– Introduction: intervention model and theory
– Methods: justify the study design choice (eg a compromise between internal validity and the complexity and constraints of the setting)
– Results: integrity (or fidelity) of the intervention; context, differential effects and multi-level processes
– Sustainability: for public health interventions, beyond the life of the trial

Theoretical framework and logic model for an intervention effect (should be in the introduction; example from the Diabetes Care Project, DCP)
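As a rough illustration of what a logic model makes explicit, the sketch below writes a generic one down as a plain data structure; every component listed is a hypothetical example, not the actual DCP model.

```python
# Illustrative only: a generic logic model as a plain data structure.
# The components are invented examples, not the actual DCP logic model.
logic_model = {
    "inputs": ["funding", "clinical staff time", "patient registry / IT system"],
    "activities": ["care coordination", "self-management education", "clinician feedback reports"],
    "outputs": ["care plans completed", "education sessions delivered", "recall visits attended"],
    "intermediate_outcomes": ["improved self-care behaviour", "better medication adherence"],
    "final_outcomes": ["lower HbA1c", "fewer diabetes-related hospitalisations"],
    "assumptions": ["practices have capacity to participate", "patients stay enrolled long enough"],
}

# Making each step explicit lets the evaluation test the causal chain,
# not just the endpoint.
for step, items in logic_model.items():
    print(f"{step}: {', '.join(items)}")
```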

Study design choice (methods section)
– Strengths and weaknesses of the chosen study design
– Operationalization of the design, including:
   – group allocation,
   – choice of counterfactual,
   – choice of outcome measures, and
   – measurement methods

Implementation fidelity
– Part of process evaluation
– Information on the intensity, duration and reach of the intervention components, and
– If and how these varied by subgroup (and how to interpret this)
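One way such fidelity information might be summarised from routine attendance records is sketched below; the column names and data are hypothetical, assuming one row per participant per offered session.

```python
# Illustrative sketch: summarising reach, intensity and dose from attendance records.
# Column names (participant_id, subgroup, attended, session_minutes) are hypothetical.
import pandas as pd

records = pd.DataFrame({
    "participant_id":  [1, 1, 2, 2, 3, 3],
    "subgroup":        ["remote", "remote", "urban", "urban", "urban", "urban"],
    "attended":        [True, False, True, True, False, False],
    "session_minutes": [90, 0, 90, 60, 0, 0],
})

per_person = records.groupby(["subgroup", "participant_id"]).agg(
    sessions_attended=("attended", "sum"),
    minutes_received=("session_minutes", "sum"),
).reset_index()

fidelity = per_person.groupby("subgroup").agg(
    reach=("sessions_attended", lambda s: (s > 0).mean()),  # proportion receiving any dose
    mean_sessions=("sessions_attended", "mean"),            # intensity
    mean_minutes=("minutes_received", "mean"),              # duration/dose
)
print(fidelity)
```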

Effectiveness will vary by context. Context elements can include:
– Host organization and staff
– System effects (eg funding model, use of IT, chronic care model for service delivery)
– Target population

Multilevel processes
Informed by the theoretical model used (eg the Ottawa Charter framework for prevention), effectiveness studies may involve analysis of:
– Individual-level data
– Community-level data
– Jurisdictional-level data
– Country-level data
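Where individual-level outcomes are nested within communities or jurisdictions, one common analytic option is a mixed-effects (multilevel) model. The sketch below uses statsmodels with hypothetical variable names and data file (outcome, exposed, community, trial_data.csv); it is one possible approach, not a prescribed analysis.

```python
# Illustrative sketch: a two-level model (individuals nested within communities).
# Variable names (outcome, exposed, age, sex, community) and the data file are hypothetical.
import pandas as pd
import statsmodels.formula.api as smf

data = pd.read_csv("trial_data.csv")  # one row per individual

# Random intercept for community; add further grouping levels or
# community-level covariates as the theoretical model requires.
model = smf.mixedlm("outcome ~ exposed + age + sex", data, groups=data["community"])
result = model.fit()
print(result.summary())
```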

Differential effects and sub-group analysis
– Counter to the RCT orthodoxy of effectiveness trials, there may be value in looking at differential effects by SES, gender, ethnicity, geography or service model (eg CCHS)
– Even when there is insufficient statistical power in individual studies
– A potential advantage is the possibility of a pooled analysis across studies, eg by SES impact
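As a sketch of what such pooling could look like, the example below applies a fixed-effect inverse-variance combination to subgroup-specific estimates from several studies; the effect sizes and standard errors are invented for illustration.

```python
# Illustrative sketch: fixed-effect inverse-variance pooling of subgroup-specific
# effect estimates across studies. The numbers are invented for illustration.
import math

# (effect estimate, standard error) for the low-SES subgroup in each study
low_ses_estimates = [(-0.30, 0.15), (-0.10, 0.20), (-0.25, 0.12)]

weights = [1 / se**2 for _, se in low_ses_estimates]
pooled = sum(w * est for (est, _), w in zip(low_ses_estimates, weights)) / sum(weights)
pooled_se = math.sqrt(1 / sum(weights))

print(f"Pooled effect (low SES): {pooled:.3f} "
      f"(95% CI {pooled - 1.96 * pooled_se:.3f} to {pooled + 1.96 * pooled_se:.3f})")
```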

Sustainability
– Beyond the life of the trial (follow-up is typically very short)
– Important for policy, but not for your journal publication
– Sustainability research may require a separate study design and conduct

And finally… how do you put all this together and stay within journal word limits?
– For briefings
– For journals
– For reports that will realistically get read?