How qualitative research contributes to evaluation
Professor Alicia O’Cathain, ScHARR, University of Sheffield, 22 June 2015

What is qualitative research?
- Normal in evaluation
- Understanding, not measuring
- A set of methods:
  - Focus groups
  - Semi-structured or in-depth interviews
  - Non-participant observation
  - Diaries

What is evaluation?
- Researcher-led evaluation
- Policy evaluation

Researcher-led evaluation
- MRC Framework for developing and evaluating complex interventions
- ACTIF programme: 5 years, an RCT, qualitative research at each phase

O’Cathain A, Thomas KJ, Drabble SJ, Rudolph A, Hewison J. What can qualitative research do for randomised controlled trials? A systematic mapping review. BMJ Open 2013;3:e
Aspects of trials that the qualitative research addressed:
- Intervention
- Trial design and conduct
- Outcomes
- Measures
- Health conditions

Intervention (n=254)
- Develop (n=48)
- Describe it (n=10)
- Understand how it works (n=23)
- Value and benefits (n=42)
- Acceptability in principle (n=32)
- Feasibility and acceptability (n=83)
- Fidelity, reach and dose (n=12)
- Implementation in real world (n=4)
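
The subcategory counts on this slide partition the total exactly. As a quick illustration (a minimal sketch, not part of the talk), the following Python tabulates the counts, verifies they sum to n=254, and prints each subcategory's share:

```python
# Sketch: tabulate the slide's counts of qualitative studies focused on the
# intervention, check that the subcategories account for every study, and
# print each share of the total.

counts = {
    "Develop": 48,
    "Describe it": 10,
    "Understand how it works": 23,
    "Value and benefits": 42,
    "Acceptability in principle": 32,
    "Feasibility and acceptability": 83,
    "Fidelity, reach and dose": 12,
    "Implementation in real world": 4,
}

total = sum(counts.values())
assert total == 254, "subcategories should partition the total"

# Largest focus first: e.g. 'Feasibility and acceptability' is 83/254 = 33%.
for focus, n in sorted(counts.items(), key=lambda kv: kv[1], reverse=True):
    print(f"{focus:<30} n={n:>3}  ({n / total:.0%})")
```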

Trial design and conduct (n=54)
- Recruitment
- Diversity
- Participation in trials
- Acceptability in principle
- Acceptability in practice
- Ethics/informed consent
- Adapting to local circumstances
- Impact on staff, researchers, patients

Potential value
- Bias: avoidance of measurement bias
- Efficiency: faster recruitment; saves money
- Ethics: trials sensitive to human beings; improved informed consent
- Implementation: facilitates replicability of the intervention in the real world; facilitates transferability of findings in the real world
- Interpretation: explains trial findings
- Relevance: ensures interventions meet the needs of health professionals and patients
- Success: makes a trial successful, feasible, viable
- Validity: improves internal validity; improves external validity

Maximising value…
1. Do it early: only 28% of the qualitative studies were conducted pre-trial. Proportion done pre-trial, by focus:
   - Intervention development: 100%
   - Acceptability of intervention in principle: 25%
   - Acceptability of intervention in practice: 24%
   - Recruitment: 18%
   - Breadth of outcomes: 0%
   …otherwise it's about future trials

2. Publish the learning, whether for the specific trial or for future trials
3. Think beyond interviews: non-participant observation
4. Try iterative, dynamic or participatory approaches at the feasibility phase
5. Not just complex interventions: 38% of the 104 trials with data extracted evaluated drugs or devices
6. Think about the range of work

Problems with quantitative-only evaluation
- Null RCTs: qualitative research can explain the findings (context, mechanisms of action, implementation)
- Failed trials: qualitative research at the pilot stage can prevent this
- It works, but what is ‘it’? Qualitative research can fix this

Policy evaluation
- Learning from early adopters (feasibility)
- Stakeholder reception (acceptability)
- Service delivered (implementation, workforce)

Useful, but challenges remain:
- Fast evaluation
- When to evaluate
- Moving target
- Replacement of difficult-to-measure outcomes with understanding of processes

Conclusions
- A useful contribution no matter what type of evaluation: essential due to complexity
- Can help to fix problems faced in researcher-led evaluation
- Challenges in policy evaluation need reflection