YSGOL GWYDDORAU GOFAL IECHYD / SCHOOL OF HEALTHCARE SCIENCES


PRIFYSGOL BANGOR / BANGOR UNIVERSITY
Lynne Williams, Bangor University
www.bangor.ac.uk/healthcaresciences
@SHSBangor

- How do we assess how and why interventions achieve (or not) outcomes?
- Are there differences between expected and observed outcomes?
- How does context influence implementation and outcomes?
- What's the causal pathway for the intervention?
- What are the mechanisms?

Craig P, Dieppe P, MacIntyre S et al (2008). Developing and Evaluating Complex Interventions: New Guidance. London: MRC.
Raine R, Fitzpatrick R, Barratt H et al (2016). Challenges, solutions and future directions in the evaluation of service innovations in health care and public health. Health Services and Delivery Research, NIHR, Vol 4 (16).

- To explain "for whom, how and why a complex intervention had a particular impact" (Liu et al, 2016)
- To open the "black box" and examine what lies beneath (the "underlying processes") (Grant et al, 2013)
- To show how an intervention is implemented and what was delivered, compared with what was intended to be delivered (http://evaluation.lshtm.ac.uk/process-evaluation/)
- Can provide a "snapshot" of the receipt, implementation and context of a trial intervention (Audrey et al, 2006)

Audrey S, Holliday J, Parry-Langdon N et al (2006). Meeting the challenges of implementing process evaluation within randomized controlled trials: the example of ASSIST (A Stop Smoking in Schools Trial). Health Education Research 21(3): 366-377.
Grant A, Treweek S, Dreischulte T et al (2013). Process evaluations for cluster-randomised trials of complex interventions: a proposed framework for design and reporting. Trials 14: 15.
Liu H, Muhunthan J, Hayek A et al (2016). Examining the use of process evaluations of randomised controlled trials of complex interventions addressing chronic disease in primary health care: a systematic review protocol. Systematic Reviews 5: 138.

- Process evaluation runs in parallel with the trial to understand the processes in relation to the intervention background, setting and the people involved
- A logic model represents the intervention: it describes the resources required to implement the intervention, the components of the intervention, the mechanisms of impact and the intended outcomes (UK Medical Research Council (MRC) guidance)
- Where implementation within process evaluation is described, this often refers to "the implementation, receipt and setting of an intervention and help in the interpretation of outcomes" (Oakley et al, 2006)
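As a purely illustrative aside (not part of the MRC guidance), the sketch below shows one way the elements of a logic model could be captured as a simple data structure; the field names and the example values are hypothetical, loosely echoing the OTCH intervention described later.

```python
# Illustrative sketch only: a simple structure for the elements of a logic model
# (resources, intervention components, mechanisms of impact, intended outcomes).
# Field names and example values are hypothetical.
from dataclasses import dataclass, field
from typing import List

@dataclass
class LogicModel:
    resources: List[str] = field(default_factory=list)             # what is required to implement the intervention
    components: List[str] = field(default_factory=list)            # what the intervention consists of
    mechanisms_of_impact: List[str] = field(default_factory=list)  # how the intervention is expected to produce change
    intended_outcomes: List[str] = field(default_factory=list)     # what the intervention is expected to achieve

# Hypothetical example
otch_logic_model = LogicModel(
    resources=["occupational therapists", "adaptive equipment"],
    components=["task-related ADL practice", "care home staff training"],
    mechanisms_of_impact=["increased resident independence in self-care"],
    intended_outcomes=["improved Barthel Index score at 3 months"],
)
```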

- Understanding the context
- Engagement with policy makers, practitioners and patients
- Understanding the outcomes
- Acknowledging the social world we live in
- What are the causal mechanisms in the programme?

- Explanatory, contingent theory development
- Context may be defined as space or place, whereby human interaction takes place under the "appropriate social and cultural conditions" (Pawson & Tilley, 1997: 57)
- "Change as the results of actions of social agents operating in a specific context whereby the action leads to outcomes by triggering mechanisms" (http://evaluation.lshtm.ac.uk/process-evaluation/#realist)

Context, mechanisms and outcomes:
- Context: features which are likely to affect how, and for whom, a programme is expected to work
- Mechanisms: what leads to a response; how we respond, in what ways and why
- Outcomes: changes (intended and unintended) resulting from an intervention

An Occupational Therapy intervention for residents with stroke-related disabilities in UK Care Homes (OTCH): cluster randomised controlled trial with economic evaluation
Sackley CM, Walker MF, Burton CR et al. Health Technology Assessment 2016; Vol. 20, No. 15. DOI: 10.3310/hta20150

Design: pragmatic Phase III, parallel-group, cluster randomised controlled trial with an economic evaluation
Aim: the predominant aim was to perform a definitive evaluation of OT for stroke and transient ischaemic attack (TIA) survivors in long-term institutional care
Primary outcome: the Barthel Index (BI) score at 3 months after randomisation; the BI assesses dependency in 10 categories of self-care
Intervention:
- Task-related interventions on activities of daily living (feeding, dressing, toileting, bathing, transferring, mobilising)
- Supply of adaptive equipment
- Seating and positioning assessment
- Care home staff training
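As a minimal illustrative aside (not the analysis specified in the OTCH report), a cluster randomised design of this kind is commonly analysed with the care home treated as a random effect; the sketch below assumes hypothetical column names and a simple mixed-effects comparison of 3-month Barthel Index scores between arms, adjusted for baseline.

```python
# Minimal illustrative sketch, not the OTCH analysis plan: comparing Barthel
# Index (BI) scores at 3 months between trial arms while respecting the
# clustering of residents within care homes. All column names are hypothetical.
import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("otch_outcomes.csv")  # hypothetical dataset: one row per resident

model = smf.mixedlm(
    "bi_3_months ~ arm + bi_baseline",  # treatment arm, adjusted for baseline BI
    data=df,
    groups=df["care_home"],             # random intercept for each care home (cluster)
)
result = model.fit()
print(result.summary())
```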

Example of the realist approach used to advance learning about the process evaluation function:
- Process evaluation examining the fidelity of the occupational therapy intervention for residents
- A programme theory of fidelity to underpin the process evaluation
- Multiple methods of data collection (e.g. in-depth interviews and critical incident reports)
- Four potential mechanisms through which fidelity within the trial could be investigated:
  - the balancing of research and professional requirements
  - building a positive rapport with care home staff
  - working at re-engineering the care homes' environment
  - learning about the intervention and its impacts over time
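As an illustration of how the context-mechanism-outcome logic introduced earlier might be recorded for mechanisms like those listed above, the sketch below uses a simple record type; only the mechanism text comes from the slide, while the context and outcome values are hypothetical placeholders.

```python
# Illustrative only: a simple record for a realist context-mechanism-outcome
# (CMO) configuration. The mechanism text is taken from the fidelity example
# above; the context and outcome values are hypothetical placeholders.
from typing import NamedTuple

class CMOConfiguration(NamedTuple):
    context: str    # features affecting how, and for whom, the programme works
    mechanism: str  # what leads to the response
    outcome: str    # intended and unintended changes resulting from the intervention

example = CMOConfiguration(
    context="care home with engaged, stable staffing",              # hypothetical
    mechanism="building a positive rapport with care home staff",   # from the slide
    outcome="intervention delivered as intended (fidelity)",        # hypothetical
)
print(example)
```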

- Process evaluations alongside trials of complex interventions can illuminate the underlying processes
- They provide a picture of the receipt, implementation and context of the trial intervention
- Drawing on the principles of the realist approach can guide process evaluation to identify what works, for whom and in what contexts