Three perspectives on outcome measurement

Presentation transcript:

Three perspectives on outcome measurement: Findings from the CORC Evaluation
22nd November 2017
Dr Emily Stapley, Rosa Town & Dr Jessica Deighton

Background to the CORC Evaluation
CORC have asked the Evidence Based Practice Unit (EBPU) to conduct an evaluation exploring how their member services and associates experience:
Engaging in routine outcome monitoring (ROM), including outcome measurement, data collation, data analysis, and interpretation of findings
Receiving support from the CORC team
This includes child mental health and wellbeing services (in both the statutory and voluntary sectors) and commissioners.
The aim is to identify the impact of, and the barriers and facilitators to, engaging in ROM and working with CORC, from stakeholders’ perspectives.

Structure of the CORC Evaluation
Three waves of activity:
Wave 1: What are the experiences of statutory child and adolescent mental health services (CAMHS) of implementing ROM and working with CORC?
Wave 2: What are the experiences of voluntary sector child mental health and wellbeing services of implementing ROM and working with CORC?
Wave 3: What are the experiences of commissioners in relation to ROM and working with CORC?

CORC Evaluation Wave 1 - Methodology
Sample: 3 CAMH services; 7 interviewees (CORC implementers, service managers, clinicians)
Data collection: Structured telephone interviews; average length = 26.03 minutes (range 18.03 - 31.06)
Data analysis: Thematic analysis (Braun & Clarke, 2006)

CORC Evaluation Wave 1 findings - Perceived impact of ROM and CORC on the service
Enhancing clinical practice, e.g. facilitating discussions with clients and provision of a ‘temperature check’ on progress:
“I really find it useful with some clients to kind of plot where they were at the start and kind of where they are now” (Clinician)
Service improvement:
“The [experience of service] measure flags up things that we’ve actually [taken] on board and then changed” (CORC Implementer)
Boosting service confidence and morale:
“You’re constantly thinking that you’re not up to the state where you should be, whereas when we’re getting these reports and this information from CORC, it’s really given the service a boost” (CORC Implementer)

CORC Evaluation Wave 1 findings - Perceived barriers to engaging with ROM
IT system issues, e.g. glitches in services’ IT systems and a mismatch between CORC and service database systems:
“Our computers at work don’t access the internet very well. They’re very old browsers” (CORC Implementer)
Lack of time and staffing capacity:
“We haven’t been able to keep up to pace with doing lots and lots of outcome measures. The difficulty we have is processing the data on a day-by-day basis, because it involves data entry, data extraction, graphing” (Clinician)
Resistance to ROM:
“I know that we have struggled actually getting staff on board to use ROMs and to see the uses in them . . . however that is getting a lot better” (CORC Implementer)

CORC Evaluation Wave 2 - Methodology
Sample: 10 small voluntary sector services; 12 interviewees (CORC implementers, service managers, clinicians)
Data collection: Structured telephone interviews; average length = 29.79 minutes (range 14.23 - 41.04)
Data analysis: Thematic analysis (Braun & Clarke, 2006)

CORC Evaluation Wave 2 findings - Perceived impact of ROM and CORC on the service
Areas of impact: provision of evidence of service impact; a learning experience; an impetus to improve data collection
Factors limiting impact: limitations of data collected by the service; fit between service and outcome measures; obstacles to data collection
“What we wanted to do was show a sort of, a bigger picture of the stuff that we do . . . so that’s kind of what we’ve been using [the goal based outcome measure] with CORC [for]” (Service Manager)
“Other services have, they’ll do six sessions . . . and then they do the measurement at the beginning, they’ll do the measurement at the end. We’re much more messy and fluid than that. You know we have young people who come to us for a while and go away, [then] they come back” (Service Manager)

CORC Evaluation Wave 3 - Methodology
Sample: 8 clinical commissioning groups (CCGs); 8 interviewees (commissioners)
Data collection: Structured telephone interviews; average length = 28.93 minutes (range 17.13 – 39.45)
Data analysis: Thematic analysis (Braun & Clarke, 2006)

CORC Evaluation Wave 3 findings – The use or influence of data from commissioners’ perspectives
To facilitate understanding of service provision and impact:
“[We want to use the CORC report] to look at how well we’re doing, where the gaps are, what’s not so strong, where we need to get better in collecting outcomes reported. A lot of things really. We want to marry that with sort of activity data, so ethnicity as well, look at the demographics around it too” (Commissioner)
Making comparisons:
“[The CORC report] gives me a bit of a sense of where [area] is versus other members of the CORC partnership” (Commissioner)
CORC data and reports are one part of the bigger picture around commissioning decisions:
“[The CORC report is] a starting point for a deeper discussion so it just takes the form of triangulation of information about the quality of the service” (Commissioner)

CORC Evaluation Wave 3 findings – Perceived challenges around ROM and working with data
Use of technical language:
“I think initially [when working with CORC] there was some stumbling around language used . . . so things like ‘funnels’ and ‘funnel plot’, ‘confidence levels’, ‘meaningful statistical data’” (Commissioner)
Time and capacity issues for commissioners and providers:
“The pressures on commissioners these days, particularly in CAMHS actually, are huge” (Commissioner)
Services’ difficulties in collecting data:
“We haven’t had the data systems set up here to get that information to CORC. So it’s been quite, the reports that have come back have lacked a lot of data, a lot of information” (Commissioner)
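As context for the ‘funnel plot’ terminology that commissioners found unfamiliar, the short Python sketch below illustrates one common way funnel-plot control limits are derived for a proportion outcome. It is purely illustrative and is not CORC’s actual method or data: the service names, counts, pooled benchmark calculation, and 95% z-value are all assumptions made for the example.

```python
# Minimal sketch of funnel-plot control limits for a proportion outcome.
# All figures are invented; this is not CORC data or CORC's reporting method.
import math

# (number of cases showing improvement, number of cases measured) per service
services = {"Service A": (42, 120), "Service B": (15, 60), "Service C": (88, 300)}

# Pooled "benchmark" improvement rate across all services
total_improved = sum(improved for improved, _ in services.values())
total_cases = sum(n for _, n in services.values())
benchmark = total_improved / total_cases

def control_limits(n, p0, z=1.96):
    """Lower/upper funnel limits around benchmark p0 for a service with n cases."""
    se = math.sqrt(p0 * (1 - p0) / n)
    return max(0.0, p0 - z * se), min(1.0, p0 + z * se)

for name, (improved, n) in services.items():
    rate = improved / n
    low, high = control_limits(n, benchmark)
    flag = "outside" if rate < low or rate > high else "within"
    print(f"{name}: rate={rate:.2f}, 95% limits=({low:.2f}, {high:.2f}) -> {flag} the funnel")
```

The key point the terminology is expressing: smaller services have wider limits (more random variation expected), so the plotted limits form a funnel shape, and only services falling outside their limits are flagged as meaningfully different from the benchmark.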

CORC Evaluation Conclusions
What are the key messages around outcome measurement from these three groups?
Methodological limitations: transferability of findings to other services and staff members; potential sampling bias
What next?

CORC Evaluation References
Braun, V. & Clarke, V. (2006). Using thematic analysis in psychology. Qualitative Research in Psychology, 3, 77-101.