Developing measures to manage quality and safety in integrated care in New Zealand Tom Love 14 November 2013.


Why are you measuring stuff? What do you want to do with measures?
- Quality improvement?
- Performance management?
- Peer review and professional development?
- Service development and integration?
- Are measures linked to incentives/sanctions? For whom?
- Linked to public information and reporting?
- Linked to access to services?

Measuring performance: some general points
- Different kinds of activity have different characteristics for observing performance;
- There are different inherent limitations to what you can observe for different kinds of activity.

Typology of activity

Production — outputs observable, outcomes observable. Tasks are generally repetitive and stable, although some of the skills may be specialised. E.g. tax collection: the activity involved in revenue collection is directly observable, and the outcome is easily measured in the total quantity of taxation collected.

Procedural — outputs observable, outcomes not observable. Skills are specialised and tasks may be stable, but outcomes are unique or much delayed from the activity. E.g. an army in peacetime: training and capability activities are observable, but there is no way of establishing an outcome measure until a war happens.

Craft — outputs difficult to observe, outcomes observable. A general set of skills is applied to unique tasks, with stable, similar outcomes. E.g. audit: specifying and monitoring every detail of investigative audit activity is difficult and complex, but the outcome in terms of audit results is easy to observe.

Coping — outputs not observable, outcomes not observable. Generic skills are applied to unique tasks, and outcomes can't be evaluated in the absence of alternatives; success is often attained by trial and error. E.g. police maintaining order: the application of effort is complex and difficult to specify in advance, and there is no alternative in the real world against which to assess the outcome.

Key points
- Outcomes are intellectually desirable measures, but there is often good reason why you can't or shouldn't measure them;
- The nature of knowledge and evidence around health care activity is highly variable, constantly changing, and often disputed;
- Services are often provided to an individual, and it is hard to know what the counterfactual would have been;
- At the individual level, numbers can be too small to provide robust statistics.
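The small-numbers point can be made concrete with a quick sketch. The event counts below are illustrative assumptions, and the simple Wald interval is used only for brevity: at the same 10% event rate, a clinician-level denominator produces a confidence interval far too wide to support robust conclusions, while a District-level denominator does not.

```python
import math

def wald_ci(events: int, n: int, z: float = 1.96):
    """95% Wald confidence interval for a simple proportion,
    clipped to the valid range 0..1."""
    p = events / n
    half_width = z * math.sqrt(p * (1 - p) / n)
    return max(0.0, p - half_width), min(1.0, p + half_width)

# Same 10% event rate, very different denominators (illustrative numbers).
for events, n in [(2, 20), (20, 200), (200, 2000)]:
    lo, hi = wald_ci(events, n)
    print(f"n={n:4d}: interval {lo:.3f} .. {hi:.3f} (width {hi - lo:.3f})")
```

With n=20 the interval spans most of the plausible range; only at the larger denominators does the estimate become usefully precise, which is one reason the slides reserve performance assessment for the District or regional level.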

Back to: why are we measuring? Both of:
- Quality improvement
- Performance management
In different ways at different levels:
- Decentralised: quality improvement among professionals, both individually and in multidisciplinary teams;
- Central view: performance management across the system and its components: the system isn't working well unless the whole system is working well.

We don't live in a vacuum
- Lots of quality improvement goes on throughout the sector, but we suspect that clinical governance may be patchy, done better in some places than in others;
- A fair bit of performance management goes on throughout the sector, but we suspect that it could be better aligned to quality improvement and integration.

So… we need to:
- observe systems and processes across the complex system;
- have good information across the system;
- encourage reflection and learning across the system;
while promoting individual professional participation in safety and quality improvement activities, and collaborative development of integrated services in a way which improves a number of things, including quality and safety.

What to measure, how, and who does it?

What to measure, and who does it?
System measures:
- Responsibility across the District for elements of those measures: a system isn't good unless all of its components contribute;
- Nationally defined;
- Based upon the HQSC Triple Aim, with a capacity/capability element;
- An impetus towards integration.
Contributory measures for improvement:
- Largely locally determined (from a menu, which provides some definition and advice about how to manage the relevant information);
- Improvement measures should be chosen to contribute towards the goals captured in the system measures, but should reflect local priorities for doing so.

Constructing measures
System measures:
- Small number, nationally determined;
- Several life-cycle based, some capability based;
- Composite measures: a mixture of outcomes and outputs;
- Used to assess performance of systems at District, and potentially regional, level.
Improvement measures:
- Large number, locally chosen through alliances;
- Largely process/activity measures for local quality improvement;
- Some elements of quality assurance: RNZCGP Foundation Standards.
Alliances are the engine.

Examples of measures
System measure: Supported end of life. Composite of:
- Percentage of deaths in usual place of residence;
- Average number of hospital days in the last six months of life;
- Average number of urgent ambulance transfers in the last six months of life.
Contributory measures:
- Number of advance care directives in place;
- Pain intensity quantified;
- Plan of care for pain;
- Aperients/laxatives initiated in patients on opioids;
- Polypharmacy.
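A composite like this has to combine indicators measured on different scales, some where more is better and some where fewer is better. The sketch below is a minimal illustration of one common approach, rescaling each indicator to 0..1 and averaging; the reference ranges and equal weights are illustrative assumptions, not the actual national or HQSC method.

```python
def normalise(value: float, worst: float, best: float) -> float:
    """Rescale an indicator to 0..1, where 1 is always the better end.
    Works whether 'better' means higher (best > worst) or lower."""
    scaled = (value - worst) / (best - worst)
    return max(0.0, min(1.0, scaled))

def end_of_life_composite(pct_deaths_usual_residence: float,
                          avg_hospital_days_last_6m: float,
                          avg_urgent_transfers_last_6m: float) -> float:
    # Reference ranges and equal weights below are illustrative assumptions.
    components = [
        normalise(pct_deaths_usual_residence, worst=0, best=100),
        normalise(avg_hospital_days_last_6m, worst=30, best=0),   # fewer is better
        normalise(avg_urgent_transfers_last_6m, worst=3, best=0), # fewer is better
    ]
    return sum(components) / len(components)

score = end_of_life_composite(60, 12, 1.2)
print(f"composite score: {score:.2f}")
```

Orienting every component so that 1 is the better end is what lets "more is better" and "fewer is better" indicators be averaged directly; the substantive policy choices are then confined to the weights and reference ranges.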

What do we expect to happen?
Promote integration through:
- Clear articulation of what constitutes good health care across the system;
- Joint accountability across all participants in the system for achieving good services.
Promote quality improvement through:
- Building on alliance structures to identify local priorities and areas for quality improvement and service development;
- Consistent expectations about the capacity and capability of quality improvement activities.

Squaring the circle
Addressing the dilemma of:
- undertaking performance measurement…
- while supporting quality improvement…
- while encouraging local collaboration…
- in a complex environment with limited direct levers for control.
There is no perfect answer to achieving these different goals at the same time, but there are better and worse trade-offs between them.