TOPIC WORKSHOP: DATA AND MEASUREMENT
Identifying, defining and collecting measures from PDSAs
Early Years Collaborative: Learning Session 5

The aim of any improvement project is to make services and processes better. That might mean making them:
- safer (fewer errors)
- more effective (using evidence)
- more efficient (less waste)
- more person-centred (fitting with family requests)
- equitable
- timely

Measurement for improvement asks questions like:
- What does "better" look like?
- How will we recognise better when we see it?
- How do we know if a change is an improvement?

Discuss with colleagues at your table what "better" would look like for the improvement project you're working on.

Without change we won't make improvements. Without measurement we won't know if we have improved.

To plan and implement measurement for improvement, you need a measurement strategy (including a measurement plan).

The Quality Measurement Journey:
AIM (How good? By when?) → Concept → Measures → Operational Definitions → Data Collection Plan → Data Collection → Analysis → PDSA
Source: R. Lloyd, Quality Health Care: A Guide to Developing and Using Indicators. Jones and Bartlett, 2004.

Our focus for today is the middle section of this journey: Measures, Operational Definitions and the Data Collection Plan.

The measurement plan template has one row per measure, with the following columns: #, Measure Name, Type, Measure Description, Reporting Start Date, Reporting Frequency, Category, Numerator Name, Numerator Description, Denominator Name, Denominator Description, Sampling Plan.

Design an initial Measurement Plan. There are three objectives at this stage:
1. Turn the concepts of quality into actual measurables (counts, values, percentages or rates).
2. Define each measure: a clear and unambiguous description, in quantifiable terms, of what to measure and the steps to measure it consistently.
3. Develop a data collection plan.

Turn the concepts of quality into actual measurables (counts, values, percentages or rates). What will we count or measure to be able to demonstrate improvement?
- Number
- Percentage
- Rate
- Days between
- Cases between
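To make these measure types concrete, here is a minimal sketch (not taken from the slides; the field names and data are illustrative) of turning raw weekly records into a count, a percentage, a rate and days between events:

```python
from datetime import date

# Hypothetical weekly records: one row per child expected to have a review.
records = [
    {"child_id": 1, "review_done": True,  "review_date": date(2014, 6, 9)},
    {"child_id": 2, "review_done": False, "review_date": None},
    {"child_id": 3, "review_done": True,  "review_date": date(2014, 6, 12)},
    {"child_id": 4, "review_done": True,  "review_date": date(2014, 6, 13)},
]

# Number: how many reviews were completed.
number_done = sum(1 for r in records if r["review_done"])

# Percentage: completed reviews out of all expected reviews.
percentage = 100 * number_done / len(records)

# Rate: the same numerator expressed per 1000 expected reviews.
rate_per_1000 = 1000 * number_done / len(records)

# Days between: gaps between consecutive completed reviews,
# useful when the event of interest is rare.
dates = sorted(r["review_date"] for r in records if r["review_done"])
days_between = [(later - earlier).days for earlier, later in zip(dates, dates[1:])]

print(number_done, percentage, rate_per_1000, days_between)
```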

Turn the concepts of quality into actual measurables (counts, values, percentages or rates). Do we have a balanced family of the vital few measures, including:
- Process measures
- Outcome measures
- Balancing measures
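As a quick illustration (the measure names below are illustrative, not prescribed by the collaborative), checking that a family of measures is balanced can be as simple as checking that all three categories appear:

```python
# Illustrative measure names; only the categories matter for this check.
measures = [
    {"name": "Uptake of month child health review", "category": "Process"},
    {"name": "Children reaching expected developmental milestones", "category": "Outcome"},
    {"name": "Staff time spent completing reviews", "category": "Balancing"},
]

covered = {m["category"] for m in measures}
is_balanced = {"Process", "Outcome", "Balancing"} <= covered
print("Balanced family of measures:", is_balanced)
```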

The measurement plan template (same columns as above) offers a fixed set of options for three of its fields:
- Type: Count, Percent, Rate per 100, Rate per 1000, Days between, Cases between
- Reporting Frequency: Daily, Weekly, Monthly, Quarterly, Yearly
- Category: Outcome, Process, Balancing
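One way to keep these definitions unambiguous in practice (a sketch under the assumption that you hold the plan in code rather than a spreadsheet; this is not the collaborative's official tooling) is to represent each template row as a small data structure that rejects values outside the options above:

```python
from dataclasses import dataclass

MEASURE_TYPES = {"Count", "Percent", "Rate per 100", "Rate per 1000",
                 "Days between", "Cases between"}
FREQUENCIES = {"Daily", "Weekly", "Monthly", "Quarterly", "Yearly"}
CATEGORIES = {"Outcome", "Process", "Balancing"}

@dataclass
class MeasureDefinition:
    name: str
    measure_type: str          # one of MEASURE_TYPES
    description: str
    reporting_start_date: str
    reporting_frequency: str   # one of FREQUENCIES
    category: str              # one of CATEGORIES
    numerator_name: str
    numerator_description: str
    denominator_name: str
    denominator_description: str
    sampling_plan: str

    def __post_init__(self):
        # Reject values outside the options listed in the template.
        if self.measure_type not in MEASURE_TYPES:
            raise ValueError(f"unknown measure type: {self.measure_type}")
        if self.reporting_frequency not in FREQUENCIES:
            raise ValueError(f"unknown reporting frequency: {self.reporting_frequency}")
        if self.category not in CATEGORIES:
            raise ValueError(f"unknown category: {self.category}")
```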

An example of a completed measurement plan entry:
- Measure Name: Uptake of month child health review
- Type: Percent
- Measure Description: Percentage of children who turned up for their child health review
- Reporting Start Date: 9 June 2014
- Reporting Frequency: Weekly
- Category: Process
- Numerator Name: Number of children who had a review
- Numerator Description: Count of the number of children who had a month child health review completed each week
- Denominator Name: Total number of children who should have had a review
- Denominator Description: Count of the total number of children who were expected to have a month child health review during each week
- Sampling Plan: All children each week, or a random sample of 5 children each week
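A minimal sketch of how this measure might be calculated each reporting week, assuming a simple list of the children expected to have a review that week with a completed/not-completed flag (the field names and data are illustrative); the random sample of 5 mirrors the second sampling option above:

```python
import random

# Hypothetical data for one reporting week.
expected_this_week = [
    {"child_id": 101, "review_completed": True},
    {"child_id": 102, "review_completed": False},
    {"child_id": 103, "review_completed": True},
    {"child_id": 104, "review_completed": True},
    {"child_id": 105, "review_completed": False},
    {"child_id": 106, "review_completed": True},
]

# Sampling option 1: all children each week.
numerator = sum(1 for c in expected_this_week if c["review_completed"])
denominator = len(expected_this_week)
uptake_percent = 100 * numerator / denominator

# Sampling option 2: a random sample of 5 children each week.
sample = random.sample(expected_this_week, k=min(5, len(expected_this_week)))
sample_numerator = sum(1 for c in sample if c["review_completed"])
sample_percent = 100 * sample_numerator / len(sample)

print(f"Uptake (all children): {uptake_percent:.1f}%")
print(f"Uptake (sample of {len(sample)}): {sample_percent:.1f}%")
```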

Table Exercise: What are the vital few "measures" that are useful for the improvement project you're working on?

Feedback from Tables

Thank you for a great session!