Using Data to Assess Quality Improvement Results
February 23, 2009
Presentation at the CLP/ADRC 2010 Annual Meeting
Debra J. Lipson


Objectives

- Provide basic guidelines for collecting & analyzing data for quality improvement (QI)
- Explain similarities & differences in data and measures for quality improvement and program evaluation
  - Illustrate with the person-centered hospital discharge planning (PC-HDP) program

Collecting Data for QI

- If possible, make use of existing data:
  - Program records
  - Routine, ongoing surveys
  - Hospital discharge data
- But existing data must:
  - Have the information you need to assess the results of QI
  - Distinguish program participants from non-participants
  - Be available (data access? privacy issues?)

Collecting Data for QI (2)

- If existing data are not available or suitable, develop new data collection tools that:
  - Are feasible, e.g. not too costly or burdensome for program staff to administer and record
  - Use questions from national surveys
- Collect data from all participants:
  - Those with whom you try the QI plan
  - Not a sample or select group

Collecting Data for QI (3)

- Assemble/collect data before the QI intervention ("baseline"):
  - Over a period of time, e.g. at least a year
  - Consider whether program planning may affect program results even before implementation
- Check that data are consistently reported:
  - By different program staff
  - Across program sites
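The baseline check above can be sketched in a few lines of Python. The site names, observation dates, and the roughly-one-year threshold are illustrative assumptions, not actual PC-HDP data:

```python
from datetime import date

# Hypothetical baseline observation dates per program site;
# names and dates are made up for illustration.
baseline_dates = {
    "Site A": [date(2008, 1, 15), date(2008, 6, 1), date(2008, 12, 20)],
    "Site B": [date(2008, 9, 1), date(2008, 12, 1)],
}

def baseline_span_days(dates):
    """Days between the earliest and latest baseline observation."""
    return (max(dates) - min(dates)).days

for site, dates in baseline_dates.items():
    span = baseline_span_days(dates)
    # 334 days is roughly 11 months; an arbitrary "close to a year" cutoff.
    status = "OK" if span >= 334 else "covers less than ~1 year"
    print(f"{site}: {span} days of baseline data ({status})")
```

A check like this, run per site and per reporting staff member, also surfaces the consistency problems the slide warns about.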

Review & Analyze QI Data

- Generate reports from existing/new data:
  - Regularly (weekly/monthly)
  - With little time lag after the QI "do" phase, so you can study the results promptly
- Compare to baseline and, if relevant, to intermediate targets
- Disaggregate by participant characteristics to discern patterns:
  - Medicaid or other insurance
  - Age
  - Location
  - Availability of informal support
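As a minimal sketch of the review steps above, the snippet below disaggregates a discharge-to-home rate by insurance type and compares the QI phase to baseline. The records, field names (`insurance`, `period`, `discharged_home`), and values are hypothetical; a real report would draw on program records or hospital discharge data:

```python
from collections import defaultdict

# Hypothetical participant records; fields and values are
# illustrative assumptions, not actual program data.
records = [
    {"insurance": "Medicaid", "period": "baseline", "discharged_home": True},
    {"insurance": "Medicaid", "period": "baseline", "discharged_home": False},
    {"insurance": "Medicaid", "period": "qi", "discharged_home": True},
    {"insurance": "Medicaid", "period": "qi", "discharged_home": True},
    {"insurance": "Other", "period": "baseline", "discharged_home": True},
    {"insurance": "Other", "period": "baseline", "discharged_home": False},
    {"insurance": "Other", "period": "qi", "discharged_home": True},
    {"insurance": "Other", "period": "qi", "discharged_home": False},
]

def home_discharge_rates(records):
    """Discharge-to-home rate, disaggregated by insurance type and period."""
    counts = defaultdict(lambda: [0, 0])  # (insurance, period) -> [home, total]
    for r in records:
        key = (r["insurance"], r["period"])
        counts[key][0] += int(r["discharged_home"])
        counts[key][1] += 1
    return {key: home / total for key, (home, total) in counts.items()}

rates = home_discharge_rates(records)
for ins in ("Medicaid", "Other"):
    change = rates[(ins, "qi")] - rates[(ins, "baseline")]
    print(f"{ins}: baseline {rates[(ins, 'baseline')]:.0%}, "
          f"QI phase {rates[(ins, 'qi')]:.0%}, change {change:+.0%}")
```

The same grouping pattern extends to the other characteristics on the slide (age, location, informal support) by widening the key.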

Useful Questions in Reviewing QI Data

- Are we carrying out activities as planned?
  - At the planned level/quantity?
  - Frequency?
  - Intensity?
- If there are multiple program sites or providers:
  - Are there trends/patterns by type or level of activity?
- Are activities producing the expected results?
  - If not, are resources adequate?
  - How should activities be modified or changed?

Quality Improvement vs. Program Evaluation

Quality improvement asks:
- Are we doing the activities we said we would with the grant funds?
- Are the activities producing expected results/making progress towards goals?
- Example: Is our PC-HDP program giving participants the information and resources they need to transition home?

Program evaluation asks:
- Did the program cause the outcomes or impact?
- Why and how did the program help (or not) achieve the outcomes/impact?
- Example: Did the PC-HDP program shift state or regional post-hospital care use/spending patterns towards HCBS (from SNFs)?

Types of Data and Measures

A logic model is the sequence of activities thought to bring about change, and how those activities are linked to the results (outputs, outcomes, and impact) the program expects to achieve ("If this happens, then...."). Its components:

- Inputs: staff, funds, organizational, and community resources
- Activities: actions, processes, tools, events
- Outputs: products of activities (counts)
- Outcomes: changes in participant behaviors, knowledge, health status, function
- Impact: changes in organizations, communities, or systems

Quality improvement tracks the early links in this chain (inputs, activities, outputs) with process measures; program evaluation focuses on the later links (outcomes, impact) with outcome measures. Both relate back to the program's aim.

Using QI to Assess PC-HDP Progress

Activities
- PC-Hospital Discharge Program element: organize and conduct PC-HDP training programs for hospital discharge planners
- Measurable target: each year, hold 6 workshops (1 at each of 6 hospitals); total of 24 discharge planners
- Data: program records
- 3-month performance: held 3 workshops, with 12 discharge planners

Outputs
- Measurable target: workshop participants score at least 90% on a test of knowledge about PC-HDP tools and HCBS resources
- Data: participant tests (before and after)
- 3-month performance: before-and-after tests of knowledge showed 50% improvement, but the average test score is 75%

Outcomes
- Measurable target: 10% annual increase in Medicaid beneficiaries discharged to home
- Data: hospital discharge data for program participants and non-participants
- 3-month performance: hardly any change for program participants (no surprise)
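A simple way to read a 3-month review like this one is to compare actuals against the annual targets pro-rated to date. The numbers below come from the slide; the linear pro-rating is a simplifying assumption for illustration, since workshops need not be spread evenly across the year:

```python
# Annual targets and 3-month actuals for the PC-HDP training activity
# (figures taken from the slide above).
annual_targets = {"workshops": 6, "planners_trained": 24}
actuals_at_3_months = {"workshops": 3, "planners_trained": 12}

def share_of_prorated_target(actual, annual_target, months_elapsed=3):
    """Actual performance as a share of the annual target pro-rated to date."""
    prorated_target = annual_target * months_elapsed / 12
    return actual / prorated_target

for measure, target in annual_targets.items():
    share = share_of_prorated_target(actuals_at_3_months[measure], target)
    print(f"{measure}: {share:.0%} of the pro-rated 3-month target")
```

On these figures the activity is running ahead of a linear schedule even though the output (test scores) and outcome (discharges to home) targets are not yet met, which is exactly the kind of pattern the QI review questions are meant to surface.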

If you do not know how an outcome relates to a process of care or service delivery, you cannot know what to do to achieve the outcome. QI data tell you whether you are taking the right steps to achieve the outcomes.

- Questions?
- Contact information: