Identifying, Monitoring, and Assessing Promising Innovation: Using Evaluation to Support Rapid Cycle Change. July 18, 2011. Presentation at a meeting sponsored by the Alliance for Health Reform.

Presentation transcript:

Identifying, Monitoring, and Assessing Promising Innovation: Using Evaluation to Support Rapid Cycle Change
July 18, 2011. Presentation at a meeting sponsored by the Alliance for Health Reform
Marsha Gold
Based on a paper by Gold, Helms, and Guterman for The Commonwealth Fund

Issues Addressed in Paper
- Focusing on change that matters (agenda setting)
- Documenting innovations to support effective learning and spread (formative feedback and process analysis)
- Balancing flexibility and speed with rigor in developing evidence to support policy change (outcomes evaluation)

Focusing on Change That Matters
- $10 billion isn't that much money
- Is there potential for a large impact on a joint quality/cost metric? Large gains over small populations or small gains over large populations
- Plausibility: back-of-the-envelope estimates of potential, given numbers, demographics, and range of likely effects
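The back-of-the-envelope plausibility check described above can be sketched in a few lines. All figures and names below are illustrative assumptions, not numbers from the paper; the point is only that a small per-person gain over a large population can rival a large gain over a small one.

```python
# Hypothetical back-of-the-envelope check: could an innovation plausibly
# move the joint quality/cost metric enough to matter?

def potential_savings(population, cost_per_person, relative_reduction):
    """Total dollars saved if the innovation cuts per-person cost
    by the given fraction across the target population."""
    return population * cost_per_person * relative_reduction

# Small gain over a large population...
broad = potential_savings(population=30_000_000, cost_per_person=10_000,
                          relative_reduction=0.01)
# ...versus a large gain over a small population.
targeted = potential_savings(population=300_000, cost_per_person=50_000,
                             relative_reduction=0.20)

print(f"Broad, small effect:    ${broad:,.0f}")
print(f"Targeted, large effect: ${targeted:,.0f}")
```

Under these made-up inputs both scenarios yield the same total, which is exactly the kind of rough equivalence such an estimate is meant to surface before any formal evaluation is designed.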

Documenting and Learning from Innovation I
- What does the innovation seek to achieve, and how? Logic models, defined measures for success
- Tracking what was implemented and when, versus what was planned
- Timely measurement and feedback to innovators on metrics that matter to them
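Tracking implemented-versus-planned can be as simple as a small data structure. The sketch below is a hypothetical illustration (the milestone names, dates, and field names are invented here, not drawn from the paper), showing one way to record planned and actual dates and report slippage.

```python
# A minimal sketch of tracking what was implemented and when,
# versus what was planned. All milestones here are hypothetical.
from dataclasses import dataclass
from datetime import date
from typing import Optional

@dataclass
class Milestone:
    name: str
    planned: date
    actual: Optional[date] = None  # None until implemented

    @property
    def slip_days(self):
        """Days behind (+) or ahead (-) of plan; None if not done yet."""
        if self.actual is None:
            return None
        return (self.actual - self.planned).days

plan = [
    Milestone("Hire care coordinators", date(2011, 1, 15), date(2011, 2, 20)),
    Milestone("Launch patient registry", date(2011, 3, 1)),
]
for m in plan:
    status = ("not yet implemented" if m.slip_days is None
              else f"{m.slip_days:+d} days vs. plan")
    print(f"{m.name}: {status}")
```

Feeding this kind of record back to innovators on a regular cycle is one way to make the "timely measurement and feedback" point above concrete.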

Documenting and Learning from Innovation II
- Efficiency: investing in shared metrics and approaches for cross-site learning
  - Characteristics of innovations
  - Characteristics of context
  - Common metrics of success
- Realistic expectations: implementation always takes longer than expected, and more so if the context is complex
- Minimize barriers that slow or drain momentum

Evidence to Support Broad Program Change
- HHS has authority, but the CMS actuary must certify that a change won't add to program costs (and, for the Secretary, that it has demonstrated the potential to improve quality)
- What will/should be the standard of evidence?

Historical Approaches to Evaluation
- Careful definition of the target population for the purpose of judging success
- One or more otherwise-similar comparison groups to serve as a benchmark
- Metrics typically constructed from centralized data files, existing or new
- Long time frame to distinguish short-term effects from stable long-term effects
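The comparison-group logic above is often operationalized as a difference-in-differences estimate: the change in the treatment group net of the change in the otherwise-similar comparison group. A minimal sketch, with entirely hypothetical numbers:

```python
# Difference-in-differences: the comparison group serves as the
# benchmark for what would have happened without the innovation.

def diff_in_diff(treat_pre, treat_post, comp_pre, comp_post):
    """Change in the treatment group minus the change in the
    comparison group over the same period."""
    return (treat_post - treat_pre) - (comp_post - comp_pre)

# Hypothetical example: readmission rates (%) before and after.
effect = diff_in_diff(treat_pre=18.0, treat_post=15.0,
                      comp_pre=18.5, comp_post=17.5)
print(f"Estimated effect: {effect:+.1f} percentage points")
```

The subtraction of the comparison group's change is what distinguishes the innovation's effect from secular trends that would have occurred anyway, which is why the "otherwise similar" requirement matters so much.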

Likely Reality of Innovation Testing
- National demonstrations across widely divergent organizations
- Bottom-up innovation with variation in detail across sites
- Contaminated comparison groups
- Desire for rapid feedback on the right direction, even though some effects may take time to surface

Realism on Standards of Evidence
- Trade-offs between type 1 and type 2 errors
- Moving too quickly versus too slowly on improvements: how good are things now, and what risks are there in change?
- Congressional history: legislators have acted before evaluations were done; they have also failed to act despite evaluation results showing what was or was not successful
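The type 1 / type 2 trade-off can be made tangible with a standard power calculation. The sketch below uses a normal approximation for a two-group comparison; the effect size and sample sizes are hypothetical, and the point is simply that tightening the false-positive threshold (alpha) raises the false-negative rate for a fixed sample.

```python
# Type 1 vs. type 2 errors: a normal-approximation power sketch.
from statistics import NormalDist

def power(effect_size, n_per_group, alpha=0.05):
    """Approximate probability of detecting a true standardized effect
    with a two-sided z-test; 1 - power is the type 2 error rate.
    (Ignores the negligible far-tail rejection region.)"""
    z_crit = NormalDist().inv_cdf(1 - alpha / 2)
    se = (2 / n_per_group) ** 0.5  # SE of the standardized mean difference
    return 1 - NormalDist(mu=effect_size / se).cdf(z_crit)

# Tightening alpha (fewer false positives) lowers power
# (more false negatives), for the same sample size.
for alpha in (0.10, 0.05, 0.01):
    print(f"alpha={alpha:.2f}  power={power(0.2, 200, alpha):.2f}")
```

For demonstrations, where "moving too slowly" has its own cost, this is the quantitative face of the slide's question: how much type 2 risk is acceptable in exchange for stronger protection against type 1 error?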

Conclusions
- Demonstrating the feasibility of implementation is an important and potentially powerful outcome
- Utility of pilots is enhanced by high-quality evidence that:
  - Clarifies in advance what is to be tested and why
  - Provides ongoing, consistent, and timely measures of intended and unintended changes
  - Includes appropriate analysis to aid in attribution
  - Gives enough information on context and implementation to allow suitability and spread to be assessed by diverse stakeholders

Ultimate Policy/Research Challenge
- Trade-off between rigor and rigor mortis
- Distinguish useful initiatives to propagate from efforts that mainly preserve the status quo or are actually harmful
- Avoid stifling innovation to improve the system because no data are good enough
- Appropriate trade-offs are likely to vary across innovations at different stages or with different risks/rewards

For More Information
Mathematica® is a registered trademark of Mathematica Policy Research.
Download the Gold, Helms, and Guterman paper at
Contact Marsha Gold at or