Performance Measurement 201: Best practices in performance measure design & implementation Ia Moua, Deputy Director, Grants & Program Development Patrick Gianelli, Grants Specialist

Session Objectives
 Review how CV uses performance measures
 Share best practices in performance measure design
 Share best practices in implementation and data collection
 Explore CV's performance measure review process

PM Overview
Performance measurement is a systematic process for measuring program outputs and outcomes.
Outputs: Amount of service provided (people served, products created, or programs developed)
Outcomes: Changes or benefits that occur as a result of the intervention
 Can reflect changes in individuals, organizations, communities, or the environment
 Typically include changes in attitudes/beliefs, knowledge/skills, behavior, or conditions
 Must have a logical connection to the intervention and be aligned with outputs

Why Measure Performance?
 Recognition of progress; reflects change
   Collect reliable information about the intervention's implementation and progress toward outcomes
 Accountability to funders and stakeholders
   Communicate achievements in a meaningful and compelling way
 Program improvement
   Spot and correct problems
   Strengthen the intervention
   Determine technical assistance needs
   Determine where to allocate limited resources

How CV Uses Performance Measures
The PMW trumps the program narrative in the contract.
CA PMWs:
 Tell the story of the individual and collective impact of AmeriCorps programs
 Serve as the "blueprint" for understanding a program's allowable member activities and intended output and outcome targets
 Serve as a monitoring tool for CV to assess individual program progress against the awarded program design and theory of change

Best Practices: Performance Measure Design
 Select PMs that fit your program design and theory of change, not vice versa
 Less is more: focus on a small number of high-quality measures
 Measure outputs and outcomes for program beneficiaries
 Clearly define all terms used
 Ensure all PMW elements are logically aligned with the program's theory of change (community need, proposed interventions, and desired outcomes)
 For longer-term outcomes, set targets that are achievable in a single grant year

Best Practices: Performance Measure Design (cont.)
 Use numerical targets, not percentages
 For outcomes that require participant follow-up, set targets that take into account response rate and attrition
 Clearly distinguish outcomes from outputs while maintaining logical alignment
 Choose outcome measures that are ambitious but realistic; ensure that the program can realistically document or track the required information
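To illustrate the follow-up point above, here is a minimal sketch of the underlying arithmetic. The function name and all rates are hypothetical examples, not figures from this training: the idea is simply that an outcome target must be divided by the fraction of beneficiaries you expect to retain and hear back from.

```python
# Hypothetical target-setting sketch: how many beneficiaries must be
# enrolled to reach a numerical outcome target, given expected
# attrition and follow-up survey response rates (both assumed values).

import math

def required_enrollment(outcome_target, attrition_rate, response_rate):
    """Enrollment needed so that, after attrition and non-response,
    at least `outcome_target` beneficiaries can still be counted."""
    retained_fraction = (1 - attrition_rate) * response_rate
    return math.ceil(outcome_target / retained_fraction)

# Example: a target of 100 measured outcomes, assuming 20% attrition
# and a 60% follow-up response rate -> enroll at least 209.
print(required_enrollment(100, 0.20, 0.60))
```

The same calculation can be run in reverse when reviewing a proposed measure: if a program enrolls 150 beneficiaries, realistic attrition and response assumptions tell you whether its outcome target is achievable.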

Best Practices: Performance Measure Implementation & Data Collection
 Set up MOUs with service sites that clearly lay out data collection responsibilities and expectations
 Provide up-front training in data collection for members, site supervisors, and other program staff
 Obtain baseline data so that changes can be objectively assessed, rather than assessing perceptions of change retroactively
 Select data collection instruments that are valid (measure what they are supposed to measure) and reliable (yield consistent results)
 Keep data collection procedures consistent over time and across different sites

Best Practices: Performance Measure Implementation & Data Collection (cont.)
 Choose data collection instruments that are accessible and yield timely data
 Develop creative ways to improve beneficiary responses to data collection efforts (incentives, etc.)
 Allocate sufficient resources toward data collection efforts: money, time, personnel
 Build in time for data review and verification prior to compiling and submitting reports
 Incorporate data quality review protocols into monitoring visits to sites
 Share best practices between programs and site partners

YOUR TURN… Small Group Activity
Using the CV Performance Measure Review Checklist:
1. Take 10 minutes to individually assess the performance measure against the criteria assigned to your table.
2. Take 5 minutes to share your analysis with participants at your table.
3. Select a reporter to share your group's analysis with the larger group.

Additional Resources
Explore the online performance measurement courses available on the CNCS Knowledge Network:
 How to Use the CNCS National Performance Measure Instructions
 Building Evidence of Effectiveness
 Collecting High Quality Outcome Data, Part 1
 Collecting High Quality Outcome Data, Part 2
 Designing Effective Action for Change
 High Quality Performance Measurement
 Overview of Performance Measurement
Link:

Questions?

THANK YOU!