GOOD INTENTIONS: USING DATA TO IMPROVE PERFORMANCE
Prepared & Presented By: Renata Cobbs Fletcher, Consultant, M.H. West & Co.
Persistently Dangerous Schools Grantee Conference, September 20-21, 2011

PURPOSE OF PRESENTATION
• To gain a deeper understanding of why data is a critical program management, budget, and sustainability tool
• To gain deeper knowledge about how data can be used strategically and systematically
• To learn to love data

WHY ARE WE COLLECTING DATA?
Friend to Groucho Marx: "Life is difficult!"
Marx to Friend: "Compared to what?"

WHY ARE WE COLLECTING DATA?
1. What gets measured gets done.
2. If you don't measure results, you can't tell success from failure.
3. If you can't see success, you can't reward it.
4. If you can't reward success, you're probably rewarding failure.
5. If you can't see success, you can't learn from it.
6. If you can't recognize failure, you can't learn from it.
7. If you can demonstrate results, you can win public support.
Adapted from Osborne, D., and T. Gaebler (1992), in Patton, M. Q. (1997). Utilization-Focused Evaluation (3rd ed.). Thousand Oaks, CA: Sage.

OTHER REASONS TO LOVE DATA
• Staff and other program stakeholders can be motivated by the results
• Even poor results can serve as the basis for seeking expanded funding for increased staffing, bigger facilities, expanded programming, etc.
• If accurate, it is the truth
• It can tell us what we need to do less of or more of
• It can tell us what we might not be collecting and should be

HATERS: WHY DOES DATA COLLECTION HAVE SO MANY?
• We don't have the time to focus on it
• I can't stand navel gazing
• I know that people will make things up to make the program look good
• We can tell you now what the data will say
• I don't need data to tell me anything. I know in my heart this program works
• The data doesn't tell the story of the great things we do
• I don't understand the data

EXERCISE: ANALYZE YOUR PROGRAM'S DATA
• Are your organization's results to date what you expected? Why or why not?
• Can you explain the results? How?
• Is there anything you feel is missing from the results?
• Can the results help improve the program? How? (See the sketch below for one simple way to compare expected and actual results.)
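A minimal sketch of one way to work through the first exercise question, comparing actual results to expectations. The metric names and numbers are illustrative assumptions, not data from the presentation or any grantee program.

```python
# Illustrative only: metric names and targets are assumed for the exercise,
# not taken from the presentation or any real program's data.
expected = {"referrals": 200, "enrollments": 150, "completions": 110}
actual = {"referrals": 240, "enrollments": 130, "completions": 95}

for metric, target in expected.items():
    observed = actual.get(metric, 0)
    gap = observed - target
    direction = "above" if gap >= 0 else "below"
    print(f"{metric}: expected {target}, actual {observed} "
          f"({abs(gap)} {direction} expectation)")
```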

TIPS FOR USING DATA
• Information gleaned from data and research must be operationalized for maximum effectiveness and impact
• Data must be disseminated to all partners on a frequent and systematic basis
• All partners must be actively engaged in data review, analysis, and interpretation

TIPS FOR USING DATA
• Run internal reports regularly (weekly, bi-weekly, monthly, quarterly, annually); see the sketch below for a minimal example
• Convene with staff to review and discuss data reports (weekly, bi-weekly, monthly, quarterly, annually)
• Make certain that all staff and program stakeholders know what the benchmarks, goals, and outcomes are for the program and for each component
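As a concrete illustration of the first tip, here is a minimal sketch of a recurring internal report. It assumes a hypothetical CSV export (participants.csv) with one row per site per period and columns site, enrolled, and completed; the column names and benchmark targets are assumptions for illustration, not part of the presentation.

```python
import pandas as pd

# Assumed benchmark targets; replace with your program's actual goals.
BENCHMARKS = {"enrollment": 120, "completion_rate": 0.75}

def internal_report(path: str = "participants.csv") -> pd.DataFrame:
    """Summarize enrollment and completion by site and flag gaps against benchmarks."""
    df = pd.read_csv(path)  # hypothetical export: columns site, enrolled, completed
    summary = df.groupby("site").agg(
        enrolled=("enrolled", "sum"),
        completed=("completed", "sum"),
    )
    summary["completion_rate"] = summary["completed"] / summary["enrolled"]
    summary["meets_enrollment_goal"] = summary["enrolled"] >= BENCHMARKS["enrollment"]
    summary["meets_completion_goal"] = summary["completion_rate"] >= BENCHMARKS["completion_rate"]
    return summary

if __name__ == "__main__":
    print(internal_report())
```

Run on whatever cadence the team agrees to, a report like this gives staff and stakeholders a shared, concrete basis for the review meetings described in the next tips.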

TIPS FOR USING DATA
• Staff and program stakeholders must all be fully informed about what the goals, benchmarks, and outcomes are for the program and for each component
• Staff and program stakeholders must agree to take responsibility for the realization of intended goals, benchmarks, and outcomes through concrete, agreed-upon strategies and timelines
• Staff and program stakeholders must meet to review and discuss data reports (weekly, bi-weekly, monthly, quarterly, annually)

TIPS FOR USING DATA
• Primary, secondary, and tertiary benefits should be identified (primary outcomes, cost benefits, and participants' self-reported perceptions of positive changes in themselves, such as reduced depression or greater happiness)
• Budgets should be regularly reviewed through the lens of the data collected and revised accordingly (a simple sketch follows)
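One way to review a budget "through the lens of the data" is to compute cost per primary outcome for each program component. The component names, spending figures, and outcome counts below are illustrative assumptions only, not figures from the presentation.

```python
# Illustrative figures only; substitute your program's actual budget and outcome data.
spending = {"mentoring": 48000.00, "tutoring": 36000.00, "family_outreach": 16000.00}
outcomes = {"mentoring": 60, "tutoring": 90, "family_outreach": 25}  # participants achieving the primary outcome

for component, cost in spending.items():
    achieved = outcomes.get(component, 0)
    if achieved:
        print(f"{component}: ${cost / achieved:,.0f} per primary outcome ({achieved} achieved)")
    else:
        print(f"{component}: no primary outcomes recorded yet (${cost:,.0f} spent)")
```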

TIPS FOR USING DATA
• Disseminate findings externally and strategically for public relations, marketing, and requests to current and future funders for financial support and sustainability

CONCLUSION & Q&A
"One of the great mistakes is to judge policies and programs by their intentions rather than their results." - Milton Friedman