PPA 502 – Program Evaluation Lecture 10 – Maximizing the Use of Evaluation Results.

Ad Hoc Evaluations
Most evaluations do not have an immediate, concrete, and observable effect on specific decisions and program practices.

Ad Hoc Evaluations
The most typical impact is one in which the evaluation findings provide additional pieces of information in the difficult puzzle of program action, permitting some reduction in uncertainty. Evaluation results are often absorbed quickly into an organization's culture, changing program reality, but usually not the operation of the program.

Ad Hoc Evaluations
Five ways to increase use of evaluation results:
– Develop realistic recommendations that focus on program improvement.
– Explore multiple uses of study data.
– Constantly remind decision makers of findings and recommendations.
– Share findings and recommendations with broad audiences.
– Assign evaluation staff to assist in implementing recommendations.

Ad Hoc Evaluations
Develop realistic recommendations that focus on program improvement.
– Decision makers want to improve programs. They appreciate focused evaluations that provide realistic options based on a systematic and independent assessment of programs.
– One approach: complete a thorough analysis of the first 10 percent of the findings. This analysis often provides a valid basis for developing a set of potential recommendations for program change, which should then be reviewed informally by as many stakeholders as possible.

Ad Hoc Evaluations
Explore multiple uses of study data.
– Evaluators often limit the use of evaluation data to the questions or hypotheses under investigation.
– Backup data should be made available in summary tables at the end of the report so that different audiences can use them for different reasons (a minimal example of building such tables appears below).
– The act of sharing data creates goodwill and maximizes the use of the evaluation effort.
– Example: the Michigan Department of Social Services produced demographic program profiles of clients broken down by county.
– Sharing data also creates the potential for unintended program changes.
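
As a concrete illustration of packaging backup data as appendix summary tables, here is a minimal sketch using Python and pandas. The file name and column names (client_extract.csv, county, age_group, service_type) are hypothetical placeholders, not part of the lecture's example.

import pandas as pd

# Hypothetical client-level extract; file and column names are illustrative only.
clients = pd.read_csv("client_extract.csv")  # one row per client: client_id, county, age_group, service_type

# County-by-age-group demographic profile, in the spirit of the Michigan county profiles.
county_profile = (
    clients.groupby(["county", "age_group"])
    .size()
    .unstack(fill_value=0)   # counties as rows, age groups as columns
)

# Export as an appendix table that other audiences can reuse for their own questions.
county_profile.to_csv("appendix_county_profiles.csv")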

Ad Hoc Evaluations
Constantly remind decision makers of findings and recommendations.
– Write agency newsletter articles describing the findings, the recommendations, and successful implementation of changes to educate the members of the organization.
– Prepare presentations for the agency director to ensure that the findings are incorporated into the director's thinking and public statements.

Ad Hoc Evaluations
Constantly remind decision makers of findings and recommendations (contd.).
– Make recommendations for similar changes in other program areas so the organization's members begin to think beyond the one-time implications of a single evaluation.
– Remind other managers of evaluation results during critical executive committee meetings or in informal sessions.

Ad Hoc Evaluations
Share findings and recommendations with broad audiences.
– Stakeholders are rarely interested in the methodology, but they are greatly interested in how the program should change as a result of the new information from the study.
– Evaluation findings and recommendations should be presented in a concise, factual executive summary, with a technical appendix or report available for a complete understanding of the methodology and statistics used.
– Identify the limitations of the design and findings, but highlight the recommended options available to decision makers.

Ad Hoc Evaluations
Share findings and recommendations with broad audiences (contd.).
– Sharing results with broad audiences:
  – General public – interviews.
  – Oversight organizations – briefings.
  – Universities – lists of completed studies, copies of studies, and implementation information.
  – Other professional staff – share results and publish them in professional journals.

Ad Hoc Evaluations
Assign evaluation staff to assist in implementing recommendations.
– Evaluation staff can gain valuable program experience and future credibility by being assigned to assist program staff in implementing recommendations from evaluation findings.

Ongoing Performance (Outcome) Monitoring
The key problem with ad hoc evaluation is the limited window of usefulness for cross-sectional information.

Ongoing Performance (Outcome) Monitoring
One way to provide continuous evaluation data is to design a strategy for ongoing collection of client outcome information.
– With such data the organization might regularly develop ways to improve the program by assessing current outcome impact.
– Public organizations need to institutionalize the collection of outcome information so it is regularly assessed and released in management reports.

Ongoing Performance (Outcome) Monitoring
Uses for outcome information:
– Orient advisory board members on the impact of programs.
– Provide outcome data to demonstrate the effectiveness of programs.
– Use outcome information to market programs.
– Use outcome information to identify weaknesses and improve program performance.
– Demonstrate accountability.

Ongoing Performance (Outcome) Monitoring
Uses for outcome information (contd.):
– Support resource allocation requests.
– Justify budget requests.
– Bolster employee motivation.
– Support performance contracting.
– Implement quality control checks on efficiency measurement.
– Enhance management control.
– Improve communication between citizens and government officials.
– Improve services to customers.

Ongoing Performance (Outcome) Monitoring
Elements to encourage greater use of outcome monitoring data:
– Timely data – available within about six months.
– Detailed breakouts – disaggregate by important program characteristics (see the sketch below).
– Worker participation – program staff should participate actively in both the selection of the outcome measures and the information collection.
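
To illustrate what detailed breakouts might look like in practice, here is a minimal sketch that disaggregates a single outcome by program characteristics. It assumes a hypothetical quarterly outcome file with columns such as site, caseworker, and employed_at_followup; none of these names come from the lecture.

import pandas as pd

# Hypothetical quarterly outcome file; column names are placeholders.
outcomes = pd.read_csv("quarterly_outcomes.csv")  # client_id, site, caseworker, employed_at_followup (0/1)

# Disaggregate the outcome by important program characteristics (here: site and caseworker).
breakout = (
    outcomes.groupby(["site", "caseworker"])["employed_at_followup"]
    .agg(clients="count", success_rate="mean")
    .round(2)
)
print(breakout)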

Ongoing Performance (Outcome) Monitoring
Elements to encourage greater use of outcome monitoring data (contd.):
– Perception that data are valid (see the sketch below):
  – Check accuracy.
  – Publish the results of validity checks.
  – Cross-check outcomes.
  – Obtain a high response rate.
  – Ensure high face validity.
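
One way to operationalize two of these validity checks is sketched below, assuming a hypothetical follow-up survey file and an administrative wage file. It simply reports the response rate and the agreement between self-reported and administrative outcomes; all names are illustrative.

import pandas as pd

# Hypothetical files; names and columns are placeholders.
survey = pd.read_csv("followup_survey.csv")     # client_id, responded (0/1), reported_employed (0/1)
admin = pd.read_csv("admin_wage_records.csv")   # client_id, wages_reported (0/1)

# Response rate: a low rate undermines confidence in the outcome data.
response_rate = survey["responded"].mean()
print(f"Follow-up response rate: {response_rate:.0%}")

# Cross-check outcomes against an independent administrative source.
merged = survey[survey["responded"] == 1].merge(admin, on="client_id")
agreement = (merged["reported_employed"] == merged["wages_reported"]).mean()
print(f"Agreement with administrative records: {agreement:.0%}")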

Ongoing Performance (Outcome) Monitoring
Elements to encourage greater use of outcome monitoring data (contd.):
– Demonstrate the usefulness of the outcome data:
  – Encourage and train managers to seek information about which program characteristics are affecting or explaining changes in the outcomes.
  – Encourage program managers to provide explanatory information along with the performance reports they provide to higher levels, particularly if reported outcomes differ from expected outcomes.
  – Publicize the outcomes for work units internally to create constructive competition between work groups to improve outcome results.
  – Require program managers to estimate the impact of their budget requests on subsequent outcome levels.
  – Encourage program managers to use outcome information in their speeches, daily discussions, meetings, and press interviews.

Ongoing Performance (Outcome) Monitoring
Elements to encourage greater use of outcome monitoring data (contd.):
– Repeat the measurements. Outcome measurements should be collected at least quarterly so that:
  – Changes over time can be identified.
  – The agency gains more confidence in consistent data.
  – A historical database can be developed (see the sketch below).
– Mandate outcome reporting.
  – There is a trend at the federal and local levels to require the collection of outcome information.
  – Program outcome reporting should be required as part of budget justification.
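
A minimal sketch of building the historical database mentioned above: each quarterly extract is summarized and appended to a SQLite table so trends can be compared across quarters. File, table, and column names are hypothetical assumptions.

import sqlite3
import pandas as pd

# Hypothetical quarterly extract; names are placeholders.
quarter = pd.read_csv("outcomes_2024q1.csv")    # client_id, program, completed (0/1)

summary = (
    quarter.groupby("program")["completed"]
    .agg(clients="count", completion_rate="mean")
    .reset_index()
)
summary["period"] = "2024Q1"

# Append to the historical database so each quarter adds to the trend record.
with sqlite3.connect("outcome_history.db") as conn:
    summary.to_sql("quarterly_outcomes", conn, if_exists="append", index=False)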

Ongoing Performance (Outcome) Monitoring
Elements to encourage greater use of outcome monitoring data (contd.):
– Develop appropriate information systems. Some outcome information has unique characteristics that require adaptations of current program data collection systems to track clients over time, especially after they leave the program (a minimal schema sketch follows below).
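
The sketch below shows one simple way an information system could be adapted to track clients after they leave the program: a longitudinal table keyed on client and follow-up date, built here in SQLite. The schema and field names are illustrative assumptions, not a prescribed design.

import sqlite3

# One row per client per follow-up point, so outcomes can still be recorded after program exit.
schema = """
CREATE TABLE IF NOT EXISTS client (
    client_id  INTEGER PRIMARY KEY,
    program    TEXT,
    exit_date  TEXT
);
CREATE TABLE IF NOT EXISTS followup_outcome (
    client_id      INTEGER REFERENCES client(client_id),
    followup_date  TEXT,      -- e.g., 3, 6, or 12 months after exit
    employed       INTEGER,   -- 0/1 outcome indicator
    PRIMARY KEY (client_id, followup_date)
);
"""

with sqlite3.connect("outcome_tracking.db") as conn:
    conn.executescript(schema)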

Conclusions
Ad hoc evaluations:
– Develop realistic recommendations that focus on program improvement.
– Explore multiple uses of study data.
– Constantly remind decision makers of findings and recommendations.
– Share findings and recommendations with broad audiences.
– Assign evaluation staff to assist in implementing recommendations.

Conclusions
Ongoing performance (outcome) monitoring:
– Timely reports should be provided.
– Reports should include detailed breakdowns by program, client, and worker characteristics.
– Program staff should actively participate in defining outcome measures and in the data collection process.
– Outcome data should have high face validity.
– The use of outcome information should be demonstrated.
– Outcome measurements should be repeated on a regular basis.

Conclusions
Ongoing performance (outcome) monitoring (contd.):
– Performance monitoring can be mandated to ensure the collection of data over time and across different political administrations.
– Outcome information systems may need to be modified to reflect the unique needs of tracking clients over time.