Evaluating Ongoing Programs: A Chronological Perspective to Include Performance Measurement. Summarized from Berk & Rossi's Thinking About Program Evaluation (Sage, 1990) and Martin & Kettner's Measuring the Performance of Human Service Programs (Sage, 1996).

Evaluating Ongoing Programs: A Chronological Perspective to Include Performance Measurement Summarized from Berk & Rossi’s Thinking About Program Evaluation, Sage, 1990; Martin & Kettner’s Measuring the Performance of Human Service Programs, Sage, 1996

Stages of Assessment
Stage 1: Determining Whether the Program Is Reaching the Appropriate Beneficiaries
Stage 2: Making Sure the Program Is Being Properly Delivered
Stage 3: Ensuring Funds Are Being Used Appropriately
Stage 4: Ensuring Effectiveness Can Be Estimated
Stage 5: Determining Whether the Program Works
Stage 6: Determining Program Worth

Stage One: Program Impact
Program impact research is designed to identify who is actually served by a program: how many of those being served meet the program's service criteria, and how many do not.

Stage Two: Program Integrity
Program integrity research analyzes the essentials of program delivery, such as:
personnel qualifications & skill assessment
consistency of program services with program mission
targeting & marketing of services
service coordination

Stage Three: Fiscal Accountability
Accountant Perspective:
Out-of-Pocket Expenses
Historical Costs
Depreciation
Current & Anticipated Revenues
Product Inventory
Income & Outgo of Funds

Stage Three: Fiscal Accountability
Economist Perspective & Opportunity Costs
Opportunity costs may be considered as what was given up to direct resources in a particular direction.
Opportunity costs may also be construed as the "next best use" of resources.

Stage Four: Evaluability
Criteria for Evaluability:
Clarifying Goals
Specifying Program Goals
Determining Possible Outcomes in the Absence of the Program

Stage Five: Program Effectiveness
Comparisons Across Subjects
Comparisons Across Settings
Comparisons Across Time
Comparisons Across Criteria
Pooled Comparisons

Research Designs for Estimating Effectiveness
Random Assignment: comparing the mean outcomes of control & experimental subjects
Interrupted Time Series: a pre- and post-assessment model
Cross-Sectional Designs: comparisons of different types of units (e.g., smaller & larger cities), with comparisons occurring at only one point in time
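The random-assignment design can be sketched as a difference in mean outcomes; the scores below are hypothetical, illustrative numbers only, not data from the source:

```python
import statistics

# Hypothetical outcome scores after random assignment (illustrative only).
experimental = [62, 58, 71, 65, 60, 68, 63, 66]  # subjects who received the program
control = [55, 52, 60, 58, 54, 57, 53, 59]       # subjects who did not

# With random assignment the two groups are comparable in expectation,
# so the difference in mean outcomes estimates the program effect.
effect = statistics.mean(experimental) - statistics.mean(control)
print(f"estimated program effect: {effect:.3f}")  # 64.125 - 56.0 = 8.125
```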

Research Designs for Estimating Effectiveness
Regression Time Series: assignment of subjects by variables (criterion-based). Objectives:
–To provide estimates of values of the dependent variable (outcome variable) from values of the independent variable (assignment variable)
–To obtain measures of the error involved in using the regression line as a basis of estimation (i.e., the standard error of estimate)
–To obtain a measure of the degree of association, or correlation, between the two variables
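The three objectives above can be sketched with ordinary least squares; the assignment-variable and outcome data below are hypothetical:

```python
import math

# Hypothetical data: assignment variable x and outcome y (illustrative only).
x = [1, 2, 3, 4, 5, 6, 7, 8]
y = [2.1, 2.9, 3.8, 5.2, 5.9, 7.1, 7.8, 9.0]
n = len(x)

mean_x = sum(x) / n
mean_y = sum(y) / n
sxx = sum((xi - mean_x) ** 2 for xi in x)
syy = sum((yi - mean_y) ** 2 for yi in y)
sxy = sum((xi - mean_x) * (yi - mean_y) for xi, yi in zip(x, y))

# Objective 1: estimate the outcome from the assignment variable
# via the least-squares regression line.
slope = sxy / sxx
intercept = mean_y - slope * mean_x

def predict(xi):
    return intercept + slope * xi

# Objective 2: the standard error of estimate measures the error in
# using the regression line as a basis of estimation.
residual_ss = sum((yi - predict(xi)) ** 2 for xi, yi in zip(x, y))
see = math.sqrt(residual_ss / (n - 2))

# Objective 3: the correlation coefficient measures the degree of
# association between the two variables.
r = sxy / math.sqrt(sxx * syy)
```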

Research Designs for Estimating Effectiveness
Pooled Cross-Sectional & Time Series: randomized experiments & regression designs may be compared across units (cross-sectional) & across time (time series)

Stage Six: Cost Effectiveness
Ongoing vs. New Programs: ongoing programs have historical data to work with; new programs lack such historical data from which to determine cost effectiveness.

Performance Measurement

Defining Performance Measurement
The regular collection and reporting of information about the efficiency, quality, and effectiveness of human service programs. (Urban Institute, 1980)

Perspectives of Performance Measurement Efficiency Perspective Quality Perspective Effectiveness Perspective

Systems Model Essentials
Inputs: anything used by a system to achieve its purpose
Process: the treatment or delivery process in which inputs are consumed to produce outputs
Outputs: that which is produced
Feedback: system information reintroduced into the process to improve quality, efficiency & effectiveness
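The input-process-output-feedback cycle can be sketched in a few lines; the function names and numbers below are illustrative assumptions, not from the source:

```python
# Minimal sketch of the systems model as a repeating cycle.
def run_cycle(inputs, process, feedback):
    """Consume inputs through the process to produce outputs, then apply
    feedback about the outputs to adjust inputs for the next cycle."""
    outputs = process(inputs)
    next_inputs = feedback(inputs, outputs)
    return outputs, next_inputs

def deliver(staff_hours):
    # Hypothetical delivery process: one counseling session per 2 staff hours.
    return staff_hours // 2

def adjust(staff_hours, sessions):
    # Hypothetical feedback rule: add staff hours when output misses a target.
    return staff_hours + 10 if sessions < 60 else staff_hours

outputs, next_inputs = run_cycle(100, deliver, adjust)  # 50 sessions; 110 hours next cycle
```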

Efficiency Perspective
Productivity = the ratio of outputs to inputs
Efficiency = maximizing the ratio of outputs to inputs
–Efficiency cannot reflect whether program goals are being met
–Inefficiency is how many programs are regarded by the public, often in the absence of a full understanding of the agency's goals, mission, clientele, resources, and services

Quality Perspective
Typically involves benchmarking against standards and criteria of excellence (as in TQM, or Total Quality Management)
TQM redefines productivity as the ratio of outputs that meet a specified quality standard to inputs
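The traditional and TQM productivity definitions can be contrasted with hypothetical program figures (the numbers are assumptions for illustration):

```python
# Hypothetical monthly figures for a counseling program (illustrative only).
counseling_hours = 1200        # outputs: all units of service delivered
hours_meeting_standard = 1080  # outputs that met the quality standard
staff_hours = 400              # inputs consumed

# Traditional productivity: the ratio of all outputs to inputs.
productivity = counseling_hours / staff_hours  # 3.0

# TQM productivity: only outputs meeting the quality standard count.
quality_productivity = hours_meeting_standard / staff_hours  # 2.7
```

Under the TQM definition, low-quality output no longer inflates the productivity figure.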

Effectiveness Perspective
Focuses on outcomes: the results, impacts, and accomplishments of programs
Effectiveness is the highest form of performance accountability
Focuses upon which interventions work in which settings
Effectiveness accountability is primarily concerned with ratios of outcomes to inputs

Reasons for Adopting Performance Measurement
Performance measurement has the potential to improve the management of human service programs
Performance measurement has the potential to affect the allocation of resources to human service programs
Performance measurement may be a forced choice for many, if not most, human service programs

Key Questions in Performance Measurement
Who are the clients? What are their demographic characteristics? What are their social or presenting problems?
What services are they receiving? In what amounts?
What is the level of service quality?
What results are being achieved? At what costs?

Performance Measurement as a Management Tool
Promotes client-centered approaches to service delivery
Provides a shared language for comparing human service programs on quality, efficiency, & effectiveness
Allows administrators to continuously monitor programs to identify areas for improvement
Provides direct feedback to personnel, allowing them to improve their service provision

Performance Measurement Programs

Government Performance & Results Act (1993)
Effective 1998, all federal agencies must report effectiveness data for their services & products
This requirement is passed on to agency contractors & subcontractors
Increasingly, federal block-grant programs carry this requirement as well

National Performance Review
Refers to governmental efforts at instituting program effectiveness, efficiency, and quality to implement the 1992 report on government practices entitled Reinventing Government (Osborne & Gaebler, 1992)

Total Quality Management Movement
A national movement to improve quality. Focuses upon:
–consumer satisfaction
–outputs as measured against a quality standard

Managed Care
Emanates from health care
Promotes efficiency to assist the health care industry's shift from cost-based to capitated reimbursement

Service Efforts and Accomplishments (SEA) Reporting
A standard introduced by the Governmental Accounting Standards Board (GASB)
SEA is GASB's term for performance measurement

SEA Reporting Model
Built upon an expanded systems model including:
inputs
outputs
quality outputs, &
outcomes
BUT excludes Process

SEA's Lack of Emphasis Upon Process
The absence of the Process component reflects SEA's primary emphasis upon performance & performance-cost considerations

SEA Reporting Elements
Service Efforts
Service Accomplishments
Measures or ratios relating Service Efforts to Service Accomplishments

Service Efforts
Service Efforts are the inputs utilized in a human service program, measured by GASB in terms of:
Total Program Costs
Total Full-Time-Equivalent (FTE) Staff
Total Number of Employee Hours

Service Accomplishments
Outputs:
Total Volume of Service Provided
Proportion of Total Service Volume Meeting a Quality Standard
Outcomes:
Measures of results, accomplishments, impacts

Service Accomplishment Ratios
Efficiency (output measures):
–cost per unit of service
–cost per FTE
–cost per service completion
–service completions per FTE
Effectiveness (outcome measures):
–cost per outcome
–outcomes per FTE
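These ratios can be computed directly from SEA-style figures; the program and all numbers below are hypothetical, illustrative assumptions:

```python
# Hypothetical SEA-style figures for an illustrative job-training program.
total_cost = 500_000.0     # service efforts: total program cost
fte_staff = 10             # service efforts: full-time-equivalent staff
units_of_service = 2_500   # outputs: training hours delivered
service_completions = 200  # final outputs: trainees who finished
outcomes = 120             # outcomes: e.g., trainees placed in jobs

# Efficiency ratios relate efforts to outputs.
cost_per_unit = total_cost / units_of_service           # 200.0
cost_per_completion = total_cost / service_completions  # 2500.0
completions_per_fte = service_completions / fte_staff   # 20.0

# Effectiveness ratios relate efforts to outcomes.
cost_per_outcome = total_cost / outcomes  # about 4166.67
outcomes_per_fte = outcomes / fte_staff   # 12.0
```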

Output Performance Measures
Intermediate:
–episode or contact unit of service
–material unit of service
Final:
–service completions

Outcome Performance Measures
Intermediate:
–numeric counts
–standardized measures
–level-of-functioning scales
–client satisfaction
Ultimate:
–numeric counts
–standardized measures
–level-of-functioning scales