Evaluation and Performance Measurement Use in Government: Use This Use
David J. Bernstein, Ph.D., Senior Study Director, Westat
2012 American Evaluation Association Conference


Evaluation and Performance Measurement Use in Government: Use This Use
David J. Bernstein, Ph.D., Senior Study Director, Westat
2012 American Evaluation Association Conference
Evaluation Use and Government Evaluation Topical Interest Groups
Panel Session 757, 211 C/D
October 27, 2012, 8:00 to 9:30 a.m.

Outline of the Presentation
1. Compare and contrast Patton's evaluation-utilization concepts with Epstein's performance measurement (PM) use model.
2. Apply Patton's evaluation-utilization premises in a government performance measurement framework, AKA "utilization-focused performance measurement."
3. Provide examples of the divergent and diverse evaluation and PM needs of sponsors (funders) and evaluation/program stakeholders.

Patton and Epstein: Use These Uses

Patton's three primary uses of evaluation:
1. Judging Merit and Worth
2. Improving Programs
3. Generating Knowledge

Epstein's three uses of performance measures (1):
1. Improve Public Accountability/Communication
2. Improve Service Performance
3. Improve Decision Making

Patton's four uses of evaluation processes:
1. Enhancing shared understandings
2. Reinforcing interventions
3. Supporting participant engagement
4. Developing programs and organizations

Four uses of performance measures (2):
1. Improve Public Accountability/Communication
2. Improve Service Performance, Decision Making
3. All 3 of Epstein's PM uses
4. All 3 of Epstein's PM uses

Sources: Patton 1997, Fundamental Premises of Utilization-Focused Evaluation #6; (1) Epstein 1988; (2) Bernstein extrapolation of Epstein.

Patton's 3 Primary Uses of Evaluation Findings and Complementary PM Uses

1. Judge merit or worth
Patton's examples: Summative evaluation; Accountability; Audits; Accreditation/licensing; Quality control; Cost-benefit decisions; Decide a program's future
Complementary PM uses: Outcome monitoring; Reporting results; Performance and other audits; PM audits; Customer service measures; Performance-based budgets; Cost-benefit analysis; Performance benchmarking

2. Improve programs
Patton's examples: Formative evaluations; I.D. strengths/weaknesses; Continuous improvement; Quality enhancement; Learning organization
Complementary PM uses: Program/process monitoring; I.D. strengths/weaknesses; Continuous improvement; Baldrige Performance Excellence

3. Generate knowledge
Patton's examples: Generalizations about effectiveness; Theory building; Synthesize patterns; Policy making
Complementary PM uses: Don't do it! Evaluation uses PMs, but PM is not evaluation!; Theories of change/logic modeling; Trend analysis; Policy monitoring

Sources: Evaluation uses and Patton's examples from Patton 1997, p. 76; complementary PM uses from Bernstein 2000.

Intended/Unintended Users and Uses

Unintended users, unintended uses: Unintended Trouble
Unintended users, intended uses: Unintended Success
Intended users, unintended uses: Unintended Need for More Technical Assistance
Intended users, intended uses: *MQP*

Government Examples: Premises 1 and 3

P1. Commitment to intended use by intended users should be the driving force in an evaluation. At every decision point…the evaluator asks intended users, 'How would that affect your use of this evaluation?'

P3. The personal factor significantly contributes to use…evaluations should be specifically user oriented.

Examples:
Montgomery County Finance Strategic Planning and Performance Measurement Development.
Evaluation of the Helen Keller National Center (Rehabilitation Services Administration, U.S. Department of Education).

Government Examples: Premises 4-7

P4. Careful and thoughtful stakeholder analysis should inform identification of primary intended users…[Multiple stakeholders] have an interest in evaluation, but the degree and nature of their interests will vary. Political sensitivity and ethical judgments are involved in identifying primary intended users and uses [emphasis added].

P5. Evaluations must be focused in some way; focusing on intended use by intended users is the most useful way…Because no evaluation can serve all potential stakeholders' interests equally well, stakeholders representing various constituencies should come together to negotiate what issues and questions deserve priority.

P6. Focusing on intended use requires making deliberate and thoughtful choices.

P7. Useful evaluations must be designed and adapted situationally.

Example: Montgomery County Central [Arrest] Processing Pilot Program.

Government Examples: Premise 13

P13. Use is different from reporting and dissemination. Reporting and dissemination may be means to facilitate use, but they should not be confused with such intended uses as making decisions, improving programs, changing thinking, empowering participants, and generating knowledge (see Premise 6).

Example: Governmental Accounting Standards Board (GASB) research on Concepts Statement No. 2, Service Efforts and Accomplishments. The research led to changes published in GASB Concepts Statement No. 5, including removal of a section of Concepts Statement No. 2 titled "Developing Reporting Standards for SEA Information."

Source: News release, 12/15/08: GASB Issues Concepts Statement No. 5, Service Efforts and Accomplishments Reporting (an amendment of GASB Concepts Statement No. 2).

Government Examples: Premise 11

P11. Evaluators have a rightful stake in an evaluation in that their credibility and integrity are always at risk, thus the mandate for evaluators to be active-reactive-adaptive…Evaluators are active in presenting to intended users their own best judgments about appropriate evaluation focus and methods…Evaluators' credibility and integrity are factors affecting use as well as the foundation of the profession. In this regard, evaluators should be guided by the profession's standards and principles.

AEA Guiding Principles for Evaluators:
A. Systematic Inquiry.
B. Competence.
C. Integrity/Honesty.
D. Respect for People.
E. Responsibilities for General and Public Welfare.

Examples: Most evaluations on which I've worked since July.

Source: American Evaluation Association

References

American Evaluation Association. Guiding Principles for Evaluators.

Bernstein, D.J. (2000). Local government performance measurement use: Assessing system quality and effects. George Washington University, Washington, DC. Available from UMI-ProQuest.

Bromberg, D. (2011). Use Me. Westchester, PA: Appleseed Recordings.

Epstein, P. (1988). Using Performance Measurement in Government: A Guide to Improving Decisions, Performance, and Accountability. New York, NY: National Civic League Press.

Patton, M.Q. (1997). Utilization-Focused Evaluation: The New Century Text (3rd ed.). Thousand Oaks, CA: Sage Publications. (Note: a 4th edition, 2008, is available.)

Contact Information

For a copy of this PowerPoint or other information, contact:
David J. Bernstein, Ph.D.
Senior Study Director, Westat
1600 Research Blvd, RA1292
Rockville, MD