Kathryn Race, Race & Associates, Ltd. and Susan Wolfe, Susan Wolfe and Associates, LLC. American Evaluation Association Conference, Anaheim, CA, November 2011.

Purpose of Project Contracted by a major membership association to assess the degree to which reform in two large state human service agencies aligned with a comprehensive framework, collaboratively developed to facilitate and guide this reform. External evaluators were brought into the project after this framework had been developed. The evaluators were asked, using this framework, to assess the level of reform these agencies had implemented, through a multi-method approach that included government representatives and providers of these services.

Overview of Methodology Conduct and complete the study within a 9-month period, engaging stakeholders in data collection within a 6-month period. Government representatives were asked to complete an on-line survey. Site visits at the participating human service agencies included individual interviews as well as focus groups, one with each agency. An on-line survey was administered to human service providers who contract with one or both of these agencies.

Challenges Evaluators were brought into the project midway and were not engaged in the planning stage of this work. Government representatives were not involved in the design and development of the framework. An aggressive (read: short) timeline was used to get the work done. The contracting agency was heavily involved in the design of the study and, as we shall see, heavily involved at the end, when results were being generated and findings were being interpreted.

Considering the Political Context Reflecting on her 14 years as Assistant Comptroller General for Evaluation and Methodology at the U.S. General Accounting Office, Eleanor Chelimsky (2009) wrote: “Most discussions of evaluation policy focus on the substance and process of doing evaluations. However, doing evaluation in government requires careful consideration not only of evaluation but also of the larger political structure into which it is expected to fit.”

Political Context (con't.) Chelimsky (2009) goes on to say: “success for evaluation in government depends as much on the political context within which evaluation operates as it does on the merits of the evaluation process itself.” Chelimsky, E. (2009). Integrating evaluation units into the political environment of government: The role of evaluation policy. In W. M. Trochim, M. M. Mark, & L. J. Cooksy (Eds.), Evaluation policy and evaluation practice. New Directions for Evaluation, 123.

Political Pressures Three levels of political pressures, stemming from: (1) the democratic structure and process itself; (2) the bureaucratic climate; and (3) the dominant professional culture within the agency.

Political Pressures (con't.) Democratic structure and process itself: within the overall system of checks and balances, vying for power and engaging in political partisanship can create a tug-of-war over the entire study process, drive policy development, and dominate the evaluative agenda.

Political Pressures (con't.) Bureaucratic climate: the welfare of the agency is the primary political concern, which puts agency defense first and fosters self-protective, territorial behavior.

Political Pressures (con't.) Dominant professional culture within the agency (which might be law, science, or audit): political pressures on evaluation center on the differences between the methods, standards, and values of one profession and those of another.

Political Pressures (con't.) All three levels can exert pressure concurrently. In the present evaluation, political pressure was most evident from the bureaucratic climate and the dominant professional culture within the agency.

Political Pressures (con't.) These pressures can (and in the present case did) affect the evaluation at two important stages: the design stage (the choice of a particular method, the selection of the sample) and the final stage (when results were being generated and findings were being interpreted).

At the Design Stage Given the opportunity, what could we have done differently? Ideally, we would have been involved in the project much earlier, both to have more opportunity to understand the political context and, perhaps, to use a different methodology or approach in this study.

At the Design Stage: Different Methodology First, it would have been imperative to include government representatives in the initial development of the framework. Given the chance, we might have elected to use work groups instead of the survey methodology that was used. A work-group methodology might have given us the opportunity to use a Delphi method to seek consensus around priority areas, develop strategies for bringing about reform, and identify potential change agents. We might also have been able to systematically divide the framework into workable pieces rather than treat it as a whole (the current framework does not address how change might happen).

At the Report Stage The contrast between the dominant professional culture within the agency and the evaluation culture was perhaps most evident at this stage: the agency culture favored results presented as bullet points or sound bites, in contrast to a detailed, comprehensive evaluation report, which was not compatible with the dominant culture of the agency.

At the Report Stage (con't.) We resolved these issues by preparing a summary report that was much shorter and more manageable, and by creating a summary table that condensed a large amount of information (read: a dashboard or report card).

Context in which the Evaluation is Conducted Chelimsky (2009) noted that the context never stands still; that is, ideas, policies, and political alliances are in constant flux, and this has ramifications for both the evaluation process and the policy use of findings.

How We Might Have Contained this Movement: The development and use of a Policy Theory of Change Model (à la a Program Theory Model or Theory of Change Model) would have substantially grounded the work.* Such a model might also have slowed the morphing that occurred during the project, or at least provided the structure to help identify that a change in conceptual thinking was occurring. It would also have placed the necessary strategies, and the identified agents of change needed to support this reform, front and center in these discussions (a necessary companion to a work-group methodology). *[See, for example: Donaldson, S. I. (2007). Program theory-driven evaluation science: Strategies and applications. New York: Lawrence Erlbaum. Falcone, S. (2003). Modeling differences: The application of the logic model to public policy analysis. The Innovation Journal, 8 (June–August).]

Evaluation of an Initiative to Reform Government Contracting for Human Services, Part II: Future Implications and Lessons Learned. Contact Information: Kathryn Race, Race & Associates, Ltd. Susan Wolfe, Susan Wolfe and Associates, LLC.