TIA Consulting, Inc. Starting an Evaluation Program: Challenges and Lessons Learned Technology Program Evaluation: Methodologies from the Advanced Technology Program Collaborative Expedition Workshop #71 NSF March 18, 2008 Rosalie T. Ruegg Managing Director, TIA Consulting, Inc.

TIA Consulting, Inc. 2 Outline
- Starting an evaluation program now
- Starting ATP's evaluation program 18 years ago
- Challenges and lessons learned

TIA Consulting, Inc. 3 Starting an evaluation program now
Several points:
- Public programs are starting evaluation programs from the ground up today
- Two examples from abroad:
- Austrian genomic grant program—design of an evaluation plan
- Chilean National Innovation System for Competitiveness—design of a multi-level evaluation system

TIA Consulting, Inc. 4 Steps in designing an evaluation program
1. Identify the purposes to be served by evaluation, e.g.:
To guide program management & strategy
- how is the program performing?
- are there performance problems?
- are there problems with operational efficiency?
- are adjustments needed?
For accountability and public policy
- is the program doing what it was intended to do?
- is it worth continued support? at the same level? at a reduced level?
- to provide information for stakeholders, including public opinion
- as input to broader public policy

TIA Consulting, Inc. 5 Steps in designing an evaluation program
2. Develop an understanding and model of the program/system to be evaluated, e.g.:
- Review program documents—legislative, operational guidance, descriptive, analytical
- Interview stakeholders—legislators, program managers, clients, others
- Conduct studies of underlying program structure
- Construct descriptive program model(s) and test them with stakeholders

TIA Consulting, Inc. 6 Steps in designing an evaluation program
3. Develop a Program Logic Model
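A program logic model links what goes into a program to what it does and what results. As an illustration only (the stage names follow common logic-model practice; the example entries are hypothetical, not ATP's actual model), the chain can be sketched as:

```python
from dataclasses import dataclass, field

@dataclass
class LogicModel:
    """Generic program logic model: resources flow from inputs
    through activities and outputs to outcomes and impacts."""
    inputs: list = field(default_factory=list)
    activities: list = field(default_factory=list)
    outputs: list = field(default_factory=list)
    outcomes: list = field(default_factory=list)
    impacts: list = field(default_factory=list)

    def render(self) -> str:
        """Render the model as a text chain for review with stakeholders."""
        stages = ("inputs", "activities", "outputs", "outcomes", "impacts")
        return " -> ".join(
            f"{stage}: {'; '.join(getattr(self, stage))}" for stage in stages
        )

# Hypothetical entries for illustration (not ATP's actual model):
model = LogicModel(
    inputs=["grant funding", "program staff"],
    activities=["fund high-risk R&D projects", "monitor progress"],
    outputs=["prototypes", "publications", "patents"],
    outcomes=["commercialized technologies"],
    impacts=["broad economic benefits"],
)
print(model.render())
```

Writing the model down in this explicit form makes it easy to check with stakeholders that every claimed outcome traces back to a funded activity.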

TIA Consulting, Inc. 7 Steps in designing an evaluation program
4. Formulate in detail the questions that need to be answered, e.g.:
- for different evaluation purposes, ranging from efficiency questions to ultimate success questions
- for internal management and for accountability
- for other diverse audiences, at different time periods, subject to different levels of scrutiny

TIA Consulting, Inc. 8 Steps in designing an evaluation program
5. Develop a plan for answering the questions, e.g., what, how, who, and when:
- What: what kinds of questions (e.g., descriptive, normative, impact)
- How: what evaluation methods are needed and what information is needed to support them
- Who: who is best placed to implement—internal versus external; identification of experts
- When: when will each question need to be addressed and when will planning need to start?

TIA Consulting, Inc. 9 Steps in designing an evaluation program
6. Develop a data collection plan, e.g.:
- What data will be collected?
- Who will collect it?
- How will it be collected?
- How will it be made available to evaluators?
- How will issues of confidentiality (where they exist) be handled?
- Etc.
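The answers to these questions can be captured in a simple tabular plan, one row per data stream. A minimal sketch (the entries and field names are illustrative assumptions, not ATP's actual plan):

```python
# Each row answers the what/who/how/availability/confidentiality questions
# for one data stream. Entries are hypothetical examples.
data_collection_plan = [
    {
        "data": "project milestones and expenditures",
        "collected_by": "program office (internal)",
        "method": "quarterly progress reports",
        "availability": "to evaluators via the program database",
        "confidentiality": "proprietary technical details redacted",
    },
    {
        "data": "commercialization outcomes",
        "collected_by": "external survey contractor",
        "method": "annual participant survey",
        "availability": "anonymized microdata for evaluators",
        "confidentiality": "responses reported only in aggregate",
    },
]

# Print a one-line summary per data stream for review.
for row in data_collection_plan:
    print(f"{row['data']}: {row['method']} ({row['collected_by']})")
```

Keeping the plan in structured form makes gaps visible: any evaluation question from step 5 that no row supports signals missing data collection.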

TIA Consulting, Inc. 10 Steps in designing an evaluation program
7. Identify methods of evaluation (and practitioners) to address different kinds of questions
Examples of commonly used methods:
- Peer review/expert judgment
- Monitoring, data compilation, use of indicators
- Bibliometrics
- Network analysis
- Historical tracing
- Case study method
- Survey method
- Benchmarking method
- Benefit-cost analysis
- Econometric methods
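Of the methods listed, benefit-cost analysis rests on a simple core calculation: discount benefit and cost streams to present value, then compare them. A minimal sketch (the cash flows and the 7% discount rate are hypothetical, chosen only to illustrate the discounting mechanics):

```python
def npv(cash_flows, rate):
    """Net present value of a stream of annual cash flows,
    discounted at `rate`; year 0 is undiscounted."""
    return sum(cf / (1 + rate) ** t for t, cf in enumerate(cash_flows))

def benefit_cost_ratio(benefits, costs, rate):
    """Ratio of discounted benefits to discounted costs;
    a ratio above 1 indicates benefits exceed costs."""
    return npv(benefits, rate) / npv(costs, rate)

# Hypothetical 5-year program: all costs up front, benefits ramping up.
costs = [10.0, 0.0, 0.0, 0.0, 0.0]   # $M per year
benefits = [0.0, 2.0, 4.0, 6.0, 8.0]  # $M per year
rate = 0.07                           # illustrative real discount rate

print("net benefits ($M):", round(npv(benefits, rate) - npv(costs, rate), 2))
print("benefit-cost ratio:", round(benefit_cost_ratio(benefits, costs, rate), 2))
```

Real retrospective studies add the harder parts—attributing benefits to the program rather than to other factors, and valuing spillovers—but the discounting arithmetic underneath is this simple.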

TIA Consulting, Inc. 11 Why multiple evaluation methods?
- to answer different stakeholders' questions
- to provide alternative perspectives
- to provide multiple lines of evidence
- to meet both program management & accountability requirements

TIA Consulting, Inc. 12 Implementation
- Target audiences "on board"
- Internal evaluation component in place
- Program monitoring and data collection approach in place
- External evaluation component in place
- Studies commissioned
- Coordination
- Communication of methods and results
- Feedback loops and adjustments

TIA Consulting, Inc. 13 Development of ATP's Evaluation Program
Conforms to the steps just presented, BUT:
- more evolutionary
- finding our way as budget, staff, and insight allowed
Why evolutionary?
- 18 years ago
- predated GPRA and PART
- no good comprehensive models—agency evaluation was largely piecemeal at the time

TIA Consulting, Inc. 14 Development of ATP's Evaluation Program
Why did ATP's Evaluation Program emerge as a leader in the absence of GPRA and PART?
- Strong support by ATP's management for internal use
- Existing NIST experience with evaluation
- Legislative requirement for an ATP interim evaluation (4 years out)
- An administrative budget that allowed funding for evaluation (albeit quite small initially)
- The experimental nature of ATP, which meant no status quo and created a climate open to evaluation
- Perceived need for ever more robust evaluation in the face of growing political opposition to ATP
- Ability to attract leading evaluators to advise and participate in ATP's evaluation program (interest in public-private partnership programs)

TIA Consulting, Inc. 15 Some lessons learned
- Evaluation is necessary but not sufficient for program survival.
- Evaluation driven by internal management tends to be more comprehensive than that driven by accountability requirements.
- There are legitimate roles for both internal and external components of program evaluation.
- Evaluation is a field that has matured considerably over the past 18 years, but there is still much room for advancement.
- Other…

TIA Consulting, Inc. 16 Some challenges faced by evaluation
- Gaining a better understanding of evaluation by stakeholders—its uses and limitations
- Extending evaluation methods to provide more useful and reliable measures—e.g., for program portfolios
- Better bridging between the needs and uses of evaluation for program management and for accountability, including informing public science policy
- Other…

TIA Consulting, Inc. 17 Questions/Comments/Discussion