The Evolution of Evaluation Practices in the Canadian Federal Government, 1980–2005. Presented at the CES / AEA Conference, Toronto, Ontario, Canada, October 2005.


The Evolution of Evaluation Practices in the Canadian Federal Government
Presented at the CES / AEA Conference, Toronto, Ontario, Canada, October 29, 2005
Presented by: George Teather

PRESENTATION OUTLINE
- General overview of the evolution of Canadian federal government evaluation practices
- Description of the use of evaluation in program management
- Review of challenges
- International comparisons

Review of Evaluation Practices
- Long-standing practice and experience with evaluation: the formal requirement for evaluation began by 1980
- Evaluation groups are located in Departments, with central agency oversight, guidelines and some workshops
- Private sector consultants, with evaluation and performance measurement expertise, conduct many of the studies under the oversight of Departmental staff
- Evaluation has incorporated key characteristics:
  - Use of a logic model to describe the program
  - Generic issue groups:
    - Relevance
    - Objectives achievement
    - Alternatives (design and delivery)
  - Multiple lines of evidence to increase validity and credibility
- In the mid-1990s, periodic, in-depth evaluation and ongoing performance measurement became more closely linked

Integration of Evaluation and Performance Measurement with Management Practices
- The Results-based Management and Accountability Framework (RMAF), introduced in 2001, provides an integrated approach to annual performance measurement and periodic evaluation:
  - Logic model, description of the program
  - Performance measurement strategy, indicators, sources of information
  - Evaluation issues, methodological approach
  - Accountability, reporting requirements and schedule
- Departmental management practices incorporate RMAF concepts and information:
  - Key results commitments describe objectives
  - The annual Departmental Report on Plans and Priorities identifies annual plans and intended results
  - The annual Departmental Performance Report provides information on annual outcomes and achievements
  - More recently, the Program Activity Architecture has been introduced, which identifies key results linked to Departmental objectives and indicators to demonstrate the level of achievement and performance
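To make the RMAF components above concrete, here is a minimal sketch, in Python, of how a department might record the elements of a framework alongside its reporting schedule. The field and program names are hypothetical illustrations, not Treasury Board terminology.

```python
from dataclasses import dataclass, field
from typing import List

# Illustrative only: names are hypothetical, not drawn from Treasury Board guidance.

@dataclass
class Indicator:
    name: str          # what is measured (e.g., "clients served")
    data_source: str   # where the data come from (e.g., "administrative records")
    frequency: str     # how often it is collected (e.g., "annual")

@dataclass
class RMAF:
    program: str                    # program covered by the framework
    logic_model: List[str]          # activities -> outputs -> outcomes chain
    indicators: List[Indicator]     # performance measurement strategy
    evaluation_issues: List[str]    # relevance, objectives achievement, alternatives
    reporting_schedule: List[str] = field(default_factory=list)  # accountability reports

rmaf = RMAF(
    program="Example contribution program",
    logic_model=["activities", "outputs", "direct outcomes", "ultimate impacts"],
    indicators=[Indicator("clients served", "administrative records", "annual")],
    evaluation_issues=["relevance", "objectives achievement", "alternatives"],
    reporting_schedule=["Report on Plans and Priorities", "Departmental Performance Report"],
)
print(rmaf.program, "-", len(rmaf.indicators), "indicator(s) defined")
```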

How is Evaluation Used? What is Its Impact on Policy Development and Implementation?
- Evaluation is embedded in management practices:
  - An RMAF is required for all new and renewing contribution programs and recommended for others
  - Annual performance measurement and reporting is required
  - Evaluation studies are usually required before renewal of funding
- However, evaluation is only one of many inputs to decision making:
  - Political imperatives
  - Policy groups, stakeholder influence
- The location of evaluation in Departments leads to:
  - Use to describe program benefits and defend programs from central agencies
  - Use to adjust program design and delivery in some cases; evaluations almost never call for cancellation of a program
- Because existing programs are being evaluated, evaluation has a greater role in policy implementation than in policy development
- The logic model has been useful in program design and development as well as in evaluation and performance measurement

What are the Most Compelling Evaluation Challenges?
- The main challenges are closely linked:
  - Engaging senior management's attention and interest in understanding and using evaluation results
  - Educating policy analysts and central agencies as to what can be expected from evaluation and performance measurement
  - Producing relevant, credible evaluation studies
- There is a long-standing issue of linking policy development and implementation with scientific evidence:
  - The Council of Science and Technology Advisors produced a report on Scientific Advice for Government Effectiveness (SAGE) that identified ways to improve linkages and the consideration of evidence
  - Horizontal, multi-department evaluations are difficult to undertake because of where evaluation expertise is located
  - Most S&T policy emphasis is linked to economic policy, with S&T as a driver of economic growth; less consideration is given to the role of S&T in social policy and the public good
- Government has cut back on training and on building evaluation capability in departments specific to the needs of government, with a negative effect on expertise and the quality of work:
  - Some evaluation studies identify results and benefits that have limited attribution to the program and do not identify other contributors
  - There is no regular independent / central agency review of the quality and credibility of evaluation reports

Tracking National Performance – What are the Correct Indicators for International Comparisons?
- Comparisons require international agreement on indicators and access to credible, consistent data
- Standard comparisons include (a small illustration follows this slide):
  - Resources:
    - GERD (funding by sector; contribution of government, industry)
    - Highly qualified personnel (engineers, scientists, PhDs)
  - Outputs:
    - Scientific publications (English-language journals)
    - Patents
- Example for biotechnology:
  - Statistics Canada has developed an international standard for the measurement of biotechnology activities that is being adopted by OECD countries, including the U.S.
- More work needs to be done on identifying indicators and developing data sources:
  - Need the support of Statistics Canada and other agencies (RSA)
  - Comparison of national innovation systems and public / private sector interactions
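The most common resource indicator of this kind is GERD intensity, gross domestic expenditure on R&D as a share of GDP. A minimal sketch, using invented figures purely for illustration, of how such an indicator supports cross-country comparison:

```python
# Illustrative only: the figures below are invented, not actual statistics.
# GERD intensity = gross domestic expenditure on R&D (GERD) / GDP,
# a standard indicator used for international comparison.

countries = {
    # country: (GERD, GDP) in the same currency units
    "Country A": (25.0, 1_300.0),
    "Country B": (60.0, 2_000.0),
    "Country C": (12.0, 800.0),
}

intensity = {name: gerd / gdp for name, (gerd, gdp) in countries.items()}

for name, share in sorted(intensity.items(), key=lambda kv: kv[1], reverse=True):
    print(f"{name}: GERD intensity = {share:.2%}")
```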

Most Promising Evaluation Methods
- There has been a general realization of the need to extend the logic model concept to explicitly identify:
  - The groups or individuals (participants, clients, beneficiaries) whose behaviour the program intends to influence in order to achieve its objectives
  - The partners, collaborators and stakeholders whose participation and support are required for the program to fully succeed
- In a given program, there may be different groups at different stages of the program, or groups participating for different purposes and objectives:
  - Research – university and government scientists
  - Development – engineers, innovative firms
  - Commercialization – venture capital, banks, companies
- Examining actual compared to intended clients of a program is an important issue:
  - Profile of clients and participants
- Identifying key partners and their level of involvement is also important in examining the "pathway to success" (a small sketch follows this slide):
  - Network analysis
  - Analysis of roles and relationships
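As one illustration of the network analysis mentioned above, a minimal sketch (using the networkx library, with invented partner names and links) of mapping a program's partners and ranking them by how connected they are:

```python
# Illustrative only: partner names and relationships are invented for the example.
import networkx as nx

g = nx.Graph()
# Edges represent working relationships observed in a hypothetical evaluation study.
g.add_edges_from([
    ("Program office", "University lab"),
    ("Program office", "Innovative firm"),
    ("University lab", "Innovative firm"),
    ("Innovative firm", "Venture capital fund"),
    ("Program office", "Co-delivery agency"),
])

# Degree centrality: the share of other actors each partner is directly connected to,
# a simple way to flag the most involved partners on the "pathway to success".
for actor, score in sorted(nx.degree_centrality(g).items(), key=lambda kv: -kv[1]):
    print(f"{actor}: {score:.2f}")
```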

Generic Program Logic Model*
Program Objective: high-level strategic purpose
- Resources (HOW?)
  - Activities: Program / Service Delivery; Client Management; Policy & Issue Management; Financial Management; Human Resources Management; Asset Management
  - Outputs: Program deliverables; Policy guidelines, regulations; Communications (plans, internal communications, promotion, info transfer, consultations, meetings/events); Funding; Service outputs
- Reach (WHO / WHERE?) – users / clients / co-deliverers / beneficiaries
  - Primary targets (clients, ultimate beneficiaries); Co-delivery agents; Other stakeholders
- Results (WHAT do we want? WHY?)
  - Direct outcomes: Client service (addresses needs, meets / exceeds expectations, service quality); Behavioural influence (awareness, understanding, attitude / perception, support); New knowledge
  - Intermediate outcomes: Improved capability; Improved decision making; Target group changes in behaviour / other outcomes
  - Ultimate impacts: Sector / Industry / Regional impact; Economic / Environmental / Societal impact; Contribution to organizational objective
* reference

Spheres of Influence*
- Operational: your operational environment; you have direct control over the behaviours within this sphere
- Behavioural change: your environment of direct influence, e.g. people and groups in direct contact with your programs and staff (i.e. clients, target audience, co-delivery partners)
- State: your environment of indirect influence, e.g. industrial sectors, government decision makers, other communities of interest where you do not make direct contact
(Diagram axis: Time)
* reference S. Montague

References
- Treasury Board RMAF and other evaluation policies and guidelines
- Council of Science and Technology Advisors SAGE and other reports
- Evaluation and performance measurement courses and consulting services