Investment Logic Mapping – An Evaluative Tool with Zing

Presentation transcript:

Investment Logic Mapping – An Evaluative Tool with Zing
Canberra Evaluation Forum – 16 September 2010
Anna Georgalis, Department of Innovation, Industry and Regional Development, Victoria

Today’s presentation
- About the Victorian Department of Innovation, Industry and Regional Development (DIIRD)
- Evaluation activity at DIIRD
- The Victorian Department of Treasury and Finance (DTF) Investment Management Standard and Investment Logic Maps (ILMs)
- An evaluation approach using ILMs
- Results of the approach

About DIIRD
- DIIRD is the Victorian Government’s lead agency for economic and regional development
- The Department develops and implements a diverse range of programs, initiatives and projects designed to: attract and facilitate investment; encourage exports and industry development; foster skills development; stimulate innovation; and promote Victoria nationally and internationally
- Benefits to the community: increased investment, exports and high-quality jobs

Evaluation at DIIRD
- The approach is defined by the DIIRD Evaluation Policy, signed off by the Secretary
- A small team of four full-time-equivalent staff (the Evaluation Unit)
- An annual evaluation plan is developed

Functions of the Evaluation Unit
- Conduct policy and program evaluations
- Conduct triple-bottom-line evaluations (economic, social and environmental) of major events funded by the Victorian Government
- Build evaluation capability across DIIRD
- Provide advice and support on evaluation strategies and projects

Continuous Improvement Cycle

Benefits of evaluation
- Supports continuous improvement of programs, initiatives and policies
- Provides evidence for business case development for funding proposals
- Strengthens reporting to the DIIRD Executive, Central Agencies and Parliament

Scenario in the early days…
- Evaluation reports were not ‘hitting the mark’ with stakeholders
- The scope of the program being evaluated was not clear
- Evaluation reports were not evidence-based and lacked quantitative data to support findings and recommendations
- Consultants undertaking evaluations were not given clear guidance
- The data necessary for the evaluation were not available

The evaluation approach promoted at DIIRD
- Involve stakeholders throughout the process
- Clearly articulate the outputs and desired outcomes of the policy, program or initiative that is, or will be, evaluated
- Use Investment Logic Maps (ILMs)

The evaluation approach promoted at DIIRD (continued)
Define the evidence and data required for effective monitoring and evaluation:
- Develop a performance monitoring framework early in the life cycle of a program to collect the relevant data; or
- Develop an evaluation framework that guides the scope of an evaluation and determines what data need to be collected to ensure a rigorous evaluation (this is usually provided to consultants)

The evaluation approach promoted at DIIRD (continued)
Frameworks will contain:
- Evaluation questions aligned with the elements of the ILM
- Performance measures designed to support the evaluation questions
- Data to support the performance measures
- Tools that will be used to gather the data
- Surveys and interviews aligned with the performance measures
The evaluation report is structured on the framework.

The Investment Management Standard
- Developed by the Victorian Department of Treasury and Finance (DTF), the standard has been evolving since 2004
- Lightweight, common-sense practices that have helped departments improve the way they develop and manage investments
- The centrepiece of the standard is the ILM

The Investment Management Standard

Investment Management compared with Project Management
PROJECT management asks:
- Will the project be completed within budget?
- Will it deliver to its planned schedule?
- Were the expected products delivered?
INVESTMENT management asks:
- Is the logic for the planned investment clear?
- Is there a sound business case to proceed?
- Were the expected benefits achieved?

The Investment Logic Map (ILM)
- An approach that engages the Investor and key stakeholders to shape investment decisions in a series of ‘intelligent discussions’
- Structured two-hour workshops (generally up to three, depending on the complexity and maturity of the proposal)
- Facilitated by people skilled in the approach, who test the rigour and evidence behind the investment

ILM (continued)
A one-page graphical representation of the logic of an investment proposal that contains:
- The problem being addressed
- The intervention (what Government wants to do)
- The benefits for the community
- The solution, including the necessary changes and assets
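
To make that structure concrete, here is a minimal sketch of an ILM as a data structure, assuming Python (the presentation shows the ILM as a one-page diagram, not code). The benefit text is taken from the example used later in this presentation; every other value is illustrative.

```python
from dataclasses import dataclass

@dataclass
class InvestmentLogicMap:
    """One-page logic of an investment proposal (illustrative model)."""
    problems: list[str]       # the problems being addressed
    interventions: list[str]  # what Government wants to do
    benefits: list[str]       # the benefits for the community
    changes: list[str]        # intermediate outcomes / critical success factors
    assets: list[str]         # the assets needed to deliver the solution

# Illustrative instance; only the benefit text comes from this presentation.
ilm = InvestmentLogicMap(
    problems=["Regional industries lack the scale to compete (illustrative)"],
    interventions=["Support industry clusters in regional Victoria (illustrative)"],
    benefits=["Increased investment, exports and jobs in regional Victoria"],
    changes=["Clusters established and self-sustaining (illustrative)"],
    assets=["Program funding and facilitation (illustrative)"],
)
```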

Example of ILM

Differentiation from the DTF approach
- DTF uses outputs for its ‘changes’
- We promote the use of intermediate outcomes or critical success factors that need to be achieved to deliver on the interventions

An example…

ILMs in practice
- Business cases
- Policy development
- Problem solving
- Strategic planning
- Evaluation and monitoring activities

ILMs and monitoring and evaluation
- A performance monitoring framework at the commencement of a program; or
- An evaluation framework for evaluation activity

Benefits of using ILMs
- A shared understanding of an issue through informed, intelligent discussion
- A concise and rigorous argument for solving a problem
- The most appropriate solution to an issue
- A clear and accountable description of a policy or program

The evaluation approach in detail
- An ILM is developed and signed off by the Program Manager
- Frameworks are developed by applying evaluation questions against each element of the ILM around:
  - Appropriateness/relevance
  - Effectiveness
  - Efficiency
  - Lessons learned
  - Future directions
- Performance measures are then developed to support answering each evaluation question
- Data sources and inputs are also determined
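
As a minimal sketch, assuming Python (the field names and structure are illustrative, not DIIRD’s actual template), framework rows could be derived by crossing each ILM element with the five criteria above, with performance measures and data sources filled in afterwards:

```python
# The five criteria applied against each ILM element (from the slide above).
CRITERIA = ["appropriateness/relevance", "effectiveness", "efficiency",
            "lessons learned", "future directions"]

def build_framework(ilm_elements: dict[str, list[str]]) -> list[dict]:
    """ilm_elements maps an element type (e.g. 'benefit') to its element texts."""
    rows = []
    for element_type, elements in ilm_elements.items():
        for element in elements:
            for criterion in CRITERIA:
                rows.append({
                    "element_type": element_type,
                    "ilm_element": element,
                    "criterion": criterion,
                    "evaluation_question": None,  # drafted by the evaluator
                    "performance_measures": [],   # developed to support the question
                    "data_sources": [],           # e.g. a participant survey
                })
    return rows

framework = build_framework(
    {"benefit": ["Increased investment, exports and jobs in regional Victoria"]}
)
print(len(framework))  # 5 rows for this benefit: one per criterion
```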

The approach in practice
I will now provide an example of how the Investment Logic Map, the Evaluation Framework and the collection of evidence (in this case, a survey) all link together:
Investment Logic Map → Evaluation Framework → Evidence (survey)

The approach in practice (continued)
Taking Benefit 2 from the Investment Logic Map: “Increased investment, exports and jobs in regional Victoria”.
- The extract of our Evaluation Framework for this benefit shows our key evaluation question: “To what extent has the RICP increased investment, exports and jobs in regional Victoria?” This is an effectiveness question, as denoted by the “ES”.
- Focusing on the “investment” aspect, one of our performance measures for answering this part of the question is “Number and value of new investments generated”. The “B2” in brackets denotes that this performance measure is linked to Benefit 2; there are other performance measures for this benefit and key evaluation question, but we will focus on this one here.
- Our data source, or mechanism for collecting evidence, for this performance measure is the RICP participant survey.
- A number of questions are asked up front in the survey so we can identify which cluster the response relates to, and whether the respondent is from business or industry, an educational institution, government, etc.
- We surveyed cluster participants on their level of agreement as to whether the program stimulated jobs, investments and exports (these responses inform another key evaluation question) and asked them to quantify actual jobs, investments and exports generated where appropriate.
- Question 5e asks the respondent to quantify the number and value of new investments generated for their organisation through the cluster; this links back to the Evaluation Framework.
By undertaking this process for each Investment Logic Map element, the evidence (in this case, the number and value of new investments) flows from the survey, or other evidence collection mechanism, to inform the performance measure, to answer the key evaluation question, which in turn is linked to the Investment Logic Map element.
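
The traceability chain in that walkthrough can be sketched in code terms, assuming Python. The “B2”, “ES” and “5e” identifiers and the question wording come from the slide; the data layout itself is illustrative:

```python
# Chain: survey question -> performance measure -> evaluation question -> ILM benefit.
benefit = {"id": "B2",
           "text": "Increased investment, exports and jobs in regional Victoria"}

evaluation_question = {
    "code": "ES",  # an effectiveness question
    "text": ("To what extent has the RICP increased investment, exports "
             "and jobs in regional Victoria?"),
    "benefit_id": "B2",
}

performance_measure = {
    "text": "Number and value of new investments generated (B2)",
    "benefit_id": "B2",
    "data_source": "RICP participant survey",
}

survey_question = {
    "id": "5e",
    "text": ("Quantify the number and value of new investments generated "
             "for your organisation through the cluster"),
    "measures": ["Number and value of new investments generated (B2)"],
}

# Evidence collected against question 5e rolls up the chain to Benefit 2.
assert performance_measure["text"] in survey_question["measures"]
assert (performance_measure["benefit_id"]
        == evaluation_question["benefit_id"]
        == benefit["id"])
```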

A framework is the key quality assurance step in the approach
- Survey and interview questions are aligned with performance measures and evaluation questions
- It ensures all the necessary data are collected for the evaluation
- Frameworks guide the content of the evaluation report
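
One way to read this quality assurance step, as a sketch assuming Python (field names are illustrative): a completeness check that flags any evaluation question left without a performance measure or a data source before fieldwork begins.

```python
def check_framework(rows: list[dict]) -> list[str]:
    """Each row: {'evaluation_question', 'performance_measures', 'data_sources'}."""
    gaps = []
    for row in rows:
        if not row["performance_measures"]:
            gaps.append(f"No performance measure for: {row['evaluation_question']}")
        if not row["data_sources"]:
            gaps.append(f"No data source for: {row['evaluation_question']}")
    return gaps

rows = [{
    "evaluation_question": ("To what extent has the RICP increased investment, "
                            "exports and jobs in regional Victoria?"),
    "performance_measures": ["Number and value of new investments generated (B2)"],
    "data_sources": [],  # deliberately missing, so the check flags it
}]
print(check_framework(rows))
```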

Advantages of the approach
- Stakeholders are engaged and sign off at key points
- The scope of the program and the evaluation is clearly articulated
- Appropriate evidence and data are collected
- Clear guidance is available to consultants, especially through the ‘request for quote’ process

Results from the approach
- Evaluation reports are ‘hitting the mark’ with stakeholders
- Greater engagement of stakeholders
- Better-defined scope
- More data available and evidence-based reports
- Consultants delivering a higher standard of reports
- Better and more robust recommendations to support continuous improvement

Summary
- Prepare an ILM or equivalent
- Develop a Performance Monitoring Framework or Evaluation Framework
- Follow it through!

Thank you
Questions?