
1 AfrEA/NONIE/3ie Conference Perspectives on Impact Evaluation March-April 2009 Use of Impact Evaluation for Organizational Learning and Policy Influence: The Case of International Agricultural Research

2 Overview/Introduction
Use and non-use of impact evaluation: the CGIAR case – Douglas Horton & Ronald Mackay, independent evaluation consultants
Towards a broader range of impact evaluation methods for collaborative research: report on a work in progress – Patricia Rogers, Royal Melbourne Institute of Technology, & Jamie Watts, CGIAR Institutional Learning and Change Initiative
Role of Impact Evaluation in Moving from Research into Use – Sheelagh O’Reilly, Team Leader, Impact Evaluation, Research into Use Programme

3 Programme
Combined presentation
Reaction from Robert Chambers, discussant
Q&A and discussion

4 Use and Non-Use of Impact Evaluation: the CGIAR Case Douglas Horton & Ronald Mackay

5 Overview
CGIAR has a long history of producing high-quality impact evaluations
However, there has been limited use of findings:
–To influence donor / investor decisions & resource allocations
–To promote learning & program improvement
Use may be enhanced somewhat through better planning and communication, but there remain some inherent problems with all disciplinary-oriented evaluation approaches
Other ways of evaluating and fostering learning are needed for social / institutional learning and for policy and program improvement

6 History of IE in the CGIAR
High estimated returns to investment in ag. research were key to establishing the CGIAR
Hundreds of economic impact assessments report high rates of return
CGIAR economists have contributed significantly to improving IA theory & methods
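The rates of return cited in these assessments are typically internal rates of return (IRR) on the research investment. As an illustrative sketch of the underlying arithmetic (not part of the original slides), the IRR is the discount rate r* at which the net present value of the programme's estimated benefit and cost streams is zero:

\[
\mathrm{NPV}(r) = \sum_{t=0}^{T} \frac{B_t - C_t}{(1+r)^t}, \qquad \mathrm{NPV}(r^*) = 0
\]

where B_t and C_t are the benefits and costs attributed to the research in year t. An assessed rate of return of, say, 40% means the estimated benefit stream would still cover costs even if future benefits were discounted at 40% per year.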

7 From the Studies … “CGI [crop genetic improvement] programmes have been outstanding investments. Few investments can come close to achieving the poverty reduction per dollar expended that the CGI programmes evaluated in this volume have realized… Any reduction in support to agricultural projects, in particular to projects designed to improve productivity, will seriously limit and hamper efforts to reduce mass poverty.” (Evenson & Rosegrant, 2003: 496)

8 The Emerging Paradox
“Concern is growing within the donor community relating to the effectiveness of existing impact assessment research in guiding international agricultural research... donor support for agricultural research is declining, despite the credible assessments showing that investment in this area indeed has had high return.” (Gregersen & Morris, 2003: vii)
“There is little apparent relationship between impact assessment findings and the subsequent allocation patterns of donors… those areas of research with the highest levels of assessed benefits often suffer from declining funding, while unproven areas of research and non-research investment receive rising funding shares” (Raitzer & Winkel, 2005: ix)

9 Funding to International Agricultural Research (Source: ASTI Initiative)

10 What is Going On Here?
Good (impact evaluation) research does not necessarily lead to policy / programme support.
Many factors may affect policy & management decisions more than (evaluation) information.
For any kind of evaluation to have an impact, use needs to be cultivated from the beginning.
One type of IE may not meet all needs.

11 Some factors influencing use 1. Engagement of intended users 2. The 4 “I’s” 3. Types and levels of use 4. Attention to use

12 Engagement of Potential/Intended Users
Donors & development agencies
Policymakers
Center / program managers
Researchers
Peers
Constituents / intended beneficiaries

13 Why engage users? Engagement → use of findings and “process use” → influence on decision making

14 Four “Is”: Interests, Ideologies, Institutions, Information (Weiss, 1998)

15 Types and Levels of Use
Types of use: direct/instrumental, indirect/conceptual, symbolic
Decision levels: strategic, structural, operational

16 Attention to Communication
Multiple forms of communication
Match format to audience
Long-term involvement
Integrate evaluation into program
Guard against standardization
Involve stakeholders
Create context for dialogue

17 Suggestions
View and manage IE as “evaluation,” not as “research.”
1. Plan and manage evaluations to foster specific uses.
2. Target specific policies and program-related issues.
3. Explain how programmes or projects attain results in their context.
4. Use mixed methods from various disciplines as needed to respond to evaluation questions.
5. Judge evaluations on usefulness, practicality, respect for propriety, and accuracy of data and results.

18 Towards a broader range of impact evaluation methods… Why?
Agricultural research has expanded into a broader range of areas
–From crop improvement to higher-level development goals
The role of the researcher in the agricultural innovation system is changing
–From centers of excellence to collaborative, capacity-building approaches
–From transfer of technology to demand-driven, locally relevant solutions
Traditional evaluation designs may not always be feasible or appropriate

19 Increasingly diverse portfolio
Well represented in the IA portfolio: impact assessment of genetic improvement of major crops
Somewhat represented: biological control of pests
Under-represented:
–crop and integrated pest management
–livestock
–natural resources management
–post-harvest technologies
–policy and gender research

20 Increasingly collaborative research Source: Douthwaite 2004.

21 Increasing demand to engage intended end-users:
Increase researchers’ understanding of local issues to improve the relevance of research to local conditions
Increase uptake and appropriate adaptation
Incorporate local knowledge into research
Co-production of knowledge by researchers and community members
Develop end-users’ capacity to build and use knowledge for adaptive management

22 Spectrum of participation
Conventional research: scientists make the decisions alone, without organized participation by end-users
Contractual: scientists contract with end-users to participate
Consultative: scientists make decisions, but with organized communication with end-users
Collaborative: decision-making authority is shared between end-users and scientists; neither party can revoke or override a joint decision
Collegial: end-users make decisions collectively, either in a group process or through individual end-users who are in organized communication with scientists
End-user experimentation: end-users make the decisions without organized communication with scientists
(adapted from Lilja and Ashby)

23 Nature June 2008 Special issue on translational research

24 Conceptualising translational research [Nobel laureate Sydney] Brenner is one of many scientists challenging the idea that translational research is just about carrying results from bench to bedside, arguing that the importance of reversing that polarity has been overlooked. “I’m advocating it go the other way,” Brenner said.

25 Simple, Complicated, Complex (Diagram from Zimmerman 2003)
Simple: Following a Recipe
–The recipe is essential
–Recipes are tested to assure replicability of later efforts
–No particular expertise; knowing how to cook increases success
–Recipes produce standard products
–Certainty of same results every time
Complicated: A Rocket to the Moon
–Formulae are critical and necessary
–Sending one rocket increases assurance that the next will be ok
–High level of expertise in many specialized fields + coordination
–Rockets similar in critical ways
–High degree of certainty of outcome
Complex: Raising a Child
–Formulae have only a limited application
–Raising one child gives no assurance of success with the next
–Expertise can help but is not sufficient; relationships are key
–Every child is unique
–Uncertainty of outcome remains

26 Simple, Complicated, Complex
Deciding impacts – Simple: likely to be agreed; Complicated: likely to differ, reflecting different agendas; Complex: may be emergent
Describing impacts – Simple: more likely to have standardised measures developed; Complicated: evidence needed about multiple components; Complex: harder to plan for given emergence
Analysing cause – Simple: likely to be a clear counterfactual; Complicated: causal packages and non-linearity; Complex: unique, highly contingent causality
Reporting – Simple: clear messages; Complicated: complicated message; Complex: uptake requires further adaptation

27 The need for a broader range of methods
Complement existing methods for Impact Evaluation (raising issues of multidisciplinarity and mixed methods)
Identify, describe, measure and value impacts
Assess causal inference in collaborative and/or participatory projects
Support the use of impact evaluation for learning and adaptive management

28 Entry Points for learning and change
Knowledge, skills & attitudes
–People need to want to learn and know how to engage partners in co-creation of knowledge
Management systems & practices
–Leaders learn, value learning, and promote learning in concrete ways
–Communication channels facilitate easy access to information and knowledge sharing
–Systems and structures facilitate learning
Organizational culture
–Supports and rewards reflection & learning and the application of lessons
External environment
–Is conducive to reflection and learning from experience

29 Visualising the connection between laboratory research and practice research (Tabak, 2005, National Institute of Dental and Craniofacial Research, National Institutes of Health)

30 Capacity for organizational learning
Systematically gathering information
Making sense of information
Sharing knowledge and learning
Drawing conclusions and developing guidelines for action
Implementing action plans
Institutionalizing lessons learned and applying them to new and on-going work

31 Research Into Use Programme
How can innovation-system approaches promote and facilitate greater use of research-based knowledge?
–Maximise the poverty-reducing impact of previous research on natural resources
–Develop understanding of how innovation-system approaches contribute to reducing poverty whilst ensuring effective and efficient management of natural resources
Challenges to impact evaluation
–Need to identify critical success factors
–Coherent approaches for spotting ‘potential winners’ among research outputs, in the move from research into innovation
–Mainstream use of new technologies that contribute to poverty reduction and economic growth

