GENEVA EVALUATION NETWORK WORKSHOP: CONFERENCE EVALUATION
Organized by Laetitia Lienart & Glenn O’Neil
Geneva, 16 March 2011


WHY IS IT IMPORTANT TO EVALUATE CONFERENCES? ESSENTIAL FOR…
- Institutional memory
- Continuous learning/improvement
- Accountability

WHAT ARE THE OBJECTIVES OF CONFERENCE EVALUATIONS? ASSESS:
- Process: governance; programme; logistics & onsite support; information/communication
- Immediate outcomes: reactions; learnings/benefits; applications
- Impacts (lasting changes): at individual, organizational and country level

RELEVANT DATA COLLECTION METHODS
- Face-to-face or phone individual interviews (structured & semi-structured)
- Focus group interviews
- Online surveys
- Printed surveys
- Structured observations of key sessions and conference areas
- Review of the conference programme and online resources
- Review of statistical data on conference registration, scholarship recipients, abstracts, etc.
- Review of statistical data and evaluation findings from previous conferences to allow comparison over time

RELEVANT DATA COLLECTION METHODS (cont.)
- Use of rapporteurs to follow sessions addressing key topics; their feedback can also be used to measure some indicators (e.g. number of sessions presenting new findings)
- Use of conference “instant” feedback systems
- Use of network analysis and mapping (see the sketch below)
- Analysis of the conference media coverage
- Review of posts and comments left by delegates and non-attendees on the conference blog, Facebook page and Twitter
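To illustrate the network analysis and mapping method, here is a minimal sketch in Python using the networkx library. The contacts.csv file and its "source"/"target" columns are hypothetical, invented for illustration rather than taken from an actual conference evaluation.

```python
# Minimal, illustrative sketch of network analysis of delegate contacts.
# Assumes a hypothetical contacts.csv with one row per reported contact,
# with columns "source" and "target" (both invented for this example).
import csv
import networkx as nx

G = nx.Graph()
with open("contacts.csv", newline="", encoding="utf-8") as f:
    for row in csv.DictReader(f):
        G.add_edge(row["source"], row["target"])

# Simple network indicators an evaluator might report
print("Delegates in the network:", G.number_of_nodes())
print("Reported connections:", G.number_of_edges())

# Most-connected delegates (potential knowledge brokers)
for name, degree in sorted(G.degree, key=lambda d: d[1], reverse=True)[:5]:
    print(name, degree)
```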

Focus on IMPACT ASSESSMENT
Assessing conference impact(s) is feasible but needs to be planned and budgeted for at the planning stage (incl. in ToRs).
Methods: follow-up survey (online/face-to-face), action plans (a tabulation sketch follows below).
Example: AIDS 2008 follow-up survey (1.5 years after the conference)
- 1,195 AIDS 2008 delegates completed the survey
- About 2/3 had learnt something new and had changed some aspect of their work practice thanks to the new knowledge gained at the conference
- Almost half reported that AIDS 2008 had directly influenced their organizations’ HIV work
- Almost 4 in 10 were aware of AIDS 2008’s influence on HIV work, policies or advocacy in their countries
- 75% had kept in contact with at least one person met at AIDS 2008, mainly to exchange knowledge, lessons learnt and/or suggested solutions (86%)
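Headline figures like those above come from straightforward tabulation of the follow-up survey. Below is a minimal sketch of how such percentages could be computed, assuming a hypothetical responses.csv with one row per respondent and yes/no answer columns; all column names are invented.

```python
# Minimal, illustrative sketch of tabulating follow-up survey responses
# into headline percentages. The responses.csv file and its column names
# ("learnt_something_new", etc.) are hypothetical.
import pandas as pd

df = pd.read_csv("responses.csv")
n = len(df)

questions = ["learnt_something_new", "changed_work_practice",
             "influenced_org_work", "kept_in_contact"]
for q in questions:
    share = (df[q].str.strip().str.lower() == "yes").mean()
    print(f"{q}: {share:.0%} of {n} respondents")
```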

USE OF EVALUATION FINDINGS
- Evaluation findings should be “very usable”, as conferences are often repeated annually or every two years.
- Importance of “buy-in” from conference organizers.
- Share the evaluation plan with conference organizers and committees/working groups*.
- Evaluation reports: the quality of content and format is crucial to attract readers and convince them that evaluation results are reliable and usable.
- Disseminate evaluation results in a timely manner, using a variety of channels depending on the target audience.
- Use follow-up mechanisms** with conference organizers and relevant stakeholders.
- Review progress on evaluation findings in the lead-up to the next conference.

KEY LESSONS LEARNT
1. Over-positive feedback (new strategy to be tested in 2011).
2. Evaluation reports are used more as an accountability & marketing tool than for learning purposes.
3. Unwillingness of conference organizers to devote adequate human & financial resources to evaluation.
4. Impact assessment remains a challenge: it is difficult to measure the extent to which changes, especially in policies, norms & guidelines, are attributable to the conference.
5. Data disaggregation is important to make evaluation results more accurate and useful* (see the sketch below).
6. Conference evaluation provides a unique opportunity to see how findings are integrated (or not) into future conferences.
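On the data disaggregation point (lesson 5), a minimal sketch assuming the same hypothetical responses.csv also carries respondent attributes; the "region" and "overall_rating" columns are invented for illustration.

```python
# Minimal, illustrative sketch of disaggregating one survey result by
# respondent group. Columns "region" and "overall_rating" are hypothetical.
import pandas as pd

df = pd.read_csv("responses.csv")
by_region = (df.groupby("region")["overall_rating"]
               .agg(["count", "mean"])
               .round(2))
print(by_region)
```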

Further information
- Proceedings (slides & handouts) of a 1-day workshop on conference evaluation held in Nov 2010 are available on request (
- Feel free to join the Conference Evaluation Google Group:
- Glenn’s blog has more resources on conference evaluation, see category “event evaluation”: