

1 of 20 Evaluating an Information Project: From Questions to Results
IMARK: Investing in Information for Development. © FAO 2005.

2 of 20 Learning Objectives
At the end of this lesson you should be able to:
- identify the specific impacts that you want to measure;
- be aware of the assumptions involved in measuring change;
- define the most appropriate indicators for your selected impacts; and
- be aware of several methods for collecting data.

3 of 20 Introduction
To examine the issues involved in planning an evaluation, we are following a scenario: Dr Kumar, the Director of Publications at SMAU University, must evaluate a newsletter.
Dr Kumar and his colleagues have held the first Evaluation Management Committee (EMC) meeting to reach agreement on the focus of the evaluation. The EMC conducted a preliminary analysis of the RICE NEWS project, examining Needs, Beneficiaries, Activities and Outcomes. A SWOT analysis also identified potential risks and benefits of the evaluation.
Dr Kumar, Jon and Lara have built a Logic Model, which has provided an intellectual basis for the evaluation. Each of the "if-then" links identified has helped define what to measure.

4 of 20 Introduction
Now, here are two important sets of questions:
1. Are all potential impacts of your project equally important? Do you want to try to measure all of them?
2. If any given impact contains the word "more" (for example, "more awareness", "more knowledge" or "more skill"), how can you best measure such a change?

5 of 20 Setting priorities
Let's start with the first question: which impacts are most important to your stakeholders? A priority-setting process will help you avoid two common mistakes:
1. Taking the easy way out
2. Collecting too much data

6 of 20 Setting priorities
Here is a useful method to avoid these mistakes:
1. Call another EMC meeting.
2. Use the card technique.
3. Ask each member of the EMC to write down the specific questions that he or she thinks the evaluation should focus on (one question per card).

7 of 20 Setting priorities
4. Gather up the cards, pin or tape them to a large board, and lead a discussion in which members try to group them.
5. Raise four supplementary issues for each of the questions on the cards:
- Who is the information for?
- How will this intended user actually use it?
- How will you collect it?
- What resources will you need to collect it?
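The grouping step in the card technique can be sketched in code. This is an illustrative sketch only: the sample questions, the theme labels, and the `group_cards` helper are hypothetical and not part of the IMARK materials.

```python
from collections import defaultdict

# Hypothetical cards: one evaluation question each, plus the theme the
# EMC assigned to it during the pin-board discussion.
cards = [
    {"question": "Did farmers' knowledge of rice varieties increase?", "theme": "knowledge"},
    {"question": "How many copies of RICE NEWS were requested?", "theme": "reach"},
    {"question": "Did readers change their planting practices?", "theme": "behaviour"},
    {"question": "How many hits did the newsletter's web site get?", "theme": "reach"},
]

def group_cards(cards):
    """Group the question cards by theme, as the EMC does on the board."""
    groups = defaultdict(list)
    for card in cards:
        groups[card["theme"]].append(card["question"])
    return dict(groups)

grouped = group_cards(cards)
for theme in sorted(grouped):
    print(f"{theme}: {len(grouped[theme])} question(s)")
```

Once the questions are grouped, the supplementary issues (user, use, collection method, resources) can be discussed per group rather than per card.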

8 of 20 Measuring change
Now let's take the second set of questions: if any given impact contains the word "more" (for example, "more knowledge"), how can you best measure such a change?
Ideally, we need both "before" and "after" data if the evaluation is going to measure change. But what if your information project lacks this "before" data?

9 of 20 Measuring change
The best approach is to look at any question that contains the word "more":
- How do we measure the change that that word implies?
- Do we really have questions (indicators) to measure such a change?
- Are our questions (indicators) valid, reliable and convincing?
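The before-and-after logic can be made concrete with a small sketch. The respondent names and scores below are entirely made up; the point is only that measuring "more knowledge" requires a baseline value and a follow-up value for the same respondents.

```python
# Hypothetical pre- and post-project knowledge scores (e.g., correct
# answers out of 10) for the same three respondents.
before = {"farmer_a": 4, "farmer_b": 6, "farmer_c": 5}
after  = {"farmer_a": 7, "farmer_b": 6, "farmer_c": 8}

def mean_change(before, after):
    """Average change per respondent; needs both 'before' and 'after' data."""
    changes = [after[r] - before[r] for r in before]
    return sum(changes) / len(changes)

print(mean_change(before, after))  # → 2.0 (average gain in score)
```

Without the `before` dictionary there is nothing to subtract from, which is exactly the problem a project faces when no baseline data were collected.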

10 of 20 Measuring change
The biggest challenge in deciding on questions (data) is to balance what you would like to collect with what you can collect. Here are four "rules" regarding question selection. Make sure that:
- you've done a systematic analysis of all your questions;
- you know what types of data they're likely to generate;
- you're aware of any potential limitations in these data; and
- you've already thought about how they should be interpreted.

11 of 20 Collecting data
Let's go back to SMAU. The EMC has used the card technique to come to agreement on the questions to be asked and the data to be collected. Now the issue is HOW to collect these data:
- Where are these data?
- How can we best collect them?
- What resources do we need?

12 of 20 Collecting data
WHERE ARE THE DATA?
1. In project records
2. In people's attitudes and behaviours
Content determines location, location determines method, and method determines resources.

13 of 20 Collecting data
Let's start with project records. Some of the data that could be found in project records are:
- data on funds spent;
- the number of person-days that have gone into the project;
- the number of copies produced;
- the number of requests for hard copies; and
- the number of hits on the web site.
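Data from project records usually only need to be tallied up. A minimal sketch, assuming the records have been transcribed into a list of metric entries (the entry format and figures are hypothetical):

```python
# Hypothetical project-record entries, e.g., one per quarter.
records = [
    {"metric": "hard_copy_requests", "value": 120},
    {"metric": "web_hits", "value": 3400},
    {"metric": "hard_copy_requests", "value": 80},
    {"metric": "web_hits", "value": 2600},
]

def summarise(records):
    """Total each metric across all record entries."""
    totals = {}
    for entry in records:
        totals[entry["metric"]] = totals.get(entry["metric"], 0) + entry["value"]
    return totals

totals = summarise(records)
print(totals)  # e.g. totals per metric for the whole project period
```

Because record data already exist, this category is cheap to collect; the attitude and behaviour data on the next slides are the expensive part.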

14 of 20 Collecting data
Now for the second category: data in people's attitudes and behaviours. Suppose your EMC has developed a list of questions to find out about farmer knowledge. How are you going to ask these questions?
The most common ways are questionnaires and face-to-face interviews. In both cases, it is possible to use open-ended questions, fixed-response questions, or both.

15 of 20 Collecting data
Interviews or questionnaires? Here are some issues to consider:
1. We will probably get a higher response rate from interviews. But how high a response rate do we really need?
2. Interviews will be more expensive and time-consuming. Will the benefits that we gain from face-to-face contact outweigh the costs?
3. During interviews, we'll be able to ask follow-up questions. But do we have the organizational capacity to analyze these data once we have collected them?
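The response-rate and cost trade-off can be worked through as back-of-the-envelope arithmetic. All figures below (response rates, per-respondent costs, numbers contacted) are invented placeholders, not FAO data; only the comparison logic is the point.

```python
def expected_responses(contacted, response_rate):
    """How many completed responses we expect from those contacted."""
    return contacted * response_rate

def total_cost(contacted, cost_per_contact):
    """Total cost of contacting this many respondents."""
    return contacted * cost_per_contact

# Suppose the budget allows contacting 200 people by questionnaire
# or 40 people by face-to-face interview (hypothetical figures).
q_responses = expected_responses(200, 0.30)  # 60.0 completed questionnaires
i_responses = expected_responses(40, 0.90)   # 36.0 completed interviews
q_cost = total_cost(200, 2)                  # e.g., printing and postage
i_cost = total_cost(40, 25)                  # e.g., travel and staff time
```

Under these assumed figures, questionnaires yield more responses at lower cost, while interviews trade volume for depth (follow-up questions) — which is exactly the judgement the EMC must make.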

16 of 20 Collecting data
Whichever technique you use, you'll have to decide about respondents:
- Whom should you interview?
- To whom should you send questionnaires?
- What proportion of stakeholders should you interview or send questionnaires to?
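Choosing a proportion of stakeholders amounts to drawing a sample. A minimal sketch using simple random sampling; the stakeholder list and the 20% proportion are hypothetical (in practice the list would come from, say, the newsletter's subscriber records):

```python
import random

# Hypothetical stakeholder list standing in for a real subscriber roster.
stakeholders = [f"subscriber_{i}" for i in range(100)]

def draw_sample(population, proportion, seed=0):
    """Draw a simple random sample covering the given proportion."""
    k = round(len(population) * proportion)
    rng = random.Random(seed)  # fixed seed so the draw is reproducible
    return rng.sample(population, k)

sample = draw_sample(stakeholders, 0.20)  # survey 20% of stakeholders
```

Simple random sampling is only one option; if some stakeholder groups matter more than others, a stratified draw per group may be more appropriate.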

17 of 20 Analyzing results
After we collect our data, the next step will be to analyze them. I have identified the three final stages in our evaluation:
1. From data collection to organization
2. From organization to analysis and conclusions
3. From analysis to a report and publication

18 of 20 Analyzing results
Each of these stages represents a process that:
- must be planned;
- requires resources;
- must have clearly defined responsibilities and procedures; and
- produces outputs.
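The three stages can be pictured as a pipeline in which each stage takes the previous stage's output as its input. The sketch below is purely illustrative: the function names, the toy yes/no responses, and the data shapes are assumptions, not part of the module.

```python
def organise(raw_responses):
    """Stage 1: from data collection to organization (clean and sort)."""
    return sorted(r.strip().lower() for r in raw_responses if r.strip())

def analyse(organised):
    """Stage 2: from organization to analysis (tally each answer)."""
    counts = {}
    for answer in organised:
        counts[answer] = counts.get(answer, 0) + 1
    return counts

def report(analysis):
    """Stage 3: from analysis to a report (one line per finding)."""
    return "\n".join(f"{answer}: {n}" for answer, n in sorted(analysis.items()))

# Toy fixed-response data, including the messy blanks and casing that
# make the organization stage necessary in the first place.
raw = ["Yes", " yes", "No", "yes", ""]
print(report(analyse(organise(raw))))  # → "no: 1" then "yes: 3"
```

Seeing the stages as functions makes the slide's point concrete: each stage has defined inputs, outputs and responsibilities, so each can be planned and resourced separately.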

19 of 20 Analyzing results
If I had to do it all over again, I would think about evaluation right at the beginning of the project, and define the stakeholders, impacts and evaluation questions from the start. I think that an evaluation is really a project in itself: it has to be planned and managed, and it could even be evaluated.

20 of 20 Summary
When planning an evaluation, the important issues are setting priorities and measuring change. It is important to:
a) conduct a systematic analysis of all your questions;
b) know what types of data they're likely to generate;
c) be aware of any potential limitations in these data; and
d) think about how they should be interpreted.
The next issue is how to collect the data. The questions are: Where are these data? How are we going to collect them? What resources will we need?
The final issue is how to analyze the data. The analysis is conducted in three stages: a) from data collection to organization; b) from organization to analysis and conclusions; and c) from data analysis to a report and publication.