Indicators Dr Murali Krishna Public Health Foundation of India

Lesson plan
Objective: At the end of the session, every participant should be able to:
– define indicators
– explain the importance of indicators in M&E
– choose appropriate indicators
Time: 1 hour 30 minutes
Mode of delivery: interactive session
Media: PowerPoint presentation
Evaluation: group work

Introduction
Indicators are an essential part of a monitoring and evaluation system: they are what you measure and/or monitor. Through indicators you can ask and answer questions such as:
– Who?
– How many?
– How often?
– How much?

Definition
An indicator is a quantitative or qualitative factor or variable that provides a simple and reliable means to measure achievement or to reflect the changes connected with an intervention. Indicator values are compared over time in order to assess change.

E.g., how did your class go?

Characteristics of indicators
The desired properties of indicators (also known as variables) depend on the approach adopted and the nature of the programme or project being evaluated. Every indicator's values have specific characteristics:
– Numeric: the values are numbers
– Nominal: the values are names with no inherent order (e.g. male and female)
– Continuous: the values can fall anywhere within a range, so the number of possible values is very large
– Ordinal/categorical: the values have a known order (e.g. low to high)
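
A minimal sketch in Python, using made-up indicator names and values, of how these four value types might look in practice; everything here is illustrative and not part of the original material.

```python
# Hypothetical indicator records illustrating the four value types above.
indicators = {
    # Numeric: the value is a number (here, a count)
    "providers_trained": 42,
    # Nominal: the value is a name with no inherent order
    "respondent_sex": "female",           # e.g. "male" or "female"
    # Continuous: the value can fall anywhere within a range
    "cost_per_participant_usd": 2.37,
    # Ordinal/categorical: the values have a known order
    "service_quality": "medium",          # ordered: low < medium < high
}

# Ordinal values can be compared through their position in the known order.
QUALITY_ORDER = ["low", "medium", "high"]
print(QUALITY_ORDER.index(indicators["service_quality"]))  # 1
```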

Characteristics of a good indicator
Indicators will vary from one project to another, according to the work and its context, but in general they are often expected to be:
– SMART: specific, measurable, attainable, relevant and time-bound
– SPICED: subjective, participatory, interpreted, cross-checked, empowering and diverse
– CREAM (Schiavo-Campo 1999, p. 85): Clear (precise and unambiguous), Relevant (appropriate to the subject at hand), Economic (available at a reasonable cost), Adequate (providing a sufficient basis to assess performance) and Monitorable (amenable to independent validation)

Practical considerations for indicators include:
– Measurability: Is the indicator measurable? Is it sufficiently sensitive to an improvement or deterioration in conditions?
– Ease and cost of collection: How easy is it to obtain the information required? How costly will this be? Can the community participate? Are some relevant data already collected?
– Credibility and validity: Are the indicators easy to understand, or will people end up arguing over what they mean? Do they measure something that is important to communities as well as to implementing organizations?
– Balance: Do the selected indicators provide a comprehensive view of the key issues?
– Potential for influencing change: Will the evidence collected be useful for communities, implementers and decision-makers?

Common Indicator Metrics
– Counts: number of providers trained; number of condoms distributed
– Calculations (percentages, rates, ratios): % of facilities with a trained provider; maternal mortality ratio; total fertility rate
– Index and composite measures: a quality index comprising the sum of scores on six quality outcome indicators; DALYs (disability-adjusted life years)
– Thresholds: presence or absence; a pre-determined level or standard
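
A minimal sketch in Python, with entirely made-up numbers, of how metrics of each of these types are typically calculated; the figures are illustrative only.

```python
# Count
condoms_distributed = 12_500

# Percentage: facilities with at least one trained provider
facilities_total = 80
facilities_with_trained_provider = 56
pct_trained = 100 * facilities_with_trained_provider / facilities_total   # 70.0 %

# Ratio: maternal mortality ratio, expressed per 100,000 live births
maternal_deaths = 45
live_births = 30_000
mmr = maternal_deaths / live_births * 100_000                              # 150.0

# Composite index: sum of scores on six quality outcome indicators (0-5 each)
quality_scores = [4, 3, 5, 2, 4, 3]
quality_index = sum(quality_scores)                                        # 21 of 30

# Threshold: compare against a pre-determined level or standard
meets_coverage_standard = pct_trained >= 80                                # False

print(pct_trained, mmr, quality_index, meets_coverage_standard)
```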

Types of indicators, classified by:
– processing of information: elementary, derived and compound indicators
– comparability of information: specific, generic and core indicators
– phase of completion of the programme: resource, output, result and impact indicators
– evaluation criteria: relevance, efficiency, effectiveness and performance indicators
– mode of quantification and use of the information: monitoring and evaluation indicators

Developing indicators
Step 1: Identify the problem situation you are trying to address. Problems might include:
1. the economic situation (unemployment, low incomes, etc.)
2. the social situation (housing, health, education, etc.)
3. the cultural or religious situation (traditional languages falling out of use, low attendance at religious services, etc.)
4. the political or organizational situation (ineffective local government, faction fighting, etc.)

Developing indicators
Step 2: Develop a vision for how you would like the problem areas to look. This will give you impact indicators.
a. What will tell you that the vision has been achieved?
b. What measurable signs will "prove" that the vision has been achieved?
c. For example, if your vision is that the people in your community will be healthy, you can use health indicators to measure how well you are doing:
– Has the infant mortality rate gone down?
– Do fewer women die during childbirth?
– Has the HIV/AIDS infection rate been reduced?
If you can answer "yes" to these questions, then progress is being made.

Developing indicators
Step 3: Develop a process vision for how you want things to be achieved. This will give you process indicators.
Example: if you want success to be achieved through community effort and participation, your process vision might include things like:
– community health workers drawn from the community are trained and offer a competent service used by all
– the community organizes clean-up events on a regular basis, and so on

Developing indicators
Step 4: Develop indicators for effectiveness. For example, if you believe that you can increase the secondary school pass rate by upgrading teachers, then you need indicators showing that you have been effective in upgrading the teachers, e.g. evidence from a survey in the schools compared with a baseline survey.
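
A minimal sketch in Python, with hypothetical pass rates, of the baseline-versus-follow-up comparison described above; the numbers are not from any real survey.

```python
# Hypothetical secondary school pass rates before and after teacher upgrading.
baseline_pass_rate = 0.54   # from the baseline school survey
followup_pass_rate = 0.63   # from the follow-up school survey

change_in_points = (followup_pass_rate - baseline_pass_rate) * 100
print(f"Pass rate changed by {change_in_points:.1f} percentage points")  # 9.0
```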

Developing indicators
Step 5: Develop indicators for your efficiency targets. Here you can set indicators such as: planned workshops are run within the stated timeframe; workshop costs are kept to a maximum of US$2.50 per participant; no more than 160 hours in total of staff time are spent on organizing a conference; no complaints are received about conference organization; and so on.
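
A minimal sketch in Python checking two of the efficiency targets listed above against hypothetical workshop and conference figures; the data are illustrative only.

```python
# Hypothetical figures for one workshop and one conference.
workshop_cost_usd = 180.00
participants = 75
staff_hours_on_conference = 152

cost_per_participant = workshop_cost_usd / participants        # 2.40
within_cost_target = cost_per_participant <= 2.50              # True
within_time_target = staff_hours_on_conference <= 160          # True

print(round(cost_per_participant, 2), within_cost_target, within_time_target)
```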

Common Pitfalls in Indicator Selection
– Indicators not linked to programme activities
– Poorly defined indicators
– Indicators that do not currently exist and cannot realistically be collected
– Process indicators used to measure outcomes and impacts
– Indicators that are not very sensitive to change
– Too many indicators