Carol J. Pierce Colfer, Center for International Forestry Research & Cornell International Institute for Food, Agriculture and Development

Why do qualitative M&E?
Three examples of M&E in action:
1. Quilcene, WA: educational research
2. Global comparative research on assessment of sustainable forest management
3. Global comparative action research with local communities
Some dangers and conclusions

- Need for holistic understanding of a situation & how it has changed.
- An unexpected finding emerged that hadn't been measured initially.
- An external, post-facto analysis is demanded on issues not initially assessed.
- The team has qualitative, not quantitative, skills.

[Typically] involves:
- Participant observation
- Long-term residence in the research context
- A holistic, inductive, open-ended orientation
- Varying degrees of independence from the project being evaluated

- 10 rural US schools, given grants to experiment with their local schools
- External, long-term M&E, both qualitative (fieldwork) & quantitative (cross-site)
- Field researchers lived in the communities for ~3 years, documenting what happened and helping with cross-site studies, teasing out what went right and what went wrong.

A central idea has been that criteria & indicators (C&I) can be used to monitor, assess, & even define a subject of interest.
- Hierarchy of Principles, Criteria, Indicators, Verifiers (see the sketch after the examples below)
- Ideal ones are SMART (Specific, Measurable, Achievable, Realistic, Timely)
BUT some topics are difficult (impossible?) to quantify:

1. Greater self-confidence among women & other marginalized groups
2. Improved knowledge of regulations among groups previously uninvolved
3. Involvement in enforcing sanctions by a broader spectrum of stakeholders
4. Closer links between communities & outsiders (government officials, industry, projects, academics)
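To make the hierarchy concrete, here is a minimal sketch in Python (not CIFOR's actual tooling) of the Principles > Criteria > Indicators > Verifiers structure, with a simple SMART checklist per indicator; every class, field, and example name in it is invented for illustration.

from dataclasses import dataclass, field

@dataclass
class Indicator:
    text: str
    verifiers: list = field(default_factory=list)
    # One flag per SMART property, set by the team's judgement.
    smart: dict = field(default_factory=lambda: dict.fromkeys(
        ("specific", "measurable", "achievable", "realistic", "timely"), False))

    def is_smart(self):
        return all(self.smart.values())

@dataclass
class Criterion:
    text: str
    indicators: list = field(default_factory=list)

@dataclass
class Principle:
    text: str
    criteria: list = field(default_factory=list)

# Topic 1 above, expressed as an indicator with ethnographic verifiers.
confidence = Indicator(
    text="Greater self-confidence among women & other marginalized groups",
    verifiers=["Who speaks up at village meetings?",
               "Who takes on new roles in community projects?"])
confidence.smart.update(specific=True, realistic=True)  # hardly 'measurable'
print(confidence.is_smart())  # False: a candidate for qualitative methods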

Aim was to develop widely agreed-upon C&I to define sustainable forest management (SFM), & for use in monitoring & assessing it (initially, in timber certification). A series of 1-month, interdisciplinary, international field visits compared & honed existing sets of C&I to find those that would work in each country studied, using a series of filtering steps (described in CIFOR Toolbox No. 1, and sketched below).
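As a rough illustration of that filtering idea, the sketch below passes candidate C&I through successive screens. The screens named here are hypothetical placeholders; the actual filtering steps are the ones described in CIFOR Toolbox No. 1.

def filter_candidates(candidates, screens):
    """Keep only the candidate C&I that survive every screen, in order."""
    surviving = list(candidates)
    for screen in screens:
        surviving = [c for c in surviving if screen(c)]
    return surviving

# Hypothetical screens, for illustration only.
screens = [
    lambda c: c["locally_relevant"],     # does it matter at this site?
    lambda c: c["field_assessable"],     # can a 1-month team assess it?
    lambda c: not c["redundant"],        # already covered by another item?
]

candidates = [
    {"text": "Forest cover is maintained", "locally_relevant": True,
     "field_assessable": True, "redundant": False},
    {"text": "Tenure rules are respected", "locally_relevant": True,
     "field_assessable": False, "redundant": False},
]
print([c["text"] for c in filter_candidates(candidates, screens)])
# -> ['Forest cover is maintained']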

A long-term, learning-based approach involving (facilitated) community groups:
- Identifying shared future goals
- Analyzing, planning, & implementing what is needed to reach those goals
- Monitoring progress & revising plans accordingly
- Linking productively with relevant external actors

Facilitator/researchers worked with communities in 11+ countries (vertically, horizontally & iteratively). Teams assessed community progress in ways that worked in their contexts:
- C&I that local people developed
- Repeated 'reflection' meetings
- Use of ethnographic observations

- Ongoing M&E by community members: are we reaching our community/group goals?
- M&E by researcher/facilitators: to what degree is ACM (adaptive collaborative management) actually empowering or enriching people, &/or enhancing their well-being or environments?
- Result: a complicated life for researcher/facilitators, and a qualitative, cross-site assessment examining & comparing site experiences (7-dimension framework; see the sketch below)
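One way to picture that cross-site assessment is as a grid of short narrative judgements, one per site per dimension. A hedged sketch follows; the seven dimension names and the site entries are placeholders, since the slide does not spell out the actual framework.

# Placeholder names: the actual 7 dimensions are not listed on the slide.
dimensions = ["dimension_%d" % i for i in range(1, 8)]

# Short narrative judgements per site: qualitative notes, not numeric scores.
sites = {
    "site_A": {"dimension_1": "strong progress toward shared goals",
               "dimension_2": "reflection meetings held irregularly"},
    "site_B": {"dimension_1": "early stage; goals still being negotiated"},
}

# Print a simple comparison grid; missing cells show as "(no data)".
for dim in dimensions:
    for site, notes in sites.items():
        print(dim, site, notes.get(dim, "(no data)"))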

- Getting too complex, e.g., the Landscape Mosaic project's 4 levels of monitoring
- Producing something so holistic, 'deep' and long that no one will ever read it (cf. quantitative baseline surveys so long & complex that data never get entered, let alone analyzed)
- [As with any method] Being swayed by your own ideological biases, or by someone's (donors', employers') desire for evidence of success

Conducting qualitative M&E can be an uphill battle: donors, policymakers, & many researchers prefer quantitative assessments. BUT qualitative M&E can often provide valuable insights & unexpected findings that are not available with conventional quantitative approaches. IDEALLY: combine the two!