Pluralité des connaissances scientifiques et intervention publique: agriculture, environnement, et développement durable. Policy evaluation and empirical evidence: lessons from EBP methods.


Pluralité des connaissances scientifiques et intervention publique: agriculture, environnement, et développement durable.
Policy evaluation and empirical evidence: lessons from EBP methods.
Rio Seminar, August 29th, 2008.
Marielle BERRIET-SOLLIEC (1), Jacques BAUDRY (2), Pierre LABARTHE (3)
(1) ENESAD, UMR CESAER, Dijon (France); (2) INRA SAD Paysage, Rennes (France); (3) INRA SAD-APT, Paris (France)

Outline of the presentation
1. The diversity of methods of policy evaluation and the problem of validity
- Evaluation models based on the program theory
- Three methods for the analysis of causal relations
- The lack of confrontation with field evidence
- Two examples: agricultural extension and agri-environment
2. EBP methods and the evaluation of public policies
- Systematic reviews of the scientific literature and the hierarchy of levels of evidence
- Quantitative methods to assess impacts
3. Implementation of quantitative/qualitative methods of evaluation
- The necessity of dedicated organisations
- A limit: the lack of data to test hypotheses derived from theories about the schemes of causality induced by public policies

Section 1. The diversity of methods of policy evaluation and the problem of empirical validity. Rio Seminar, August 29th, 2008.

The diversity of methods of policy evaluation and the problem of validity
 The evaluation model embedded in the program theory
- Definition (Hansen 2005)
- The difficulty of identifying causal relations. Examples: agricultural extension (Labarthe 2006), agri-environment
 Three methods for describing schemes of causal relations
- Policy analysis: analysis of stakeholder networks and relations, and of the hierarchy of objectives of public policies (Berriet-Solliec 2007)
- Micro-economic analysis of the relations between variables through modelling methodologies (Heckman et al. 1999)
- Experimental settings (comparison between groups with public support and a control group), designed at the start of policy implementation

Illustration in the case of agricultural extension
 Policy analysis
- Example: application of the Soft Systems Methodology to the design of public extension education projects (Navarro et al. 2008)
- Problem of the choice of stakeholders; no measurement of the impacts of the extension project
 Micro-economic analysis
- Standard modelling aimed at determining the financial conditions of equilibrium between demand and supply (Dinar 1996, Dinar and Keynan 2001)
- Very low level of empirical content
 Experimental methods
- Quasi-experimental methods (comparison of the productivity of a group of farmers with public extension and a control group without)
- Very few examples of application (Davis and Nkonya 2008)

Illustration in the case of agri-environment
1. Agri-environment measures for biodiversity are based on obligations of means, not results
- Farmers adopt a measure on a voluntary basis and choose the field(s) where it will be applied
- They must follow certain practices, which can be the ones they already have
2. The measures are short term, and there is no monitoring
- Monitoring should include a pre-measure assessment of the fluctuations of the populations of the target species
- When only the first and final years are monitored, it is difficult to assess the efficiency of the measure
3. The measures are applied at field scale
- Even an increase of a population in one field does not ensure its global sustainability
- The area of land under the measure is not an ecological criterion
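The monitoring problem in point 2 can be made concrete with a small difference-in-differences sketch (a hedged illustration; all counts are invented): comparing only the first and final years on fields under the measure confounds its effect with natural population fluctuations, whereas comparable control fields recover the background trend.

```python
# Hypothetical illustration: why a baseline and a control group matter
# when assessing an agri-environment measure. All numbers are invented.

def difference_in_differences(treated_before, treated_after,
                              control_before, control_after):
    """Change on the treated fields, net of the background fluctuation
    observed on comparable control fields over the same period."""
    background_trend = control_after - control_before
    raw_change = treated_after - treated_before
    return raw_change - background_trend

# Bird counts per field (invented): populations rise everywhere,
# e.g. after a favourable winter, not only under the measure.
effect = difference_in_differences(treated_before=20, treated_after=30,
                                   control_before=20, control_after=28)
print(effect)  # 2: most of the raw +10 change is background fluctuation
```

Monitoring only the treated fields would report the raw +10 change and overstate the measure's efficiency fivefold in this toy example.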

A major limit: a lack of confrontation with empirical evidence
 The lack of field evidence
- Lack of data to test hypotheses derived from theories about the schemes of causal relations of policies (Laurent 2007)
- This raises the question of the empirical content of evaluation methods: how are hypotheses about causal relations tested and validated? How can the building of theoretical models be reconciled with the accumulation of knowledge and observations from the field? (Lawson 2003)
 Evidence-Based Policy (EBP) methods as a possible solution?

Section 2. The EBP method. Rio Seminar, August 29th, 2008.

EBP method: the hierarchy of evidence, a key step
 Classification of evaluation methods by the quality, in terms of empirical validity, of the evidence produced (from lowest to highest level of empirical validity):
1. Opinion of respected authorities, based on clinical experience, descriptive studies or reports of expert committees
2. Evidence from historical comparisons; evidence from cohort or case-control analytical studies
3. Evidence from well-designed controlled trials without randomization
4. Evidence obtained from at least one properly randomized controlled trial

EBP method: quantitative methods
 Ex-post evaluation: testing hypotheses about the causal relations of an implemented policy through quantitative methods of impact measurement (Zahm et al. 2008)
- Impact = difference between the [situation with public support] and the [situation without public support]
- Two solutions: comparison over time, or comparison with a control group
- Difficulties: access to data, selection bias
 Main methodologies (Schmitt et al. 2008)
- Estimation on panel data
- Matching methods
 Combining quantitative and qualitative approaches
- Quantitative methods test the robustness of hypotheses about causal relations, but give no explanation of the mechanisms, hence the need to combine them with qualitative approaches
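The matching idea above can be sketched in a few lines (a hypothetical toy, not the methodology of Schmitt et al.): each supported unit is paired with the most similar unsupported unit on observable descriptive variables, and the estimated impact is the average outcome gap across pairs.

```python
# Minimal nearest-neighbour matching sketch of the impact estimator:
# impact = [situation with public support] - [situation without support],
# approximated by comparing each supported farm with the most similar
# unsupported farm. Data and variable names are hypothetical.

def nearest_neighbour_impact(treated, controls):
    """Average treated-minus-matched-control outcome.

    treated, controls: lists of (covariates, outcome) pairs, where
    covariates is a tuple of descriptive variables (e.g. farm size, age).
    """
    def distance(x, y):
        return sum((a - b) ** 2 for a, b in zip(x, y))

    gaps = []
    for x_t, y_t in treated:
        # match on observables to limit selection bias
        _, y_match = min(controls, key=lambda c: distance(x_t, c[0]))
        gaps.append(y_t - y_match)
    return sum(gaps) / len(gaps)

# Invented example: (farm size in ha, farmer age) -> productivity index
treated = [((50, 40), 110.0), ((80, 55), 130.0)]
controls = [((48, 42), 100.0), ((82, 53), 125.0), ((20, 30), 90.0)]
print(nearest_neighbour_impact(treated, controls))  # 7.5
```

The sketch also shows the slide's two difficulties: it needs individual-level data on both groups, and it only corrects selection bias on the observed covariates.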

Section 3. Implementation: requirements and difficulties. Rio Seminar, August 29th, 2008.

An illustration of a policy evaluation combining quantitative and qualitative methods: the case of agricultural extension
 Very few examples; the IFPRI project (Birner et al. 2006)
- Designing a systemic framework for the institutional analysis of the causal relations of the diverse reforms of agricultural extension services in different contexts: "from best practice to best fit"
- Using quantitative experimental methods for the evaluation of impacts
 Some difficulties
- Diversity of contexts of application of policies
- Lack of data, hence a need to perform field experiments and investigations
- Lack of systematic analyses of the scientific literature about agricultural extension reforms
 Hence a need for dedicated organisations for EBP and for quantitative/qualitative methods of public policy evaluation?

EBP implementation: the need for dedicated organisations
 An example: the Social Science Research Unit (SSRU) of the University of London (Oliver et al. 2005)
- Collaboration with the EPPI-Centre
- Proposes new models for systematic reviews of the scientific literature on research combining quantitative and qualitative data
 Goals of building the scientific state of the art
- Producing information for policymakers about their regulations
- Setting up a framework for the evaluation of these regulations
 Originality
- Distinguishing, within the scientific literature, between stakeholders' opinions and impact studies
- Combining new methodologies for evaluation with a theoretical analysis

EBP implementation: two difficulties, lack of data and of theorisation
 Limits of the EBP approach (Cartwright 2007, Kirsch and Laurent 2007)
- The definition and status of "evidence" vary considerably among the numerous evaluations inspired by the EBP approach
- Lack of precision in the category of evidence used: does it really allow testing a hypothesis derived from a scientific theory about the causal relations induced by a public policy?
 A major problem: data availability versus a high level of requirements
- A need for two groups: a publicly supported group and a control group
- A need for two types of variables: descriptive variables for each individual, and indicators of performance relative to the goals of the public policy
- A need for time series and panel data
 Yesterday's presentation by Allsopp, Baudry and Burel dealt with these questions
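The data requirements listed above can be stated as a small check on a toy panel dataset (field names and values are hypothetical): one row per individual and year, with group membership, descriptive variables, a performance indicator, and at least two time points per individual.

```python
# Toy panel meeting the three requirements on the slide: two groups,
# two types of variables, and repeated observations over time.
# All field names and values are hypothetical.

records = [
    {"farm": "A", "year": 2006, "group": "supported", "size_ha": 50, "outcome": 1.2},
    {"farm": "A", "year": 2008, "group": "supported", "size_ha": 50, "outcome": 1.5},
    {"farm": "B", "year": 2006, "group": "control",   "size_ha": 48, "outcome": 1.1},
    {"farm": "B", "year": 2008, "group": "control",   "size_ha": 48, "outcome": 1.2},
]

def meets_requirements(rows):
    """True when the dataset has both groups, the descriptive and
    performance variables, and at least two years per individual."""
    has_both_groups = {"supported", "control"} <= {r["group"] for r in rows}
    has_variables = all("size_ha" in r and "outcome" in r for r in rows)
    years_per_farm = {}
    for r in rows:
        years_per_farm.setdefault(r["farm"], set()).add(r["year"])
    is_panel = all(len(years) >= 2 for years in years_per_farm.values())
    return has_both_groups and has_variables and is_panel

print(meets_requirements(records))  # True
```

Dropping either group, either variable type, or the repeated years makes the check fail, which is the practical form the "lack of data" limit takes.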

Discussion. Rio Seminar, August 29th, 2008.

Discussion
 EBP: a way to take empirical findings and empirical validity into account... but rarely used in ex ante evaluation in France
 In ex post evaluation, a necessity to build indicators linked to potential effects (outcomes and impacts)
 Evidence-based agricultural policy does not exist yet... its development needs specific political and financial support: the creation of an evidence policy centre dedicated to agriculture and sustainable development
 A link to critical realism (Lawson 2003) and to scientific methods which pay attention to reality