Pluralité des connaissances scientifiques et intervention publique: agriculture, environnement, et développement durable. Policy evaluation and empirical evidence: lessons from the EBP debates.


Pluralité des connaissances scientifiques et intervention publique: agriculture, environnement, et développement durable. Policy evaluation and empirical evidence: lessons from the EBP debates. Paris Seminar, June 5th, 2009. Marielle BERRIET-SOLLIEC, Pierre LABARTHE, Jacques BAUDRY, Catherine LAURENT

Introduction (1/2)
- “Boom” of policy evaluations: a tool for the modernization of public policies
- Emergence of typologies of public policy evaluations
  - Hansen (2005), six types:
    - Economic models (1. cost/benefit; 2. cost/effectiveness; 3. cost/advantage)
    - 4. Results models
    - 5. Explanatory process models
    - 6. Programme theory models
  - Perret (2001), five evaluation paradigms:
    - 1. Accountability
    - 2. Social value
    - 3. Utilization-focused evaluation
    - 4. Theory-driven evaluation
    - 5. Experimental methods
- These typologies are rather complex constructions that mix considerations about goals, theories and methods…
- … but pay little attention to empirical validity

Introduction (2/2)
- Aim of our paper:
  - Starting point: a simple “typology” of evaluation procedures according to their aim (learning / measuring / understanding)
  - Main idea: to use the outcomes of the EBP debates to discuss the empirical validation of these procedures
    - Types of evidence
    - Levels of evidence
  - Two case studies:
    - Agricultural extension
    - Biodiversity policies

Pluralité des connaissances scientifiques et intervention publique: agriculture, environnement, et développement durable. Section 1. Rethinking typologies of procedures for the evaluation of public policies according to their aim. Paris Seminar, June 5th, 2009.

Different aims for evaluating public policies
- Learning
  - Aim: evaluation as an instrument to learn
  - Methods: large forums or participative approaches
  - Ex (extension): soft system methodology for extension programmes
  - Ex (biodiversity): SMA
- Measuring
  - Aim: assessing the specific effect of a public policy
  - Methods: quantitative experimental settings / econometrics (a matching sketch follows below)
  - Ex (extension): matching for farmers’ field school assessment (Kenya)
  - Ex (biodiversity): matching for “prime à l’herbe” assessment
- Understanding
  - Aim: identifying the causality scheme of a policy
  - Methods: very diverse (from pure theory to monographs)
  - Ex (extension): integrating extension in an agricultural production function
  - Ex (biodiversity): UK agri-environmental schemes and their effects on birds
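To make the “measuring” row concrete, here is a minimal sketch of a nearest-neighbour matching estimator of the kind mentioned for the farmers’ field school assessment. The data and variable names are synthetic and illustrative, not from the Kenyan study:

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic data: x = farm covariates, t = participation in the extension
# programme, y = outcome (e.g. yield). Illustrative only.
n = 500
x = rng.normal(size=(n, 2))                                  # e.g. farm size, education
t = (x @ np.array([0.8, -0.5]) + rng.normal(size=n) > 0).astype(int)
y = 2.0 * t + x @ np.array([1.0, 0.5]) + rng.normal(size=n)  # true effect = 2.0

def att_nearest_neighbour(x, t, y):
    """Average treatment effect on the treated via 1-nearest-neighbour
    covariate matching: each participating farm is compared with the
    closest non-participating farm in covariate space."""
    treated = np.flatnonzero(t == 1)
    control = np.flatnonzero(t == 0)
    effects = []
    for i in treated:
        d = np.linalg.norm(x[control] - x[i], axis=1)  # distances to all controls
        j = control[np.argmin(d)]                      # closest control farm
        effects.append(y[i] - y[j])
    return float(np.mean(effects))

print(f"Estimated ATT: {att_nearest_neighbour(x, t, y):.2f} (true effect: 2.0)")
```

In the slide’s terms, matching targets the specific effect of the policy by comparing treated farms only with observably similar untreated ones.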

Pluralité des connaissances scientifiques et intervention publique: agriculture, environnement, et développement durable. Section 2. EBP and the debates about empirical evidence. Paris Seminar, June 5th, 2009.

Different types of evidence
- Evidence of existence
  - Definition: verifying the facts at field level
  - Ex (extension): accountability (investments and field operations)
  - Ex (biodiversity): faunistic and floristic inventories
- Evidence of effectiveness
  - Definition: measuring the specific effects of a policy on its goal
  - Ex (extension): yield growth of farms benefiting from extension
  - Ex (biodiversity): positive correlation between a policy and the observable level of biodiversity
- Evidence of causality
  - Definition: identifying the cause-and-effect relations of a policy (the effectiveness/causality distinction is formalised below)
  - Ex (extension): services → new knowledge → new practices → effects
  - Ex (biodiversity): policy → landscape mosaic → reproduction conditions → biodiversity
- Evidence of harmlessness
  - Definition: verifying that a policy has no adverse side-effects
  - Ex (extension): effects of extension programmes on income inequalities between farmers
  - Ex (biodiversity): effects of biodiversity policies on income inequalities between farmers
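The distinction between effectiveness and causality can be given a compact form. A sketch in our own notation (not the paper’s): effectiveness is a net contrast on the target outcome, while causality decomposes that contrast along the postulated chain, here with a single linear mediator:

```latex
% Evidence of effectiveness: a net effect of the policy T on its target Y,
% with no claim about mechanism:
\Delta = \mathbb{E}[Y \mid T=1] - \mathbb{E}[Y \mid T=0]

% Evidence of causality: the effect is traced through the chain, e.g.
% policy -> landscape mosaic M -> biodiversity Y. In a linear mediation model
Y = \alpha + \beta M + \varepsilon, \qquad M = \gamma + \delta T + \eta,
% the effect transmitted through the chain is the product of the links:
\frac{\partial Y}{\partial T} = \beta\,\delta
```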

EBP method: hierarchy of evidence, a key step
- Classification of evaluation methods by the quality / empirical validity of the evidence they produce, from lowest to highest level of empirical validity (operationalised in the sketch below):
  1. Opinion of respected authorities, based on clinical experience, descriptive studies or reports of expert committees.
  2. Evidence from historical comparisons; evidence from cohort or case-control analytical studies.
  3. Evidence from well-designed controlled trials without randomization.
  4. Evidence obtained from at least one properly randomized controlled trial.
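As an illustration of how such a hierarchy can be used when screening an evidence base, here is a minimal sketch; the level numbers follow the slide (4 = highest empirical validity) and the study labels are invented:

```python
# Levels of evidence from the slide: a higher number means stronger
# empirical validity of the evaluation method.
EVIDENCE_LEVELS = {
    "expert_opinion": 1,         # respected authorities, descriptive studies
    "historical_comparison": 2,  # also cohort / case-control studies
    "non_randomized_trial": 3,   # well-designed controlled trial, no randomization
    "rct": 4,                    # properly randomized controlled trial
}

def strongest_evidence(studies):
    """Return the (name, design) pair with the highest evidence level;
    designs must be keys of EVIDENCE_LEVELS."""
    return max(studies, key=lambda s: EVIDENCE_LEVELS[s[1]])

# Hypothetical evidence base for an extension programme:
studies = [
    ("expert committee report", "expert_opinion"),
    ("regional cohort study", "historical_comparison"),
    ("pilot trial on self-selected farms", "non_randomized_trial"),
]
print(strongest_evidence(studies))
# -> ('pilot trial on self-selected farms', 'non_randomized_trial')
```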

Pluralité des connaissances scientifiques et intervention publique: agriculture, environnement, et développement durable. Section 3. Revisiting evaluation methods for extension and biodiversity programmes. Paris Seminar, June 5th, 2009.

Better understanding the trade-off between “understanding” and “measuring” (1/2)
- Case study 1: does extension commercialization work?
- A policy based on standard economic assumptions: “efficiency and demand-driven extension” (Baxter; Carney)
- Most evaluations are based on standard modelling that compares public and private efficiency (a sketch of such a specification follows below)
  - “Ad-hoc” methodologies and validation that do not fit the EBP recommendations about evidence of effectiveness
  - Strong assumptions that are not discussed with respect to their empirical validity (individual level without interactions between farmers, short-term analyses, no back-office considerations)…
  - … even though other approaches highlight the failures that can derive from such assumptions (asymmetry of information, innovation networks, technological lock-in, etc.)
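For concreteness, the “standard modelling” criticised here typically inserts extension into a farm production function; a hedged sketch of such a specification, in our notation:

```latex
% Farm output y_i as a function of conventional inputs x_i and access to
% extension services E_i (a dummy or an intensity measure):
\ln y_i = \alpha + \beta' \ln x_i + \gamma E_i + \varepsilon_i
% \gamma is then read as the "effect" of extension. The slide's objection:
% this embeds strong, rarely discussed assumptions -- farms treated as
% isolated units (no knowledge spillovers between i and j), a short-term
% horizon, and no back-office (knowledge production) costs.
```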

Better understanding the trade-off between “understanding” and “measuring” (2/2)
- Case study 2: biodiversity
- Within-field management (undrilled patches) has a positive effect on birds (skylarks)
- An analysis of chicks’ diet shows the importance of invertebrates (measuring, by correlation)
- These invertebrates thrive in undrilled patches covered by weeds (understanding; both links are sketched below)
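The skylark case chains two measured links rather than one black-box correlation; a minimal sketch with synthetic data (variable names are illustrative, not the study’s):

```python
import numpy as np

rng = np.random.default_rng(1)
n = 200

# Synthetic field data along the postulated chain:
# undrilled patch -> invertebrate abundance -> invertebrates in chick diet.
patch = rng.integers(0, 2, size=n)                    # undrilled patch present (0/1)
invertebrates = 5 + 3.0 * patch + rng.normal(size=n)  # invertebrates thrive in patches
diet = 1 + 0.8 * invertebrates + rng.normal(size=n)   # invertebrate share of chick diet

def slope(x, y):
    """OLS slope of y on x (with intercept)."""
    X = np.column_stack([np.ones(len(x)), x])
    return np.linalg.lstsq(X, y, rcond=None)[0][1]

# "Measuring": the raw patch -> diet association collapses the whole chain.
print("total effect (patch -> diet):", round(slope(patch, diet), 2))
# "Understanding": each link of the chain is estimated separately.
print("patch -> invertebrates:     ", round(slope(patch, invertebrates), 2))
print("invertebrates -> diet:      ", round(slope(invertebrates, diet), 2))
```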

Limits in the use of scientific knowledge in evaluations aimed at “learning”
- Case study 1: extension and soft system methodology
- Aim: providing communication techniques (soft systems) to support collective and consensual decisions (Checkland 1981)
- Scientific knowledge, and the production of evidence, is not pictured as central to these methods, BUT:
  - Scientific knowledge is used by stakeholders
  - Controlling the empirical validity of the scientific content of the debates is a key problem for these evaluation methodologies (Salner 2000)
  - These methods can be distorted and can lead to strong asymmetries between stakeholders when there are conflicts between them (Jackson 1991)

Pluralité des connaissances scientifiques et intervention publique: agriculture, environnement, et développement durable. Discussion. Paris Seminar, June 5th, 2009.

A new perspective on methods for evaluating public policies
- Discussing the trade-off between methods:
  - For evaluating effects, a need for evidence of effectiveness
  - For understanding cause-and-effect relations, a need for evidence of causality
  - A major trade-off (formalised in the sketch below):
    - For evidence of effectiveness → randomization (RCT)
    - For evidence of causality → a need for ad-hoc experimental settings
  - Potential failures: transposition failure (even though evidence of effectiveness exists), etc.
- How to overcome this difficulty?
  - Possible failures of participative approaches?
  - A need for collective organizations and new theories of the contextualisation and synthesis of scientific knowledge, fitting the requirements of decision-making in practice (Cartwright 2007)
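The trade-off and the transposition failure can be stated in potential-outcomes notation (ours, not the slides’):

```latex
% Evidence of effectiveness: randomization makes assignment T independent
% of the potential outcomes (Y(0), Y(1)), so the RCT contrast identifies
% the average effect without any model of the mechanism:
\mathbb{E}[Y(1) - Y(0)] = \mathbb{E}[Y \mid T=1] - \mathbb{E}[Y \mid T=0]
% Transposition failure: this quantity is identified for the trial
% population A only; nothing in the design guarantees
\mathbb{E}_A[Y(1) - Y(0)] = \mathbb{E}_B[Y(1) - Y(0)]
% for another context B -- hence the call for evidence of causality to
% ground extrapolation (Cartwright 2007).
```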