Empirical validity of the evaluation of public policies: models of evaluation and quality of evidence


Empirical validity of the evaluation of public policies: models of evaluation and quality of evidence.
Marielle BERRIET-SOLLIEC 1, Pierre LABARTHE 2*, Catherine LAURENT 2, and Jacques BAUDRY 3
1 AGROSUP Dijon, UMR CESAER (Dijon, France); 2 INRA, UMR SAD-APT (Paris, France); 3 INRA, UMR SAD-Paysages (Rennes, France)
* Corresponding author
122nd European Association of Agricultural Economists Seminar, "Evidence-Based Agricultural and Rural Policy Making: Methodological and Empirical Challenges of Policy Evaluation", February 17th-18th, 2011, Ancona (Italy)
Associazione Alessandro Bartola, studi e ricerche di economia e di politica agraria; Centro Studi sulle Politiche Economiche, Rurali e Ambientali, Università Politecnica delle Marche

Outline of the presentation
- Research question. How to cope with the "Tower of Babel" of methods for public policy evaluation?
  - What is the level of empirical validity of the results of an evaluation, relative to its goals?
  - Is it possible to combine these results? How?
- Methodology.
  - A simplified typology of evaluation methods.
  - A conceptual framework: types and levels of evidence.
- Results. The empirical validity issues of evaluation methods depend on the goal of the evaluation: to learn, to measure, to understand.
- Discussion. Is it possible to combine various goals of evaluation?

Empirical validity of the evaluation of public policies: models of evaluation and quality of evidence.
Analytical framework: types and levels of evidence and the goal of evaluation
Marielle Berriet-Solliec, Catherine Laurent & Jacques Baudry

Diversity of evaluation approaches: a typology of evaluation models
Three evaluation models, distinguished by the goal of the evaluation:
1. [Goal 1: To learn] The evaluation is primarily designed as a collective learning process.
2. [Goal 2: To measure] The evaluation is designed to assess the impact of a public programme.
3. [Goal 3: To understand] The evaluation identifies and analyses the mechanisms by which the programme under evaluation does, or does not, produce the expected outcomes.

Types of evidence
- Evidence of existence: demonstration of the existence of a fact.
  Example: biological inventory lists for biodiversity.
- Evidence of causality: demonstration of a causal relation between two variables.
  Example: the relation between a landscape mosaic of agricultural fields and the population level of an insect species.
- Evidence of effectiveness: demonstration of the specific impact of a public action on its goal.
  Example: the impact of an agri-environmental scheme (grassy strips) on a biodiversity indicator.
- Evidence of harmlessness: demonstration of the absence of adverse effects of a public action.
  Example: the absence of negative effects of an agri-environmental scheme on the survival of certain categories of farms.

Levels of evidence
- The possibility of using the results of an evaluation depends on the empirical validity of those findings.
- There is a hierarchy in the level of empirical validity of the evidence produced by a method of evaluation (encoded in the sketch below):
  4. Evidence obtained from at least one properly randomized controlled trial.
  3. Evidence from well-designed controlled trials without randomization.
  2. Evidence from historical comparisons, or from cohort or case-control analytic studies.
  1. Opinion of respected authorities, based on clinical experience, descriptive studies, or reports of expert committees.
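
To make the grading concrete, here is a minimal sketch, assuming a Python setting, that encodes the slide's four-level scale as an ordered type so that findings produced by different evaluation methods can be ranked; the finding labels and their assigned levels are hypothetical, not from the paper.

```python
# Minimal sketch (hypothetical data): the four-level hierarchy of
# empirical validity as an ordered scale for comparing findings.
from enum import IntEnum

class EvidenceLevel(IntEnum):
    EXPERT_OPINION = 1        # opinion of respected authorities
    HISTORICAL_COMPARISON = 2  # historical, cohort or case-control studies
    CONTROLLED_TRIAL = 3       # well-designed trial without randomization
    RANDOMIZED_TRIAL = 4       # at least one properly randomized trial

# Illustrative findings, each tagged with the level of the method
# that produced it (assignments are hypothetical).
findings = [
    ("stakeholder workshop consensus", EvidenceLevel.EXPERT_OPINION),
    ("before/after comparison with a control group", EvidenceLevel.CONTROLLED_TRIAL),
    ("pilot RCT of the scheme", EvidenceLevel.RANDOMIZED_TRIAL),
]

# Rank findings from highest to lowest empirical validity.
for name, level in sorted(findings, key=lambda f: f[1], reverse=True):
    print(f"level {int(level)}: {name}")
```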

Empirical validity of the evaluation of public policies: models of evaluation and quality of evidence.
Results
Marielle Berriet-Solliec, Catherine Laurent & Jacques Baudry

[To learn] (1/2): general principles and example
- Goal: to promote learning through close collaboration between stakeholders, in order to build awareness and consensus and to encourage new practices.
- Method:
  - a sociogram of the network of stakeholders involved in the programme → the nature and intensity of ties between stakeholders (see the sketch below);
  - in-depth individual interviews with stakeholders → to gather each person's point of view and suggestions for improving the programme;
  - working groups and collective discussions of the results → to reach consensus.
- Example: the Soft Systems Methodology (SSM, Checkland and Scholes 1990) applied to public training programmes for farm advisors (Navarro et al. 2008).
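
The sociogram step can be illustrated with a short network-analysis sketch. The stakeholder names, tie attributes, and intensities below are hypothetical, and degree centrality is just one possible indicator of an actor's position in the network.

```python
# Minimal sketch (hypothetical data): a sociogram recording the nature
# and intensity of ties between stakeholders of a programme.
import networkx as nx

G = nx.Graph()
ties = [
    ("farm advisors", "farmers", {"nature": "training", "intensity": 5}),
    ("ministry", "farm advisors", {"nature": "funding", "intensity": 3}),
    ("researchers", "farm advisors", {"nature": "knowledge", "intensity": 2}),
    ("ministry", "researchers", {"nature": "contracting", "intensity": 1}),
]
for a, b, attrs in ties:
    G.add_edge(a, b, **attrs)

# Degree centrality as a simple indicator of who is central to the network.
for actor, score in sorted(nx.degree_centrality(G).items(), key=lambda x: -x[1]):
    print(f"{actor}: {score:.2f}")
```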

[To learn] (2/2): the empirical evidence issue
- The issue of the level of evidence is often neglected:
  - it is treated as secondary to the learning objectives, but it raises problems (van der Sluijs et al. 2008, Salner 2000);
  - there is no direct testing of the reliability of the evidence brought in by the various stakeholders.
- The issue of competition between pieces of evidence:
  - very often pointed out;
  - arbitration is often based on non-transparent criteria;
  - risk of misuse of evidence in cases of conflicts of interest between stakeholders (Jackson 1991).
→ A risk of focusing on consensual solutions rather than on the most effective ones?

[To measure] (1/2): general principles and example
- Goal: to measure the effectiveness of a given public programme, i.e. to identify its specific impact on a proxy representing the goal of this programme. A major issue: tackling the counterfactual problem.
- Method: econometrics.
  - Gold-standard method: Randomized Controlled Trials (RCTs), with two core hypotheses that raise technical and ethical problems: randomization, and the SUTVA hypothesis.
  - Second-best methods: double difference, matching, etc. (see the sketch below).
- Example: the measurement of the impact of public farm advisory programmes on the knowledge of farmers (Godtland et al. 2004) or on the performance of their farms (Davis et al. 2004).
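
The double-difference logic can be shown in a few lines: subtracting the change observed in a non-participant group removes a common time trend from the change observed among participants, leaving an estimate of the specific impact. This is a minimal sketch on synthetic data; the effect size, trend, and noise levels are assumptions.

```python
# Minimal sketch (synthetic data) of the double-difference estimator:
# programme effect = (change among participants) - (change among others).
import numpy as np

rng = np.random.default_rng(0)
n = 500

treated = rng.random(n) < 0.5            # participation in the programme
baseline = rng.normal(50.0, 10.0, n)     # outcome before the programme
trend = 2.0                              # common time trend (assumed)
effect = 5.0                             # true programme impact (assumed)
followup = baseline + trend + effect * treated + rng.normal(0.0, 5.0, n)

did = ((followup[treated] - baseline[treated]).mean()
       - (followup[~treated] - baseline[~treated]).mean())
print(f"difference-in-differences estimate: {did:.2f}")  # close to 5.0
```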

[To measure] (2/2): the empirical evidence issue
- Obtaining evidence of a high level of empirical validity is a challenging issue:
  - costly practices;
  - strong methodological specifications, with technical and ethical problems.
- Some limitations in the scope of the results:
  - the measurement captures only the specific impact in a given context;
  - it does not indicate precisely the mechanisms that rendered a public action effective;
  - it does not prove causality.
→ Some limitations in the use of the results? They cannot be used to extend a programme to other contexts or periods, and cannot help in understanding why a programme fails.

[To understand] (1/2): general principles and example
- Goal: to understand the mechanisms operating in the programme evaluated; to reveal in a reliable way the causal relations that explain why a programme works or not in a given context.
- Method:
  - Realistic evaluation (Pawson and Tilley 1997): focuses on (i) the object evaluated, (ii) the mechanisms of public action, (iii) the context.
  - Program theory (Chen 1990): putting forward hypotheses on the causality patterns (diagram).
- Example: an economic incentive A must cause a change of agricultural practice B, which in turn has an ecological impact C (a sketch of checking such a chain follows).
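
The link-by-link examination that program theory calls for can be sketched as follows. The data are synthetic, and simple correlations stand in for the stronger identification strategies a real evaluation would need; the point is that each hypothesized link (A → B, B → C) is checked separately, rather than only the overall A → C impact.

```python
# Minimal sketch (synthetic data) of checking a program-theory chain
# A -> B -> C link by link: incentive A should change practice B, and
# practice B should drive ecological outcome C. Names are illustrative.
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
n = 300

incentive = rng.random(n) < 0.5                            # A: incentive received
practice = 0.2 + 0.5 * incentive + rng.normal(0, 0.1, n)   # B: uptake of practice
outcome = 10.0 + 8.0 * practice + rng.normal(0, 1.0, n)    # C: biodiversity index

# Correlations are only a first check of each hypothesized link;
# they support, but do not by themselves prove, the causal diagram.
for name, x, y in [("A -> B", incentive.astype(float), practice),
                   ("B -> C", practice, outcome)]:
    r, p = stats.pearsonr(x, y)
    print(f"{name}: correlation r = {r:.2f} (p = {p:.1e})")
```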

[To understand] (2/2): the empirical evidence issue
- Two ways of using evidence of causality:
  - to produce such evidence;
  - to use theory in order to support the construction of the diagram of causality of the public action.
- Which limitations, and which empirical validity, for evidence of causality?
  - theoretical models are always partial representations of complex phenomena;
  - there are different possible theoretical models of the mechanisms by which a public action operates;
  - the observation of real effects (evidence of effectiveness) cannot be replaced by expected effects (estimated from the means actually employed in the public programme).
→ Is it possible to combine different theoretical frameworks?

Empirical validity of the evaluation of public policies: models of evaluation and quality of evidence.
Discussion
Marielle Berriet-Solliec, Catherine Laurent & Jacques Baudry

Discussion
- Different goals for evaluation: to learn / to measure / to understand.
- Different requirements for the use of evidence:
  - to measure → evidence of effectiveness;
  - to understand → evidence of causality.
- Building a rigorous framework about evidence makes it possible:
  - to avoid misuses of evidence and of the results of the evaluation;
  - to formalize the trade-off between the goals of evaluation and to support the choice of the method that best fits them;
  - to open a discussion about the possibility of combining different goals or methods of evaluation.

Thank you for your attention
Marielle BERRIET-SOLLIEC 1, Pierre LABARTHE 2*, Catherine LAURENT 2, and Jacques BAUDRY 3
1 AGROSUP Dijon, UMR CESAER (Dijon, France); 2 INRA, UMR SAD-APT (Paris, France); 3 INRA, UMR SAD-Paysages (Rennes, France)
* Corresponding author
122nd European Association of Agricultural Economists Seminar, "Evidence-Based Agricultural and Rural Policy Making: Methodological and Empirical Challenges of Policy Evaluation", February 17th-18th, 2011, Ancona (Italy)
Associazione Alessandro Bartola, studi e ricerche di economia e di politica agraria; Centro Studi sulle Politiche Economiche, Rurali e Ambientali, Università Politecnica delle Marche