Independent evaluation: the experience of undertaking policy evaluations commissioned by the Department of Health in England

Nicholas Mays, Director, Policy Innovation Research Unit, Department of Health Services Research & Policy

Workshop on 'New approaches to health policy evaluation: what role for independence?', Congress of the Swiss Evaluation Society (SEVAL), 4 September 2015

Improving health worldwide

What does 'independence' mean in research and evaluation?
– It means different things to different participants (academics, think tanks, commercial agencies, not-for-profits)
– It can be defined in terms of structure; source of finance (e.g. endowment, government, charitable, commercial); ideology; ethos; or the ability to choose what to study and how to approach it (intellectual freedom)
– How independence is 'framed' can affect the degree of influence an evaluation has
– I see it as always a relative concept, lying along a series of continua – there is always a (different) set of constraints (e.g. performance metrics in universities, including pressure to raise research funding) or incentives/disincentives and rewards

DH Policy Research Programme
– A national programme of independent research, running since the 1970s, that provides evidence DH officials can use to advise their Ministers
– Based on a simple 'customer–contractor' model; mostly uses university researchers
– Three main modes of (competitive) funding:
  – Policy Research Units, usually funded for 5 years, potentially renewable
  – Initiatives, consisting of groups of linked, commissioned projects
  – Single commissioned projects

How DH typically commissions evaluations
1. Detailed invitation to tender inviting proposals from external research teams
2. One- or two-stage proposal process
3. External peer reviews and comments from DH policy 'customers'
4. Response to reviewers' reports
5. Assessment of proposals by a commissioning panel of officials and academics, usually independently chaired
6. Recommendation of the successful team to the minister (and/or negotiation of refinements to the original proposal)
7. Contract, including rights to publish after a 28-day notice/comment period, with a maximum 3-month delay – 'Publication of scientifically robust research results is encouraged'

A …-month process if two stages

Assessment criteria
– RELEVANCE of the proposed research to the research brief
– QUALITY of the research design
– QUALITY of the work plan and proposed management arrangements
– STRENGTH of the research team
– IMPACT of the proposed work
– VALUE for money (justification of the proposed costs)
– INVOLVEMENT of patients and the public

Assumptions underpinning the DH approach
– External, independent evaluation is superior to the alternative
– Objectivity is a realistic and important goal
– Conflicts of interest should be minimised by ensuring that evaluators are not involved in what they are studying
  – No action research or 'engaged' forms of evaluation
– Emphasis on (mainly positivist) 'science'
– Centrality of peer review at all stages
– Evaluation can remain relevant, timely and applicable while showing the above features – no tension between 'independence' and applicability
– DH and Ministers are the dominant 'customers'

Experience of the DH approach from the point of view of different participants

Advantages
– Evaluation is not subject to day-to-day interference from government – there are formal rules of engagement
– Evaluators can report what they find
– Evaluators can focus on research skills rather than being expected to act as implementers, management consultants, etc.

Disadvantages
– DH can comment in ways that may put researchers under pressure to self-censor – researchers have to strike their own balance between being 'helpful' and being 'unresponsive'
– Requires expert intermediaries
– Works best with a stable 'customer' group

Experience of the DH approach from the point of view of different participants (continued)

Advantages
– Value for public money, in that findings are published
– A degree of 'independence' maintains the value of using external evaluators – it is possible to provide some challenge function

Disadvantages
– Can appear slow and cumbersome to busy policy officials
  – Researchers can be seen as 'remote'
– Tensions remain when findings do not correspond to policy makers' expectations
– Accusations of 'policy-based evidence', since researchers do not choose the topics

Experience of the DH approach from the point of view of different participants (continued)

Advantages
– Government can easily distance itself from unpalatable findings

Disadvantages
– Can be used to maintain a façade of evidence-based policy
  – 'Independence' is no guarantee of greater use of findings
  – Can undermine engagement with policy-relevant issues
– Ignores 60 years' experience of more 'engaged' forms of evaluation
– Local implementers may not see the value of 'national' evaluations, and may not cooperate

Different stakeholder perspectives on pilots and evaluations
– Other government departments
– HM Treasury
– Local stakeholders, e.g. politicians, frontline staff, providers, users/patients
– Academic community, e.g. peer reviewers, commentators