Using knowledge utilisation theory to demonstrate how commissioned evaluations can influence program design and funding decisions: Case studies from consultancy


Using knowledge utilisation theory to demonstrate how commissioned evaluations can influence program design and funding decisions: Case studies from consultancy
Wendy Hodge, Principal Consultant

This paper
1. Knowledge utilisation
2. The case studies

A very brief potted history of knowledge utilisation
"The results of research are worthless if they are not used" (Last 1989)

Focus on evidence-based policy
Assumes that using knowledge will lead to better policy and programs.
Initiatives:
- Topic-specific centres of excellence with structural links to government
- Systematic reviews of evidence, e.g. the Cochrane Collaboration
- Clearing houses, e.g. evidence and practice guidelines and summaries

Operating assumptions
- Knowledge is transferred from individual to individual and through organisational structures
- All knowledge is taken up subjectively
- Not just users and researchers who influence use of knowledge
- Knowledge is refined and adapted by the user
- Use and generation of knowledge are interdependent and complicated to improve
- Not all knowledge is intended to be directly applicable to policy development or program design

Ways evidence is used

Instrumental or direct use
When findings/data are used in specific or direct ways, e.g. to directly influence program design or delivery, or to inform policy directions or professional practice.

Conceptual use
Involves using research evidence for general enlightenment. Users are exposed to new information and ideas but may not use the information directly.

Symbolic or strategic use
Using findings/data to legitimise policy directions or to justify actions taken for other reasons.

Predictors of use
"Data are no use if the report on them is too late. They are precious little good if the relevant audience does not comprehend them" (Cronbach, 1977)

Predictors of use (dissemination model)
- Decision-makers know about the research
- Interdependence of policy makers and evaluators, e.g. organisational links exist or joint planning
- Good personal relations between key players
- The right people: a credible source
- The right evidence at the right time
- The inherent quality of the evidence
- Whether the evidence conforms to commissioners' beliefs and previous knowledge
- Whether data are interpreted in a way that suits the needs of the user
- Tells the story: clear, succinct formats; understood; user friendly
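To make the checklist concrete, here is a minimal sketch (not part of the original presentation) of how the dissemination-model predictors could be encoded as a simple scoring rubric in Python. The predictor names follow the list above; the 0-2 scale, the equal weighting and the example ratings are illustrative assumptions, not ARTD's method.

```python
# Hypothetical sketch only: the deck applies these predictors as a qualitative
# checklist, not as code. This encodes the checklist as a rough scoring rubric;
# the 0-2 scale and the equal weighting are assumptions.

PREDICTORS = [
    "decision makers know about the research",
    "organisational links or joint planning",
    "good personal relations between key players",
    "credible source",
    "right evidence at the right time",
    "inherent quality of evidence",
    "conforms to commissioners' beliefs and previous knowledge",
    "interpretation suits the needs of the user",
    "clear, succinct, user-friendly reporting",
]

def use_score(ratings):
    """Average 0-2 ratings (0 = absent, 1 = partial, 2 = strong).

    Unrated predictors default to 0, so the score is conservative.
    """
    return sum(ratings.get(p, 0) for p in PREDICTORS) / len(PREDICTORS)

# Informal ratings for Case 1 (the drink driver education evaluation),
# drawn loosely from the case table later in this deck.
case_1 = {
    "decision makers know about the research": 2,      # report tabled in parliament
    "organisational links or joint planning": 2,       # interagency steering committee
    "good personal relations between key players": 1,  # none at start, built over two years
    "credible source": 2,                              # evaluators + academic advisor
    "right evidence at the right time": 2,             # timed to the budget cycle
    "inherent quality of evidence": 2,                 # quasi-experimental + mixed methods
    "clear, succinct, user-friendly reporting": 2,     # structured around evaluation questions
}

print(f"Case 1 use score: {use_score(case_1):.2f} out of 2")
```

In practice the predictors interact and are weighted by context, so a single number is at best a conversation starter rather than a prediction.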

"Context matters – values matter – politics matter." (Brewer, 1983)
"The interplay between science and policy is commonly neither purely instrumental nor purely political." (Hertin et al.)

Organisational and political predictors of use
- Structure, culture and politics of the user organisation, including assumptions about a program or policy's worth and service models
- Rewards and incentives for dissemination activity in both the "user" and "researcher" context
- Value placed on evaluation or research evidence in the user context
- Other inputs to policy development or program design: lobbying, negotiations
- Boundaries of policy assessment analysis

ARTD cases considered
1. Evaluation of a drink driver education program
2. Evaluation of a carers program
3. Evaluation of a drug education program

Case 1 – Evaluation of a drink driver education program
- Findings known to decision makers: high-level interagency senior officer committee + report tabled in parliament
- Organisational links: contract + interagency steering committee for the project involved in planning discussion of findings
- Good relations: none at the start, but built over two years
- Credible source: us + academic advisor + guru
- The right evidence at the right time: quasi-experimental design + mixed methods; timed to meet the budget cycle
- User-friendly report: evidence synthesised and report structured around evaluation questions
- Assumptions of program worth: program valued; design based on best evidence
- Value placed on evaluation: highly valued; direct client research background

Case 2 – Evaluation of a carers program
- Findings known to decision makers: responsible officers commissioned the evaluation; able to drive changes to delivery models
- Organisational links: contract provided formal structure
- Good relations: fostered by regular informal reporting of progress and findings
- Credible source: sought evaluation specialists; previous knowledge of the area and experience in conducting large reviews
- The right evidence at the right time: extensive consultation with carers and service delivery organisations; findings delivered in time to inform renewal of three-year contracts
- User-friendly report: executive summary identified deficiencies of the service model and suggested changes; report told the story of carers and what respite was needed
- Assumptions of program worth: high, given the vulnerable nature of the carers; fits with national priorities
- Value placed on evaluation: moderate to high; previous bad experience

Case 3 – Evaluation of an action enquiry as professional development
- Findings known to decision makers: responsible officers commissioned the evaluation; able to drive changes to program structure
- Organisational links: contract provided formal structure and Premier's Panel
- Good relations: fostered by regular informal reporting of progress and findings
- Credible source: long-standing clients
- The right evidence at the right time: qualitative methods; findings reported verbally initially to inform stage 2 planning
- User-friendly report: answered the evaluation questions; placed in the context of adult learning principles
- Assumptions of program worth: new approach, being tested
- Value placed on evaluation: high; independence valued

In summary
- Evaluators generate knowledge
- Our clients, policy officers and program designers are users and disseminators of knowledge in their own sphere
- As evaluators, we need to pay attention to the predictors of use under our control
- Policy officers transform evidence to meet their needs
- Policy officers could also actively pay attention to predictors of use within the agency context

Wendy Hodge, Principal Consultant
Contact details