Scoring Goals or Changing the Game: What Impacts Should We Measure? Jonathan Lomas, Canadian Health Services Research Foundation. Presentation to ESRC Symposium: New Approaches to Assessing the Non-Academic Impact of Social Science, London, May 12-13, 2005

My question is, are we having an impact?

Is Research Ready for Action? Medline search (…) to identify articles stating:
a. "need more research" or "need less research": need more 161/162; need less 1/162
b. "more questions than answers" or "more answers than questions": more questions 163/166; more answers 3/166
David, AS. BMJ 2002; 323:1462-3

Assessing the Impact of What? The Research Produced?
- A single research study published in a journal
- A summary of some research studies written in plain language and posted on a web site
- A systematic review with targeted dissemination of key messages to potential users
- A body of research knowledge developed and discussed face-to-face with potential users

Assessing the Impact of What? (cont'd) The Research Production Process?
- All the activities of a research commissioning or granting agency (including training)
- All the activities of a research production facility (e.g. institute, department, university)
- The activities of a potential research user organization and its staff
- The entire research regime in a country

Where I Work
- CHSRF's mission: to support evidence-based decision-making in the healthcare system
- Our ultimate desired impact is cultural change in the research and healthcare systems
- Changing the game, not just scoring goals

The Discipline of Objectives & Logic Models
Yogi Berra on objectives: "If you don't know where you're going, you might not get there."
Lewis Carroll on logic models: "If you don't know where you're going, any road will take you there."

Decision Maker Diversity
- Policy makers
- Organized interests (e.g. drug companies, professional associations)
- Service professionals
- Managers
- Client & public

Why Social Science isn't IBM or General Electric
[Diagram: problems flow from diverse decision makers (policy makers; organized interests, e.g. drug companies and professional associations; service professionals; managers; patients & public) to researchers (university-based, stakeholder-based, system-based) and management consultants, who return solutions.]

Evidence-Based Decision-Making
[Diagram linking decision makers and researchers through linkage and exchange. Elements: research funders; knowledge purveyors; receptor capacity; critical evaluation; funding and training vehicles; priority-setting structures; issues & priorities → priority topics; ideas → research → evidence; synthesis and influence; problems and solutions; other influences on decisions (personal experience, anecdote, wants, interests, myths, assumptions, etc.).]

CHSRF's Objectives
1. To increase health system decision-makers' appreciation of the value of research
2. To increase the production of research relevant to the needs of health system decision-makers
3. To increase the availability and acquisition of needed research by health system decision-makers
4. To increase the appraisal and application of needed research by health system decision-makers

Increased Appreciation of the Value of Research
Programs and activities:
- overall linkage & exchange approach
- consultations with users for priorities
- case study presentations of value
Measurable outcomes:
- # of decision-makers participating in research
- expenditures on research commissioning by the system
- % of employees in the system with research training

Increased Production of Relevant Research
Programs and activities:
- priority theme-based program funding
- commissioned syntheses on current issues
- applied training programs
- encouraging university incentives for applied research
Measurable outcomes:
- amount of research in priority theme areas
- self-reported awareness/use of research syntheses
- # of graduates with applied research skills

Increased Availability/Acquisition of Research
Programs and activities:
- plain-language research summaries (1:3:25, Mythbusters)
- face-to-face exchanges on timely topics
- creation/support of knowledge networks
- creation/support for knowledge brokering
Measurable outcomes:
- self-reported awareness of disseminated research
- self-reported follow-up contact with researchers
- self-reported use of web-based and other resources for research evidence acquisition (audit computer bookmarks)

Increased Appraisal/Application of Research
Programs and activities:
- training users in research appraisal and application
- organisational best practices in research use
Measurable outcomes:
- self-reported application of research
- changes in organisational structures and processes to better accommodate research
- increased sense of decision-certainty where synthesised research is available

The Attribution Challenge What, if any, is/was the role of research versus all the other influences on behaviour?

Adapted from Philip Davies, 2005
[Matrix mapping types of research evidence (program or intervention effectiveness; implementation; organizational; economic/financial; ethics; forecast; attitudinal) to research methods (experimental, quasi-experimental, counterfactual; surveys, admin data, comparative, qualitative; cost-benefit, cost-effectiveness, cost-utility, econometrics; theories of change, public consultation; distributional data, multivariate regression).]


Adapted from Philip Davies, 2005
Combining Research and Colloquial Evidence for Information
- Research evidence
- Professional experience & expertise
- Political judgement
- Resources
- Values
- Habits & tradition
- Lobbyists & pressure groups
- Pragmatics & contingencies

A Final Zany Idea
The "impact file": a deductive approach to assessing impact

What's In the Impact File?
- Testimonials
- Altered career trajectory (researcher or decision-maker)
- Changes in similar organizations' programs & processes (lateral impact)
- Better communication of research and its implications (dissemination)
- Awareness of research by decision-makers (acquisition)
- Changed decisions based on research (application)
- Changes in researchers' or decision-makers' structures and processes (cultural change)

Maybe Settling for Second Best is OK! Using an impact file is:
- Motley, ad hoc, and neither comprehensive nor systematic
- Biased to individual rather than organizational responses
- Qualitative and potentially non-generalizable
But, as Churchill said of democracy: "Many forms of government have been tried … No one pretends that democracy is perfect … Indeed, it has been said that democracy is the worst form of government, except all those other forms that have been tried from time to time." (House of Commons, 1947)

THANK YOU!