What evidence can help practice decisions about what works and what doesn’t? Elizabeth Waters, Chair in Public Health, School of Health and Social Development, Deakin University


What evidence can help practice decisions about what works and what doesn’t?
Elizabeth Waters
Chair in Public Health, School of Health and Social Development, Deakin University
Director, Cochrane Collaboration Health Promotion and Public Health Field
NHMRC Health Advisory Committee

Background: Evidence in practice
Research and evaluation have long played a part in health and social policies:
> sometimes more than practitioners would like
> sometimes less than researchers would like
‘Research informed’ practice has always been controversial; early proponents in the 19th century pointed to the gap between the intention to do good and actually having a beneficial effect.
Evidence-based practice has increased the pressure on practitioners, and the increasing amount and varying quality of evidence makes its integration into practice difficult.

Why careful evaluation is crucial – a hypothetical
Evaluation of a ‘discussion-based groups for parenting’ program:
> designed well and carefully
> process and impact evaluation based on attendance, views of workers, views of attendees, and observations of apparent child wellbeing
> work presented well – picked up by the media
Before asserting that “discussion groups work” we need to be certain that:
1. improvements have taken place, and
2. they were brought about as a result of the discussion group.

Difficult to be certain unless…
…mechanisms are in place for ruling out competing explanations:
> Parents might have improved with time and increased confidence (there is evidence that psychological problems improve spontaneously in about two-thirds of cases)
> Children may have become more manageable simply by spending time with playgroup workers
> Other external factors, e.g. income support
> Parents may have learned the ‘right’ things to say – becoming familiar with the expectations of workers
> Participating parents may have been highly motivated – or those who dropped out might have done just as well
The problem is that we generally don’t know. Furthermore, we need to know what other studies have found and how robust they are. It could be chance that it worked…

Maybe…
> other studies show it doesn’t work
> it only works in some settings
> it only works for some types of parents

What role high quality evaluations will play
“It is important that we know what we know, but also that we know what we do not know.” – Lao-Tze, Chinese philosopher
> Building the evidence base in and for health promotion is essential
> Qualitative and quantitative information is crucial – what works, for whom, and why
> Most research and evaluations can only be understood in context – and a key part of that context consists of the results of other studies that tested the same hypothesis in similar populations.

Caution on single studies
The public, the media and even professional colleagues place great faith in single studies. However, graphing the frequency distribution of effect sizes across studies will show that:
> some have unusually negative effects
> some show unusually positive effects
simply by chance – analogous to single surveys of anything (a minimal simulation illustrating this is sketched below).
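The point that single studies can look strongly positive or strongly negative purely by chance can be illustrated with a small simulation. This is a minimal sketch, not part of the presentation: the true effect size, sample size per arm, and outcome variability are illustrative assumptions, not figures from any real evaluation.

```python
# Minimal sketch (illustrative assumptions only): simulate many small studies of
# the same intervention to show how single-study effect estimates scatter by chance.
import random
import statistics

random.seed(1)

TRUE_EFFECT = 0.20   # assumed true mean improvement from the program
SD = 1.0             # assumed standard deviation of the outcome
N_PER_ARM = 30       # assumed participants per arm in each small study
N_STUDIES = 200      # number of hypothetical replications

estimates = []
for _ in range(N_STUDIES):
    control = [random.gauss(0.0, SD) for _ in range(N_PER_ARM)]
    treated = [random.gauss(TRUE_EFFECT, SD) for _ in range(N_PER_ARM)]
    estimates.append(statistics.mean(treated) - statistics.mean(control))

print(f"true effect:                    {TRUE_EFFECT:.2f}")
print(f"mean of study estimates:        {statistics.mean(estimates):.2f}")
print(f"smallest single-study estimate: {min(estimates):.2f}")
print(f"largest single-study estimate:  {max(estimates):.2f}")
# Although every study samples the same underlying effect, some individual studies
# look clearly negative and others strongly positive - purely by chance.
```

Averaged over many studies the estimate sits near the true effect, but any single study can land far from it, which is why a synthesis of studies is more trustworthy than any one of them.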

What role can systematic reviews play?
They are a key source of evidence-based information to:
> support and develop practice
> support professional development, by helping to identify new and emerging developments and gaps in knowledge
Systematic reviews provide a synthesis of robust studies which no policy maker or practitioner, however diligent, could possibly hope to read for themselves.

Systematic reviews…
Are not:
> reviews of only the studies you happened to find
> reviews that support the policy or intervention you want to introduce
> reviews that leave out findings you don’t like or that are inconclusive
Can:
> tell you the state of knowledge in an area
> identify any inconsistencies within it
> clarify what remains to be known

How will evaluations be used in systematic reviews?
Systematic reviews seek to extract information systematically from studies which meet inclusion criteria. New guidelines on Cochrane HP&PH systematic reviews recommend:
> use of an advisory group for an international audience
> policy-relevant interventions
> a broad scope of study designs
> reporting the theoretical framework employed
> extraction and integration of process, impact and outcome data
> cost-related evaluation – cost consequence, cost effectiveness
> equity and sociocultural considerations – relative effectiveness or impact
> integrity of the intervention
> public health ethics
> recommendations for research and practice – content and methods

Evaluation and advocacy
> A significant proportion of health promotion is unevaluated
> A recent Cochrane review on increasing participation in sport found NO evaluations… in the context of large amounts of funding
> Evaluation is not an evaluation of those involved, but an essential step in knowing whether or not we are making a difference in a positive direction
> The content of systematic reviews depends on primary research and evaluation

Conclusion
> Policy and program decisions depend on many things – but evaluations that can say what works, what doesn’t, for whom and at what cost have an important role
> Ensuring we understand the relative impact of outcomes across and between population groups is essential
> Improving benefits and avoiding harm is core – intuition and good intentions are not enough
> Systematic reviews will provide a supportive partner to advocate for high quality evaluations.