By: Edith Leticia Cerda
Applying the Evidence Where to Now?

The Story So Far…. Back to the FOLIOz EBLIP-Gloss Course: So far we have ASKed the question “What evidence is there that Information Skills Clinics are providing the same level of tuition or better than the traditional one-to-one format?” We have ACQUIREd some evidence looking at different group sizes and their subsequent use of skills. We have APPRAISEd the evidence using the RELIANT Checklist.

How do we APPLY the evidence? Ideally we would like Evidence that is Directly Applicable. However, more commonly we will encounter Evidence that needs to be Locally Validated, perhaps through a survey or audit of local services. In our general reading we will encounter Evidence that Improves Understanding (Koufogiannakis and Crumley, 2004). A final category of useful material is Evidence that may inform our Choice of Methodologies, Tools or Instruments (Booth, 2004).

As Koufogiannakis and Crumley (2004) state: "When using research to help with a question, look for high quality studies, but do not be too quick to dismiss everything as irrelevant. Try to take what does apply from the research and use it to resolve the problem at hand."

In our case… The study we found (Ayre, 2006) is a good match (Directly Applicable) for some of our target population (part-time NHS employees). However, we need to investigate whether it applies equally to our full-time international students (Locally Validated). Undoubtedly it Improves our Understanding of issues relating to local versus centralised delivery of information skills training. The investigation of "subsequent use" is an interesting Choice of Methodology; however, we would probably choose to investigate additional outcomes (e.g. skills acquisition), which will require identification of an alternative or additional Tool.

When considering Applicability, think SCOPE:
Severity – How urgent/important is the problem?
Clients – Does the planned intervention fit with the values, needs and preferences of my users?
Opportunity – Is now the time to apply this? Has the situation changed since the evidence was produced?
Politics – Is there local support for this intervention?
Economics – Can we afford this intervention? Will this be at the expense of something else?

We now apply the SCOPE approach to our question

Severity
Numbers of Masters students are increasing.
It will become prohibitive in staff time to attempt to train all of them in one-to-one sessions.
However, not all will require one-to-one training.
We will need to continue to offer one-to-one training.
Therefore it is not necessarily a choice of "one NOT the other"; it is more a case of "appropriateness".

Clients
Some students prefer the personalised, topic-specific attention of one-to-one training.
Others prefer the anonymity of small-group training.
We need more local information on cultural issues; we have identified a gap in our knowledge prior to implementation.
Library staff find it more time-efficient to train small groups BUT generally believe one-to-one training to be more "effective" (anecdotal).

Opportunity
The evidence is up to date and reflects current practice.
Now is a good time to pilot this because we have the resources to run group clinics and one-to-one training in parallel.
However, one potential alternative is e-learning, which allows students to learn at their own time and pace (perhaps with tutorial assistance).

Politics
Staff on the Masters (MPH) course welcome the clinics initiative as systematic provision for their students, promoting equity.
The full extent of one-to-one training is almost "invisible" to the organisation at present.
Information Specialists also support research and consultancy; they may become more "available" if they can channel their one-to-one audience towards scheduled clinics.

Economics
Each clinic involves 2 Information Specialists for up to 2 hours (compare 1 Information Specialist for 1 hour for one-to-one sessions).
If equally effective, we need at least 4 participants per clinic to "break even" (4 person-hours), although students will be receiving more intensive input.
One-to-one training takes place in the Library (taking up a library PC), whereas clinics use a computer lab (currently free, but this may deny other students access during clinics).
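The break-even arithmetic above can be sketched in a few lines of Python. This is an illustrative calculation only, using the figures stated on the slide (2 specialists for 2 hours per clinic, 1 person-hour per one-to-one session); the function names are our own.

```python
import math

# Figures from the Economics slide (assumptions, not measured data):
CLINIC_STAFF = 2                  # Information Specialists per clinic
CLINIC_HOURS = 2                  # duration of each clinic in hours
ONE_TO_ONE_HOURS_PER_STUDENT = 1  # staff cost of a one-to-one session

def clinic_person_hours() -> int:
    """Staff time consumed by a single clinic, regardless of attendance."""
    return CLINIC_STAFF * CLINIC_HOURS

def break_even_participants() -> int:
    """Minimum attendees for a clinic to match the one-to-one staff cost."""
    return math.ceil(clinic_person_hours() / ONE_TO_ONE_HOURS_PER_STUDENT)

print(break_even_participants())  # 4
```

With fewer than 4 attendees, one-to-one sessions would cost less staff time; with more, the clinic format pulls ahead, even before counting the more intensive input each student receives.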

Summary
There is not sufficient evidence to favour one intervention over the other at present.
However, we need to know more about student preferences and (especially) relative effectiveness.
SUGGESTED ACTION: Administer pre- and post-tests to students after clinics and one-to-one sessions and compare score increases. Survey students on preferences.
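The suggested pre-/post-test comparison might be analysed along these lines. This is a minimal sketch with entirely hypothetical scores (the slides report no data); in practice a significance test and a larger sample would be needed before drawing conclusions.

```python
from statistics import mean

def mean_gain(pre_scores, post_scores):
    """Average improvement from pre-test to post-test, paired by student."""
    return mean(post - pre for pre, post in zip(pre_scores, post_scores))

# Hypothetical pre/post scores (out of 20) for each training format.
clinic_pre, clinic_post = [10, 12, 9, 11], [14, 15, 13, 15]
one2one_pre, one2one_post = [11, 10, 12], [16, 14, 17]

print(mean_gain(clinic_pre, clinic_post))    # 3.75
print(mean_gain(one2one_pre, one2one_post))
```

Comparing the two mean gains (alongside the survey of preferences) would give the "relative effectiveness" evidence the Summary identifies as missing.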

Resources
View the Applicability Checklist from the Libraries Using Evidence Toolkit (%20Checklist.pdf). This covers: User Group, Timeliness, Cost, Politics and Severity.
Wilson, V. (2010). Applicability: What is it? How do you find it? Evidence Based Library and Information Practice, 5(2). Retrieved September 11, 2010, from http://ejournals.library.ualberta.ca/index.php/EBLIP/article/view/8091

References
Ayre, S. (2006). Workplace-based information skills outreach training to primary care staff. Health Information and Libraries Journal, 23(s1), 50–54.
Booth, A. (2004). What research studies do practitioners actually find useful? Health Information and Libraries Journal, 21(3), 197–200.
Koufogiannakis, D. & Crumley, E. (2004). Applying evidence to your everyday practice. In Booth, A. & Brice, A. (Eds.), Evidence-Based Practice for Information Professionals: A Handbook. London: Facet. Chapter 10.