1
Improving rapid access to reports of RCTs from Embase: innovative methods to enhance the Cochrane Central Register of Controlled Trials (CENTRAL)
Julie Glanville, York Health Economics Consortium, UK
Anna Noel-Storr, Cochrane Dementia and Cognitive Improvement Group
Gordon Dooley, Metaxis, UK
Ruth Foxlee, Cochrane Editorial Unit
June 2014
2
Presentation Overview
Background
Objectives
Methods
Progress so far
Challenges
The future
3
Background
Health technology assessments, ranging from rapid reviews to the most extensive projects, rely on the efficient identification of research evidence, in particular the evidence from randomised controlled trials (RCTs).
The largest single source of RCTs is the Cochrane Central Register of Controlled Trials (CENTRAL), available as part of The Cochrane Library.
The Cochrane Library is a subscription service, which may be made available to users via organisational, regional, national or international funding arrangements.
5
The project and its objectives
The Cochrane Collaboration commissioned the Embase update project in March 2013.
The project is undertaken by a consortium of three organisations:
the Cochrane Dementia and Cognitive Improvement Group
Metaxis, UK
York Health Economics Consortium, University of York, UK
Objective: to identify reports of RCTs and controlled clinical trials (CCTs) from Embase for more rapid availability in CENTRAL.
6
Methods, 1
We developed and validated a sensitive search filter to identify reports of RCTs:
using textual analysis of 10,000 Embase RCT records (published 2000-2010) in Simstat and WordStat
we identified terms, phrases and grouped terms within relevant records which could be tested in filters
a pragmatic approach was used to select and test search terms.
Following this testing, a second set of 10,000 Embase RCT records from CENTRAL was obtained, and the best candidate filter was validated against that set of records in Ovid Embase.
At this point, the records missed by the filter during validation testing were reviewed to understand better why they had been missed; this exercise led to some further changes to the filter.
The final filter was then validated on a third, new set of 10,000 RCT records from CENTRAL. A minimal sketch of this validation step follows.
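As an illustrative sketch of the validation step: the terms below are simplified stand-ins for the kinds of terms the textual analysis surfaced, not the project's validated filter, and the example records are invented.

```python
# Sketch only: checking a candidate search filter against a gold-standard
# set of known RCT records. CANDIDATE_TERMS and the records are hypothetical.
import re

CANDIDATE_TERMS = [r"\brandomi[sz]ed\b", r"\bplacebo\b", r"\bdouble[- ]blind\b"]

def filter_matches(record_text: str) -> bool:
    """Return True if any candidate term appears in the record's title/abstract."""
    return any(re.search(term, record_text, re.IGNORECASE) for term in CANDIDATE_TERMS)

def sensitivity(gold_standard_records: list[str]) -> float:
    """Proportion of known RCT records the filter retrieves (project target: >97%)."""
    hits = sum(filter_matches(text) for text in gold_standard_records)
    return hits / len(gold_standard_records)

records = ["A randomised, double-blind, placebo-controlled trial of ...",
           "Effects of exercise on cognition: a controlled study ..."]
print(f"Sensitivity: {sensitivity(records):.1%}")  # missed records are reviewed manually
```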
7
Progress The validated search filter identifies reports of RCTs in Embase with over 97% sensitivity An analysis of the records retrieved has resulted in a tiered record assessment process The most obvious RCT reports are fast-tracked into CENTRAL Animal studies are set to one side for team assessment The less obvious RCT records are assessed for relevance by a novel use of internet crowdsourcing “…the practice of obtaining needed services, ideas, or content by soliciting contributions from a large group of people, and especially from an online community, rather than from traditional employees…” Record screening software written by Metaxis Between two and six people assess whether a record is really a report of an RCT
8
Embase weeks 14, 15, 16 and 17: April results
Tier 1: 1,469 records go straight into CENTRAL
Tier 2: 8,619 records go to the crowd
Tier 3: approx. 400 conference abstracts and animal studies assessed by the project team
9
Screening tool
10
Progress: key metrics
Number needed to read (NNR) = 34
Unsure records are 5% of those screened

Metric                                  Number
Screeners who have created an account   450
Screeners who have completed training   241
Records screened                        49,092
Records accepted                        1,463
Records rejected                        47,474
Records unsure                          2,359
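The two headline figures follow directly from the table; a quick arithmetic check (not project code):

```python
# Reproducing the slide's headline metrics from the table figures.
records_screened = 49092
records_accepted = 1463
records_unsure = 2359

nnr = records_screened / records_accepted   # number needed to read
unsure_rate = records_unsure / records_screened

print(f"NNR ≈ {nnr:.0f}")             # ≈ 34, as on the slide
print(f"Unsure ≈ {unsure_rate:.0%}")  # ≈ 5% of records screened
```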
11
Progress: accuracy
A sample of records screened by the crowd was re-screened by an expert (Anna Noel-Storr), who acted as the reference standard:
RCT or CCT = 416 (+)
Not RCT or CCT = 2,654 (-)
The crowd had a sensitivity of 99.8% and a specificity of 99.8%.
Possible incorporation bias: Anna was not blind to the index test results.
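The accuracy figures are standard sensitivity and specificity against the expert reference standard. The exact true/false counts are not on the slide, so the values below are illustrative ones consistent with 416 positives, 2,654 negatives and roughly 99.8% on both measures.

```python
# Illustrative confusion-matrix counts (assumed, not reported on the slide).
true_pos, false_neg = 415, 1    # of the 416 expert-confirmed RCT/CCT records
true_neg, false_pos = 2649, 5   # of the 2,654 expert-rejected records

sensitivity = true_pos / (true_pos + false_neg)   # 415/416  ≈ 99.8%
specificity = true_neg / (true_neg + false_pos)   # 2649/2654 ≈ 99.8%
print(f"Sensitivity {sensitivity:.1%}, specificity {specificity:.1%}")
```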
12
Progress: are screeners getting better? (chart of screener performance against the index test not reproduced)
13
Progress: how long to screen a record?
All screeners: just under 1 minute per record
Screeners who have screened more than 100 records: 42 seconds per record
14
Challenges
Conference abstracts
Animal studies: a lot of animal studies are tagged with the human/ heading; developing an animal filter for Embase
Deciding what is an RCT: guidance for screeners
Motivating the crowd:
'certificates': exploring how to show screeners how many records they have processed, plus more metrics and visuals
personalised thank-yous
community building: Facebook and Twitter
enabling screeners to screen records of interest to them
15
The future
Ever-improving currency of Embase record availability in CENTRAL
Fewer irrelevant and duplicate records
Searchers will be able to identify more RCTs more accurately than previously by a rapid search of CENTRAL
Please visit our project website: http://www.metaxis.com/embasepublic/
Feel free to join the crowd! http://www.metaxis.com/embase/login.php
16
Thank you
julie.glanville@york.ac.uk
Telephone: +44 1904 324832
Website: www.yhec.co.uk
http://tinyurl.com/yhec-facebook
http://twitter.com/YHEC1
http://www.minerva-network.com/