Improving replicability in addiction research

Robert West, University College London

The problem

Replicability is the extent to which reported study findings are repeated when what is believed to be essentially the same study is repeated. It is hampered by:
- Poorly defined constructs (e.g., craving, stress, recovery)
- Poorly specified models (e.g., unclear causal pathways)
- Weak measurement (e.g., use of self-report scales with low validity)
- Inconsistent or incomplete reporting (e.g., regarding intervention content)
- Inappropriate analyses (e.g., cherry-picking comparisons)
- Bias in reporting and dissemination (e.g., the file-drawer problem)

Clarifying constructs

Develop an addiction ontology setting out key terms, definitions and relationships, using an established ontology model.

[Figure: small section of a putative addiction ontology]
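
To illustrate the kind of structure involved, here is a minimal sketch in Python of ontology entries as terms with stable IDs, definitions and typed relationships. The class, the "ADDONT" ID scheme, and the example definitions are all hypothetical, not taken from the actual addiction ontology:

```python
from dataclasses import dataclass, field

@dataclass
class OntologyTerm:
    """One ontology entry: a stable ID, a label, a definition, and typed links to other terms."""
    term_id: str                                        # e.g. "ADDONT:0000042" (hypothetical ID scheme)
    label: str                                          # human-readable name
    definition: str                                     # agreed textual definition
    parents: list[str] = field(default_factory=list)    # "is_a" links to broader terms
    relations: list[tuple[str, str]] = field(default_factory=list)  # (relation type, target term)

# A hypothetical entry showing how a contested construct gets one agreed definition
craving = OntologyTerm(
    term_id="ADDONT:0000042",
    label="craving",
    definition="A strongly felt motivation to engage in a specified behaviour.",
    parents=["motivational state"],
    relations=[("has_trigger", "drug-related cue")],
)
print(craving.label, "is_a", craving.parents[0])
```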

Specifying models

Develop a common model-building system that clearly sets out which constructs are included, how they are defined and how they relate to each other.

[Figure: Transtheoretical Model expressed in standard form]
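
As an illustration of what a "standard form" could look like, here is a minimal sketch that represents a model as named, defined constructs plus directed, labelled relations between them. The constructs and relations shown are illustrative and are not the Transtheoretical Model itself:

```python
# A minimal sketch, assuming a model in standard form is a set of named,
# defined constructs plus directed, labelled relations between them.
# Construct names, definitions and relations are illustrative only.
model = {
    "name": "Illustrative craving model",
    "constructs": {
        "cue exposure": "Presence of stimuli previously associated with drug use",
        "craving": "Strongly felt motivation to use the drug",
        "drug use": "Consumption of the drug",
    },
    "relations": [
        ("cue exposure", "increases", "craving"),
        ("craving", "increases", "drug use"),
    ],
}

# Every relation must refer to a defined construct, which is one way a
# common model-building system can enforce clear specification.
for source, kind, target in model["relations"]:
    assert source in model["constructs"] and target in model["constructs"]
    print(f"{source} --{kind}--> {target}")
```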

Improving measurement

Develop and use a database of measures with indices of reliability and validity.

Construct label     Measure name   Validity index   Sources
Cigarette craving   QSU            7                Tiffany 1990
Cigarette craving   SUTS                            West 2010
State anxiety       STAI           8                Spielberger 1980
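
One way to picture such a database: a minimal sketch using Python's standard-library sqlite3 module, with the slide's table as illustrative rows. The schema, the missing SUTS validity index, and the query are assumptions, not a description of an actual database:

```python
import sqlite3

# Assumed schema: one row per (construct, measure) pairing with a validity index.
conn = sqlite3.connect(":memory:")
conn.execute("""
    CREATE TABLE measures (
        construct TEXT, measure TEXT, validity_index REAL, source TEXT
    )
""")
conn.executemany(
    "INSERT INTO measures VALUES (?, ?, ?, ?)",
    [
        ("Cigarette craving", "QSU", 7, "Tiffany 1990"),
        ("Cigarette craving", "SUTS", None, "West 2010"),  # no index reported on the slide
        ("State anxiety", "STAI", 8, "Spielberger 1980"),
    ],
)

# A researcher could then pick the best-validated measure for a construct:
row = conn.execute(
    "SELECT measure, validity_index FROM measures "
    "WHERE construct = ? ORDER BY validity_index DESC LIMIT 1",
    ("Cigarette craving",),
).fetchone()
print(row)  # ('QSU', 7.0)
```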

Improving reporting

Develop and use an online study reporting tool to ensure that key information is presented in a consistent way.

[Screenshot: reporting tool for Study ID ADD-17203. The user selects which type of study feature to enter and is taken through a step-by-step process covering: meta-data, research questions, design, participants, setting, groups, measures, analyses, findings, interpretation.]
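
To make the idea concrete, here is a minimal sketch of what such a tool might store behind the scenes: one structured record per study, with a fixed set of sections so that nothing is reported inconsistently. The section names and study ID format come from the slide; the functions are hypothetical:

```python
# Sections taken from the slide's reporting tool; everything else is illustrative.
STUDY_SECTIONS = [
    "meta_data", "research_questions", "design", "participants",
    "setting", "groups", "measures", "analyses", "findings", "interpretation",
]

def new_report(study_id: str) -> dict:
    """Create an empty report with the same set of sections for every study."""
    return {"study_id": study_id, **{section: None for section in STUDY_SECTIONS}}

def missing_sections(report: dict) -> list:
    """List sections still to be completed, so incomplete reporting is visible."""
    return [s for s in STUDY_SECTIONS if report[s] is None]

report = new_report("ADD-17203")
report["design"] = "Randomised controlled trial"
print(missing_sections(report))
```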

Improving analysis

Promote the use of the Open Science Framework and other resources to pre-specify analysis plans and record analysis history.
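
One mechanism behind such pre-specification is making the plan tamper-evident. A minimal sketch, assuming the analysis plan is a plain text file whose fingerprint is recorded before the data are analysed so that later changes are detectable (the file name and record format are illustrative, not the Open Science Framework's actual mechanism):

```python
import hashlib
from datetime import datetime, timezone
from pathlib import Path

def register_plan(path: str) -> dict:
    """Return a timestamped fingerprint of a pre-specified analysis plan."""
    digest = hashlib.sha256(Path(path).read_bytes()).hexdigest()
    return {
        "plan": path,
        "sha256": digest,
        "registered_at": datetime.now(timezone.utc).isoformat(),
    }

# record = register_plan("analysis_plan_v1.txt")  # hypothetical file
```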

Reducing reporting bias

- Extend trial registration to all major study types and establish an efficient system for linking study reports to registered protocols, automatically recording protocol deviations.
- Extend competing-interest reporting to non-commercial interests and automate the process of coding these so that they can be assessed in meta-regressions.
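
Automatic recording of protocol deviations could work along these lines: a minimal sketch, assuming the registered protocol and the study report expose the same named fields so a field-by-field comparison flags departures. All field names and values are illustrative:

```python
def protocol_deviations(protocol: dict, report: dict) -> list:
    """Return (field, registered, reported) tuples where the report departs from the protocol."""
    return [
        (field, registered, report.get(field))
        for field, registered in protocol.items()
        if report.get(field) != registered
    ]

protocol = {"primary_outcome": "7-day abstinence", "sample_size": 200}
report = {"primary_outcome": "30-day abstinence", "sample_size": 200}
print(protocol_deviations(protocol, report))
# [('primary_outcome', '7-day abstinence', '30-day abstinence')]
```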

Next steps

- Addiction ontology
- Model-building system
- Measures database
- Study reporting tool
- Open Science Framework
- Study registration
- Competing-interest reporting