Studying treatment of suicidal ideation & attempts: Designs, Statistical Analysis, and Methodological Considerations Jill M. Harkavy-Friedman, Ph.D.

Objective: To review the design and methodological factors that affect the study of interventions for suicidal ideation and attempts. Includes:  Evaluating the impact of the intervention  Evaluating the intervention itself

Review of design considerations: Goals Design Sample Measures Procedures Data Analysis Treatment Evaluation

Goals Theory/ Rationale Meaningful Testable hypotheses Feasible

Design considerations Type of design Questions that can be answered Questions that cannot be answered Multi-method multi-trait approach Strengths and Limitations

Types of Design: Pre-post Control/Comparison Group  Randomized, stratified random, convenience Longitudinal  Prospective cohort design Epidemiological  Large scale cohort or case-control

Sample Considerations Who is the target of the intervention?  Patients All patients At-risk  Attempters, ideators

How is the sample selected? Identification of Sample:  Convenience vs. Random Criteria for inclusion and exclusion:  Recruitment and Screening Demographic considerations:  Age, sex, educational level Determination of Control or Comparison Group

How will the nature of the sample affect measurement and procedures?  Attainment of necessary sample size  Developmental level and language level  Potential burden/ load for participant  Representativeness and generalizability  Feasibility  Time, place, implementation, ability of participants, attrition

What needs to be measured? Outcome Confounders Mediators and Moderators Context

Administration Considerations Format  Face-to-face interview, self-report, telephone, computer Source of information  Self, other informant, records, epidemiological information Instrument for repeated measures  Same form, alternate forms

Outcome Measures Must: Measure the target of intervention Be standardized Be expected to change within the time frame Be sensitive to change Be present in all groups Have a measurable effect size Have demonstrated reliability and validity Be feasible
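A measurable effect size for an outcome can be estimated from pilot data. A minimal stdlib-only sketch of Cohen's d (the scores below are illustrative, not real data):

```python
import math
from statistics import mean, variance

def cohens_d(group1, group2):
    """Standardized mean difference using the pooled standard deviation."""
    n1, n2 = len(group1), len(group2)
    # variance() returns the sample (n-1) variance
    pooled_var = ((n1 - 1) * variance(group1) +
                  (n2 - 1) * variance(group2)) / (n1 + n2 - 2)
    return (mean(group1) - mean(group2)) / math.sqrt(pooled_var)

# Hypothetical post-treatment ideation scores (illustrative only)
treatment = [2, 4, 6, 8]
control = [5, 7, 9, 11]
d = cohens_d(treatment, control)  # negative: treatment group scored lower
```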

Current Measures of Outcome Suicidal Ideation Suicide Attempts Completed Suicide Lethality of attempt # crisis calls Associated symptoms Adjunctive medications Hospitalization # referrals Social Skills

Procedural Considerations Intervention  Definition and manualization # sessions, length, medication dose  Expected outcomes relevant & measurable  Training & ongoing supervision  Maintenance of blind assessors  Implementation of intervention and fidelity  Adherence and attrition Interval of Measurement  One-shot, short-term, long-term

Recruitment Methods  Systematic, documented  Keeping people in the program Investigator’s Role  Avoid potential biases  Appropriate level of supervision Ethical Considerations  Confidentiality, identification of risk, intervention Feasibility

Data Analysis Goals  Efficacy/ Impact of Intervention  Program Evaluation

Considerations before conducting the study that impact data analysis: Specific, testable hypotheses with data analytic strategy established Power Analysis Potential confounders, mediators & moderators Type and nature of data Number of analyses Effect sizes and variability of measures Data reduction Managing and imputing missing data
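The power analysis above can be sketched with the normal-approximation formula for a two-sample comparison of means; the effect size and error rates below are illustrative assumptions, not recommendations:

```python
import math
from statistics import NormalDist

def n_per_group(effect_size, alpha=0.05, power=0.80):
    """Approximate sample size per group for a two-sample test of means
    (normal approximation; slightly below the exact t-based answer)."""
    z_alpha = NormalDist().inv_cdf(1 - alpha / 2)
    z_beta = NormalDist().inv_cdf(power)
    return math.ceil(2 * ((z_alpha + z_beta) / effect_size) ** 2)

# A medium effect (d = 0.5) at alpha = .05 with 80% power:
n = n_per_group(0.5)  # 63 per group under the normal approximation
```

Small expected effects drive the required sample up quickly, which is the core problem for suicide outcomes with low base rates.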

Types of Analyses Univariate  T-tests, chi-square, ANOVA, Correlation, nonparametric Multivariate  Repeated Measures  Path Analysis  Multiple Regression techniques  Survival Analysis  Time series or trend analysis
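As a concrete instance of a univariate analysis from the list above, a Pearson chi-square on a 2x2 table comparing, say, re-attempt rates across two arms (the counts are hypothetical):

```python
def chi_square_2x2(table):
    """Pearson chi-square statistic for a 2x2 table of observed counts."""
    row_totals = [sum(row) for row in table]
    col_totals = [sum(col) for col in zip(*table)]
    total = sum(row_totals)
    stat = 0.0
    for i, row in enumerate(table):
        for j, obs in enumerate(row):
            expected = row_totals[i] * col_totals[j] / total
            stat += (obs - expected) ** 2 / expected
    return stat

# Hypothetical counts: rows = study arms, columns = (re-attempt, no re-attempt)
observed = [[10, 40],
            [20, 30]]
stat = chi_square_2x2(observed)  # compare to the chi-square(df=1) cutoff, 3.84
```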

Points to consider when analyzing: Know your data before any analyses Reliability is the upper limit of validity No variability means no finding Not everything is linear Build models based on univariate statistics; test with multivariate The analysis must fit the type of data With numerical data, continuous variables are more informative than categorical variables
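The point that reliability caps validity follows from the classical attenuation formula: an observed validity correlation cannot exceed the square root of the product of the two measures' reliabilities. A small sketch with assumed reliability values:

```python
import math

def max_observed_validity(rel_x, rel_y):
    """Classical test theory bound on an observed validity correlation."""
    return math.sqrt(rel_x * rel_y)

def disattenuated_r(r_observed, rel_x, rel_y):
    """Correlation corrected for measurement unreliability in both measures."""
    return r_observed / math.sqrt(rel_x * rel_y)

# With reliabilities of .80 and .50, observed validity can never exceed ~.63
bound = max_observed_validity(0.80, 0.50)
# An observed r of .40 corresponds to an estimated true-score r of ~.63
r_true = disattenuated_r(0.40, 0.80, 0.50)
```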

Evaluating an intervention Feasibility Fidelity to intervention Reliability and validity of all measures Attrition Adherence Consumer satisfaction Negative Outcomes/ Adverse Events

Feasibility of the intervention Time Resources: staff, space, money, supplies Availability of participants Interest and amenability of the setting Implementation of intervention Assessment methods

Fidelity to Intervention Evaluation of training Ongoing training and reliability Ongoing monitoring of intervention Staff efficacy and satisfaction

Attrition Assess from recruitment to end of study Compare rate of attrition to typical rates Compare drop-outs to study completers on baseline, demographic and relevant variables
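The drop-out versus completer comparison above can be done with a simple two-sample test on any baseline variable; a Welch t-statistic sketch with illustrative scores:

```python
import math
from statistics import mean, variance

def welch_t(a, b):
    """Welch's t statistic (unequal variances) for two independent samples."""
    se = math.sqrt(variance(a) / len(a) + variance(b) / len(b))
    return (mean(a) - mean(b)) / se

# Hypothetical baseline severity scores (illustrative only)
completers = [10, 12, 14, 16]
dropouts = [11, 13, 15, 17]
t = welch_t(completers, dropouts)  # near zero: little sign of differential attrition
```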

Reliability and Validity of Measures Assess all measures with all appropriate forms of reliability Test discriminant and convergent validity

Adherence Embed measures of adherence in intervention and assessments  Attendance  Questions about previous sessions  Test for medication or substances  Follow-up behavioral questions

Consumer Satisfaction Participants Staff Outside Informants  family members, service providers

Monitor Negative/ Adverse Events Document adverse events in a standard manner Anticipate potential adverse events and prepare assessment and monitoring tools Plan for suicidal risk

Special considerations for suicide research Need enough suicidal behavior to detect a difference  When is an intervention effective? Reduction vs. elimination Monitoring for safety in a potentially unsafe sample Deciding when a participant is exited from the study Any intervention for suicidal behavior is itself an intervention that affects the outcome behavior

Conclusion Many decisions to be made when designing a study Each decision affects the conduct of the study Method determines the conclusions that can be drawn from any one study Validity is accrued across studies