Maximizing the Validity of Interviewer–Collected Self-Report Data: A Quality Assurance Model in Action with the GAIN Janet C. Titus, Ph.D. Michelle K. White, M.A. Michael L. Dennis, Ph.D. Lighthouse Institute Chestnut Health Systems

Similar presentations

Evidence Based Practices Lars Olsen, Director of Treatment and Intervention Programs Maine Department of Corrections September 4, 2008.
Intervening with Adolescent Substance User: What do we know so far about and where do we go from here Michael Dennis, Ph.D. Chestnut Health Systems, Normal,
Chestnut Health Systems Bloomington-Normal, IL
Advances in Adolescent Substance Abuse Treatment and Research Michael Dennis, Ph.D. Chestnut Health Systems, Bloomington, IL Part of the continuing education.
Conceptual Feedback Threading Staff Development. Goals of Presentation What is Conceptualized Feedback? How is it used to thread the development of staff?
Crime, Violence, and Managing Client and Public Safety Michael L. Dennis, Ph.D., Chestnut Health Systems, Bloomington, IL Presentation at NEW DIRECTIONS.
The GAIN-Q (GQ): Development and Validation of a Substance Abuse and Mental Health Brief Assessment Janet C. Titus, Ph.D. Michael L. Dennis, Ph.D. Lighthouse.
BSBINM501A part 2 Trainer: Kevin Chiang
Using Data to Inform Practice Michael L. Dennis, Ph.D. Chestnut Health Systems, Normal IL Presentation at SAMHSA/CSAT Satellite Session, “ Implementing.
The Practice of Evidence Based Practice … or Can You Finish What You Started? Ron Van Treuren, Ph.D. Seven Counties Services, Inc. Louisville, KY.
Implementation of MST in Norway Iceland June 2008 Bernadette Christensen Clinical Director of the Youth Department Anne Cathrine Strütt MST Consultant.
Comprehensive Organizational Health AssessmentMay 2012Butler Institute for Families Comprehensive Organizational Health Assessment Presented by: Robin.
WMO Competency Standards: Development and Implementation Status
Teacher Evaluation Model
Welcome to “Billing for Consumer Centered Family Consultation in PROS” Webinar Hosted by: The Family Institute for Education, Practice & Research & The.
Trauma Issues with Specific Populations: Adolescents & Transition Age Youth OVERVIEW Michael Dennis, Ph.D. and Megan Catlin, M.S. Chestnut Health Systems,
Family Resource Center Association January 2015 Quarterly Meeting.
CfE Higher Physical Education
PPA 502 – Program Evaluation Lecture 10 – Maximizing the Use of Evaluation Results.
Competitive Grant Program: Year 2 Meeting 2. SPECIAL DIABETES PROGRAM FOR INDIANS Competitive Grant Program: Year 2 Meeting 2 Data Quality Assurance Luohua.
Motivational Interviewing to Improve Treatment Engagement and Outcome* The effect of one session on retention Research findings from the NIDA Clinical.
Summer Camp: Duty of Care as a 4-H Staff Member Connie Coutellier, consultant, author, trainer and member of the 4-H State Camp Advisory Committee.
1 Assuring the Quality of your COSF Data. 2 What factors work to improve the quality of your data? What factors work to lessen the quality of your data?
UNDERSTANDING, PLANNING AND PREPARING FOR THE SCHOOL-WIDE EVALUATION TOOL (SET)
Studying treatment of suicidal ideation & attempts: Designs, Statistical Analysis, and Methodological Considerations Jill M. Harkavy-Friedman, Ph.D.
1-2 Training of Process FacilitatorsTraining of Coordinators 5-1.
SESSION ONE PERFORMANCE MANAGEMENT & APPRAISALS.
CHAPTER 5 Infrastructure Components PART I. 2 ESGD5125 SEM II 2009/2010 Dr. Samy Abu Naser 2 Learning Objectives: To discuss: The need for SQA procedures.
Test Organization and Management
ACE Personal Trainer Manual 5th Edition
Applying the Principles of Prior Learning Assessment Debra A. Dagavarian Diane Holtzman Dennis Fotia.
AOD Use and Mental Health Disparities during Pregnancy and Postpartum Victoria H. Coleman, Ph.D. & Michael L. Dennis, Ph.D. Chestnut Health Systems, Bloomington,
Implementation of the Essential Standards The Australian Quality Framework (AQTF) is the national set of standards which assures nationally consistent,
1 CT DDS Quality Service Review Connecticut Community Providers Association Presented by Fred Balicki, DDS Quality Management Services May 27, 2008.
Program Fidelity Influencing Training Program Functioning and Effectiveness Cheryl J. Woods, CSW.
Heidi Erstad and Peg Mazeika, Technical Assistance Coordinators Wisconsin RtI Center Bridget Blask, Rachel Blum, and Leslie Connors Franklin Elementary.
Assessing Program Quality with the Autism Program Environment Rating Scale.
University of Leeds Ethnicity and Cultural Diversity Network The Globe Centre, Accrington 22 nd September 2005.
The Jordan Performance Appraisal System (JPAS) is designed to help educators in their continuing efforts to provide high quality instruction to all students.
MIA: STEP Toolkit Overview. NIDA-SAMHSA Blending Initiative 2 What is an MI Assessment?  Use of client-centered MI style  MI strategies that can be.
The Importance of Addressing the Affective Domain in Child Welfare Training Maureen Braun Scalera MSW, LCSW NSDTA Presentation
Unpacking and Implementing Training Packages Linda Hopkins.
SCREENING BRIEF INTERVENTION AND REFERRAL TO TREATMENT (SBIRT) 1.
A Quick Overview of Evidence-Based Practice Manuals The Addiction Technology Transfer Center Network Funded by Substance Abuse and Mental Health Services.
Performance and Development Teacher Librarian Network
Survey Methodology Survey Instruments (1) EPID 626 Lecture 7.
Quality Assuring Deliverers of Education and Training for the Nuclear Sector Jo Tipa Operations Director National Skills Academy for Nuclear.
PLCS & THE CONNECTION TO RESPONSE TO INTERVENTION Essentials for Administrators Sept. 27, 2012.
(c) 2007 McGraw-Hill Higher Education. All rights reserved. Accountability and Teacher Evaluation Chapter 14.
United Nations Oslo City Group on Energy Statistics OG7, Helsinki, Finland October 2012 ESCM Chapter 8: Data Quality and Meta Data 1.
Adult Education Assessment Policy Effective July 1 st, 2011.
Partnership Health: Evaluation and possibilities for an adapted structure Agenda item 11 Madhavi Bajekal, ONS (UK) PH coordinator Directors of Social Statistics.
October 15, 2015 Peter F. Luongo, Ph.D..  Alcohol misuse or abuse often goes undetected with a majority of clinicians citing lack of confidence in alcohol.
Implementation and Sustainability in the US National EBP Project Gary R. Bond Dartmouth Psychiatric Research Center Lebanon, NH, USA May 27, 2014 CORE.
Inter-American Development Bank BIMILACI 2007 QUALITY PROCUREMENT Third Party Review May 2007 Project Procurement Division.
Chapter 29 Conducting Market Research. Objectives  Explain the steps in designing and conducting market research  Compare primary and secondary data.
Research Methods for Business Students
Chapter 33 Introduction to the Nursing Process
Data Collection Interview
Are Government Alliances a Threat to Workplace Safety
Iowa Teaching Standards & Criteria
Administering Behavioral CRFs
Assuring the Quality of your COSF Data
Evidence-Based Intervention Practices
Competency Based Training Delivery – is a kind of delivery where students undergo training at their own pace.
Process Evaluation the implementation phase
Developing Action Plans
Assessing educational/training competencies of trainers of trainers
Assuring the Quality of your COSF Data
Presentation transcript:

Maximizing the Validity of Interviewer–Collected Self-Report Data: A Quality Assurance Model in Action with the GAIN Janet C. Titus, Ph.D. Michelle K. White, M.A. Michael L. Dennis, Ph.D. Lighthouse Institute Chestnut Health Systems

Abstract Conclusions drawn from scientific studies are only as solid as the quality of the data on which they are based. In interviewer-administered assessments, one source of variation that affects the quality, and thus the validity, of the data is the quality of the assessment administration. Interviewers' misunderstandings about the meanings of items, inaccuracies in recording, and failure to clarify ambiguous responses are only a few of the difficulties that contribute to the deterioration of validity. This is especially an issue in multi-site studies, where site differences in interviewer training and supervision compound negative influences on the quality of the data. To address these problems in our studies, we have developed a quality assurance model organized around four core areas of an assessment administration: Documentation, Instructions, Items, and Engagement.

Abstract, continued Each core area contains a set of guidelines against which the quality of the administration is evaluated, and certification in assessment administration is earned when the interviewer demonstrates mastery in all four areas. Although it was developed for use with the family of instruments we use in our treatment studies -- the Global Appraisal of Individual Needs -- the model can easily be adapted to fit virtually any semi-structured, interviewer-administered data-gathering instrument. Our quality assurance model has been successfully implemented in over 100 research and clinical sites across the U.S. Several hundred staff have been trained in the model and close to 200 have been certified in assessment administration. (Supported by CSAT contract )

Quality Assurance in Assessment Administration Quality assurance (QA for short) is a cyclical process consisting of: the monitoring of an interviewer's skills at administering an assessment protocol, and the provision of evaluative feedback.

Monitoring can be done live or via audiotapes. Feedback can be given in person or in writing. Once the interviewer's skills reach a predetermined level of competence, the interviewer is certified in the assessment administration. QA can continue post-certification to monitor ongoing adherence to the protocol.

Four Core Areas of Assessment QA
Documentation: accuracy and completeness of recording responses and administrative info
Instructions: accuracy and clarity of explanations, directions, and transitional statements
Items: delivery & clarification of the items
Engagement: quality of the interaction between the interviewer and client

Criteria for Assessing the Quality of an Administration Most of the criteria under each core area are generic and can be applied to any assessment. Some criteria will need to be tailored to your specific assessment to account for sections not typical of most instruments. Definitions of the criteria for evaluating QA of the GAIN are in the handout for this poster or in Chapter 4 of the GAIN manual (

~ Documentation ~ (* - GAIN-specific)
Cover page (front & back)*
Check for Cognitive Impairment*
General Directions, Literacy and Initial Administration Questions*
Time to complete*
Urgency & Denial-Misrepresentation*
Administration Ratings*
Documentation of participant answers

~ Instructions ~ (* - GAIN-specific)
Introduction to assessment
Check for Cognitive Impairment*
Timeline*
Additional Instructions for oral/self administration*
Introduction of scales / Use of transitional statements
Use of the cards & defining of response choices
Repeating response choices when necessary
Handling of participant questions about instructions

~ Items ~ (* - GAIN-specific)
Item order & skips
Word order
Use of stems & time frames
Use of parenthetical statements*
Clarification of clients' responses for coding
Appropriate handling of client-initiated questions
Responsiveness to apparent misunderstandings, inattentiveness, & inconsistencies

~ Engagement ~
Flow of the interview
Appropriate voice articulation and inflection
Use of encouraging or motivational statements
Sensitivity to clients' needs
Rapport

Rating the Quality of an Assessment Administration Performance in each of the four core areas is assessed on a 4-point scale:
Excellent
Sufficient
Minor Problems
Problems
Definitions of each scale value are in the QA chapter of the GAIN manual (

Guidelines for Preparing Feedback Feedback should be balanced -- it should contain both things done well and things to improve. Feedback should be specific and behavioral.

Certification in GAIN Administration A QA reviewer evaluates the administration using the hardcopy assessment and an audiotape of the session (live monitoring and/or oral feedback can also be used). To be certified in GAIN administration, each core area must earn a rating of Sufficient or better. This usually happens within four monitored assessments.
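As an illustrative sketch (not part of the poster or the GAIN manual), the certification rule just described can be expressed in a few lines of Python. The rating labels and core areas come from the poster itself; the function and variable names are invented for the example.

```python
# Sketch of the certification rule: an interviewer is certifiable once
# every core area is rated "Sufficient" or better on the 4-point scale.
# Names here are illustrative, not taken from the GAIN manual.

RATING_SCALE = ["Problems", "Minor Problems", "Sufficient", "Excellent"]
CORE_AREAS = ["Documentation", "Instructions", "Items", "Engagement"]

def is_certifiable(ratings: dict) -> bool:
    """Return True if every core area is rated Sufficient or better."""
    threshold = RATING_SCALE.index("Sufficient")
    return all(RATING_SCALE.index(ratings[area]) >= threshold
               for area in CORE_AREAS)

# Example review: "Minor Problems" on Items blocks certification.
review = {"Documentation": "Excellent", "Instructions": "Sufficient",
          "Items": "Minor Problems", "Engagement": "Excellent"}
print(is_certifiable(review))  # False
```

In practice the reviewer would also record the behavioral feedback behind each rating; the boolean check only mirrors the pass/fail threshold.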

We Use a Two-Tiered QA Model
Train the trainer: A certified QA reviewer oversees the certification process of a research or clinical site trainer who is in charge of assessment training.
Trainer trains research staff: Once certified in administration and in the provision of QA feedback, the trainer oversees the certification process and ongoing quality assurance monitoring of the staff.

Current Status
Over 400 users have been trained to administer the GAIN.
Close to 200 research and clinical staff are certified in GAIN administration.
Close to 50 research and clinical staff are certified to train their own staff and provide QA feedback.
About 15 QA reviewers currently review tapes.
In the first quarter of 2004, our certification program reviewed an average of 70 GAIN tapes per month.

Next Steps Efforts are currently underway to analyze the effects of the QA protocol on the quality of data. We hypothesize the protocol will positively impact validity by producing:
Fewer inconsistencies across items
Greater internal consistency on scales
Less missing data
Shorter duration of interviews
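One standard way to quantify "internal consistency on scales" is Cronbach's alpha. The poster does not name a specific statistic, so the following plain-Python sketch, using made-up response data, is only an illustration of how that hypothesis could be tested.

```python
# Illustrative computation of Cronbach's alpha, a common measure of a
# scale's internal consistency. Assumes at least two items and
# non-constant total scores (otherwise the formula divides by zero).

def cronbach_alpha(items):
    """items: list of per-item response lists, one value per respondent."""
    k = len(items)                      # number of items on the scale
    n = len(items[0])                   # number of respondents

    def variance(xs):                   # sample variance (n - 1 divisor)
        m = sum(xs) / len(xs)
        return sum((x - m) ** 2 for x in xs) / (len(xs) - 1)

    totals = [sum(item[i] for item in items) for i in range(n)]
    sum_item_var = sum(variance(item) for item in items)
    return (k / (k - 1)) * (1 - sum_item_var / variance(totals))

# Three perfectly consistent items yield alpha = 1.0.
print(cronbach_alpha([[1, 2, 3, 4], [1, 2, 3, 4], [1, 2, 3, 4]]))
```

Comparing alpha for scales collected before and after the QA protocol was introduced would be one concrete way to evaluate the "greater internal consistency" hypothesis.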

Further Information & Acknowledgement
For further information contact: Ms. Michelle White, Chestnut Health Systems, 720 W. Chestnut St., Bloomington, IL.
This poster is at
The development of the GAIN QA model was supported by the Center for Substance Abuse Treatment (CSAT) through the Cannabis Youth Treatment study [5 UR4 TI11321].