
+ Using Client-Focused Research Methods to Improve Outcomes Saara T. Grizzell, Ph.D., CRC, LVRC & Julie F. Smart, Ph.D., CRC (ret), LPC, LVRC, ABDA, AAPC, CCFC, NCC Utah State University

+ Objectives
Specify research methods and procedures
List possible uses of client-focused research
Identify three implementation considerations
Conceptualize client-focused approaches

+ Overview
Introduce the client-focused research approach and how it can be used in program evaluation
Research methods and procedures for tracking outcomes with the use of feedback (pilot study example)
Approaches for analyzing the data of client-focused interventions (pilot study example)
What to consider when implementing successful client-focused interventions

+ Introduction

+ Background The client-focused approach was first introduced by Howard, Moras, Brill, Martinovich, and Lutz (1996), who argued that three questions lie at the root of all intervention assessments.

+ Assessing an Intervention: Three Questions
Does the intervention work under special, experimental conditions?
Does the intervention work in practice?
Does the intervention work for this particular consumer?
(Howard, Moras, Brill, Martinovich, & Lutz, 1996)

+ Setting the Context: Program Evaluation
Answers specific questions using a systematic method
Types of questions: formative and summative
May be conducted at several stages during a program's lifecycle

+ Applying the Three Questions to Program Evaluation
Does this program work in a special, experimental context (e.g., to meet the needs of individuals with certain types of disabilities, income levels, or ethnic groups)?
Does this program work in general practice?
Does the program work for this particular consumer?

+ Client-Focused Approach

+ Involves assessing the client's progress weekly over the course of treatment and providing weekly feedback to both the counselor and the client about that progress

+ Treatment-Focused vs. Client-Focused Research
Treatment-focused:
Concerned with establishing comparative and intervention efficacy
Provides aggregated results over groups of participants
Focuses on the group's response to treatment
Patient- or client-focused:
Monitors individual progress over the course of treatment
Provides feedback to the client and the clinician during the course of treatment
Focuses on the client-specific response to treatment

+ Client-Focused Assessment Systems
Although several systems have been developed, only two have demonstrated gains in randomized controlled trials (Duncan, 2013):
The Outcome Questionnaire-45 (OQ-45; Lambert, Kahler, Harmon, Burlingame, & Shimokawa, 2013)
The Partners for Change Outcome Management System (PCOMS; Miller, Duncan, Sorrell, & Brown, 2005)

+ Client-Focused Research: Enhancing Outcomes
Particularly helpful in decreasing deterioration rates
Used in inpatient and outpatient settings
Positively related to client outcomes
Used as a quality assurance system in agency settings
More recently used in a state VR setting
(Hawkins, Lambert, Vermeersch, Slade, & Tuttle, 2004; Lambert, Hansen, & Finch, 2001a; Lambert et al., 2002a; Shimokawa, Lambert, & Smart, 2010; Whipple, Lambert, Vermeersch, Smart, Nielsen, & Hawkins, 2003)

+ Using Client-Focused Approach in a State VR Setting: Pilot Study Example

+ Study Purpose Examine the impact of providing treatment progress feedback to 30 individuals receiving services at a vocational rehabilitation agency

+ Study Design and Conditions
Study design: repeated-measures randomized wait-list control design, with matching prior to randomization
Feedback condition (Fb): group counseling with feedback
Treatment as usual (TAU): group counseling without feedback
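As a sketch, the matching-then-randomization step described above can be expressed in code. The participant IDs, baseline scores, and the single-variable (adjacent-pair) matching rule below are hypothetical illustrations, not details taken from the study:

```python
import random

def matched_randomization(participants, seed=0):
    """Pair participants by baseline score, then randomly assign one
    member of each pair to feedback (Fb) and the other to treatment
    as usual (TAU)."""
    rng = random.Random(seed)
    # Sort by baseline score so adjacent participants form matched pairs.
    ordered = sorted(participants, key=lambda p: p[1])
    assignments = {}
    for i in range(0, len(ordered) - 1, 2):
        a, b = ordered[i], ordered[i + 1]
        # A coin flip decides which member of the pair gets feedback.
        if rng.random() < 0.5:
            assignments[a[0]], assignments[b[0]] = "Fb", "TAU"
        else:
            assignments[a[0]], assignments[b[0]] = "TAU", "Fb"
    return assignments

# (participant id, baseline OQ-45 total) -- hypothetical values
sample = [("P1", 63), ("P2", 88), ("P3", 71), ("P4", 90), ("P5", 66), ("P6", 74)]
groups = matched_randomization(sample)
```

Because every matched pair contributes one participant to each condition, the conditions stay balanced on the matching variable.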

+ OQ-45: Progress and Outcome
The Outcome Questionnaire-45 (OQ-45) is designed to track progress during treatment and outcomes at termination
Responses are used to generate a progress report
The progress report acts as the feedback
(Lambert, Kahler, Harmon, Burlingame, & Shimokawa, 2013)
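One common way such a progress report is generated is by comparing each administration against reliable-change thresholds. The sketch below uses the reliable change index (14 points) and clinical cutoff (63) widely cited in the OQ-45 literature; treat the exact values and category labels as assumptions to be verified against the current OQ-45 manual:

```python
# Illustrative thresholds drawn from the OQ-45 literature; verify
# against the current OQ-45 manual before clinical use.
RCI = 14             # reliable change index (points)
CLINICAL_CUTOFF = 63 # scores below this fall in the functional range

def classify_progress(baseline, current):
    """Classify a client's status from baseline and current OQ-45
    totals (lower scores indicate less distress)."""
    change = baseline - current  # positive = improvement
    if change >= RCI and current < CLINICAL_CUTOFF:
        return "recovered"
    if change >= RCI:
        return "reliably improved"
    if change <= -RCI:
        return "deteriorated"
    return "no reliable change"

report = classify_progress(baseline=82, current=60)  # hypothetical scores
```

A weekly run of this classification, plotted over time, is essentially what the one-page progress report summarizes for counselor and client.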

+ Recruitment/Enrollment
Counselors:
Identified clients from their own client base who met criteria
Informed potential participants of the study
Researcher:
Administered the GRQ, a screening tool that assesses readiness for group counseling
Contacted interested participants
Enrolled participants
Conducted the initial interview
Administered the baseline OQ-45

+ Counseling Groups
5 counseling groups of 4 to 6 members each
Conducted by a facilitator and co-facilitator
The group facilitator or co-facilitator also served as the participant's individual VR counselor
Duration and frequency: 1.5 hours per session, once weekly for 10 weeks

+ Feedback Provision
Feedback to counselors:
Provided to counselors regarding the progress of clients in the Fb condition
A one-page report about the client's treatment response
Feedback to participants:
A one-page report provided to participants in the Fb condition
A graph charting progress and a narrative about the rate of progress

+ End of Study The researcher asked all participants to rate their progress toward employment and group counseling goals

+ Data Analyses The independent variables of time, condition, and office were entered into a linear mixed-effects model analysis for each of the dependent variables (symptom distress, interpersonal relationships, and social role performance)

+ Data Analyses
A chi-square analysis was used to test for differences in categorical demographic variables between conditions
The McNemar test for correlated proportions was used to analyze employment outcomes
Independent-samples t-tests were used to analyze the employment progress ratings
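The McNemar test suits the employment outcome because the same participants are measured before and after treatment (employed vs. not employed): only the discordant pairs carry information. A minimal exact version can be sketched as follows; the discordant-pair counts are hypothetical, not the study's data:

```python
from math import comb

def mcnemar_exact(b, c):
    """Exact (binomial) McNemar test for correlated proportions.
    b = participants who changed in one direction (e.g., unemployed
    at baseline, employed at follow-up); c = participants who changed
    in the other direction. Returns a two-sided p-value."""
    n = b + c
    if n == 0:
        return 1.0  # no discordant pairs: no evidence of change
    k = min(b, c)
    # Probability of a split at least this extreme under Binomial(n, 0.5),
    # doubled for a two-sided test and capped at 1.
    p = 2 * sum(comb(n, i) for i in range(k + 1)) / 2 ** n
    return min(p, 1.0)

p_value = mcnemar_exact(b=9, c=1)  # hypothetical discordant-pair counts
```

The concordant cells (no change in either direction) drop out of the statistic entirely, which is the defining feature of the test.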

+ Results
Social role performance and mental health functioning scores improved significantly in both conditions
Ratings of employment progress differed significantly between conditions, t(251) = 2.77, p = .006, two-tailed
Employment outcomes were significant for the control condition (p = .012) and approached significance for the treatment condition (p = .063)

+ Results Interestingly, participants in the feedback condition who received Social Security or subsistence benefits made the steadiest, most consistent progress in interpersonal relationships (p = .025), social role performance (p = .021), and mental health functioning (p = .028).

+ Using Client-Focused Methods in Program Evaluation

+ Tasks of the Evaluator
Verify that implemented programs provide needed services
Determine which programs produce the most favorable outcomes
Select the programs that offer the most needed types of services
Determine whether program interventions are the cause of the desired changes
Provide information to maintain and improve quality
(Posavac & Carey, 2007)

+ Verify that Implemented Programs Provide Needed Services "The most fundamental problem with programs is that some are either never implemented as planned or are implemented in such a diluted fashion that people in need receive no or minimal benefit." (Posavac & Carey, 2007, p. 4)

+ Using Client-Focused Methods to Conduct Program Evaluations
Assessing the need for the program
Examining the process of meeting consumer needs
Assessing what the program has actually achieved
Evaluating program outcomes

+ Considerations for Implementing a Client-Focused Approach
Ongoing monitoring: attrition is a major concern
o Start with a larger N
o Consider short durations of monitoring
o Make agreements about the logistics of routine monitoring (e.g., at the office or online; weekly or every other week)
Delivering feedback: consider agency and consumer resources (online delivery, postal mail, log-in access)
Implementing feedback: suggest a routine time for consumer and provider to meet to discuss the feedback
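The "start with a larger N" advice can be made concrete by inflating the enrollment target for the attrition you expect; the 20% rate below is purely illustrative, not a figure from the study:

```python
from math import ceil

def enrollment_target(n_needed, attrition_rate):
    """Number of participants to enroll so that, after expected
    dropout, at least n_needed complete routine monitoring."""
    if not 0 <= attrition_rate < 1:
        raise ValueError("attrition_rate must be in [0, 1)")
    return ceil(n_needed / (1 - attrition_rate))

# 30 completers desired with 20% expected attrition
target = enrollment_target(30, 0.20)
```

Shorter monitoring durations lower the expected attrition rate, which in turn lowers the enrollment target, so the two considerations above interact.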

+ Contact Information Saara Grizzell l.usu.edu