Jennifer S. Funderburk, PhD


Let's Talk! Identifying Ways to Improve the Quality and Generalizability of Program Evaluation or Quality Improvement Data

Jennifer S. Funderburk, PhD; Christina Studts, PhD; William Lusenhop, PhD; Mary Peterson, PhD; Jennifer Wray, PhD

Session A8, #8528969, Friday, October 14, 2016, 10:30-11:30 am
CFHA 18th Annual Conference, October 13-15, 2016, Charlotte, NC, U.S.A.

Faculty Disclosure

The presenters of this session have NOT had any relevant financial relationships during the past 12 months.

Learning Objectives

At the conclusion of this session, the participant will be able to:
- Describe how to determine the quality and rigor of a quality improvement/program evaluation project
- Identify general strategies for improving the quality and generalizability of quality improvement/program evaluation efforts
- Describe how to improve the quality and/or generalizability of a project that you are planning or currently implementing

Bibliography / References

- Fitzpatrick, J. L., Sanders, J. R., & Worthen, B. R. (2015). Program evaluation: Alternative approaches and practical guidelines. Upper Saddle River, NJ: Pearson Education.
- Green, L. W., & Glasgow, R. E. (2006). Evaluating the relevance, generalization, and applicability of research: Issues in external validation and translation methodology. Evaluation and the Health Professions, 29(1), 126-153.
- Glasgow, R. E., & Emmons, K. M. (2007). How can we increase translation of research into practice? Types of evidence needed. Annual Review of Public Health, 28, 413-433.
- Posavac, E. J. (2015). Program evaluation: Methods and case studies. New York, NY: Routledge.
- Gaglio, B., Shoup, J. A., & Glasgow, R. E. (2013). The RE-AIM framework: A systematic review of use over time. American Journal of Public Health, 103(6), e38-e46.

Learning Assessment

A learning assessment is required for CE credit. A question and answer period will be conducted at the end of this presentation.

Overview

- Conducting quality research and program evaluation in primary care settings (20 min)
- Four breakout groups (35 min):
  - I need help with strategic planning!
  - I need help with the methodology/design!
  - How can I measure it?
  - I just need help (I have no idea where to start)!
- Large group discussion, summary, and resources (5 min)

Background: Program Evaluation

New programs are continually being developed to improve the quality of care, and it is important to understand the impact of these programs within real-world settings.

Program evaluation (PE) is a critical tool by which clinicians and administrators can determine whether services are meeting their intended objectives. Designing systematic, scientifically rigorous program evaluations is one way to address the significant need to build best practices and a stronger evidence base for integrated healthcare.

Program Evaluation in Integrated Healthcare

PE is useful in examining:
- The extent to which a program is meeting its intended objectives
- Program strengths and weaknesses
- The best ways to refine and improve performance
- The best approaches for implementing and sustaining a program
- Whether a program is worth the investment of time, effort, and money

An example: examining the feasibility of, and reactions of behavioral health providers (BHPs) to, receiving training for a brief alcohol intervention (BAI) via a webinar.

Primary care examples:
- Satisfaction with the services provided by integrated behavioral health providers (Funderburk et al., 2012)
- Whether meeting with a BHP results in clinical improvement (Sadock et al., 2014)
- Feasibility and reactions of BHPs receiving a virtual training for a brief alcohol intervention
- Assessing the reach, effectiveness, adoption, implementation, and maintenance of a depression monitoring service
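The reach, effectiveness, adoption, implementation, and maintenance dimensions mentioned above come from the RE-AIM framework, and in practice two of them (reach and adoption) are usually reported as simple proportions. A minimal sketch, assuming hypothetical counts for a depression monitoring service (the numbers and function name below are illustrative, not data from the presenters' project):

```python
def proportion(numerator: int, denominator: int) -> float:
    """Return a proportion, guarding against an empty denominator."""
    return numerator / denominator if denominator else 0.0

# Hypothetical counts for a depression monitoring service.
eligible_patients = 400   # patients who screened positive and were eligible
enrolled_patients = 180   # patients who actually received monitoring
clinics_approached = 10   # clinics invited to adopt the service
clinics_adopting = 6      # clinics that implemented it

# Reach: proportion of eligible patients who participated.
reach = proportion(enrolled_patients, eligible_patients)
# Adoption: proportion of approached settings that took up the program.
adoption = proportion(clinics_adopting, clinics_approached)

print(f"Reach: {reach:.0%}, Adoption: {adoption:.0%}")  # Reach: 45%, Adoption: 60%
```

The same pattern extends to the other RE-AIM dimensions (e.g., implementation as the proportion of protocol components delivered as intended).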

Well-Designed Program Evaluation

- Strategic planning
- Methodology and design considerations
- Measurement considerations
- Maintaining consistency

Relying on researchers in academic settings to answer lingering questions is impractical, not only because research funding is limited, but because it is important to understand the impact of these programs within real-world settings.

Tips for achieving a well-designed program evaluation: Strategic planning

- Examine and define the "program"
- Use a conceptual framework or the underlying logic behind the program and what it is supposed to be doing (e.g., Kellogg Foundation logic model, RE-AIM)
- Consider the purpose and audience of the PE
- Identify the key question

BAI example: We looked at satisfaction with the webinar and the knowledge learned, but we missed the opportunity to assess outcomes that may have been relevant had we used such a model, such as comfort and confidence when discussing this topic with patients.

Tips for achieving a well-designed program evaluation: Methodology & design

Having a comparison condition can increase confidence about the effects of the program:
- Include another condition: no-treatment control, waitlist, or usual care
- Compare to changes typically observed within the field

Random assignment may not be feasible. Alternatives include:
- Block or cluster randomization (at the clinic or provider level)
- Quasi-experimental designs: nonequivalent group design, multiple baseline design, regression-discontinuity design

BAI example: We did not assess the same outcomes pre- and post-training, limiting our ability to draw conclusions regarding change. We did recruit a diverse sample of providers and included a qualitative portion: open-ended questions to gather feedback about components of the webinar that BHPs liked and did not like.

Study design can significantly enhance the potential conclusions. Quasi-experimental pre/post designs are one alternative. For example, a nonequivalent group design has groups assigned to conditions naturally (those who received the program and those who did not), with comparisons based on information collected at two time points (Bryk & Weisberg, 1977). The multiple baseline design is another methodologically rigorous alternative to RCTs with several practical advantages, such as the ability to focus on one practice and collect comparison data from that same practice (see Hawkins, Sanson-Fisher, Shakeshaft, D'Este, & Green, 2007, for details). A regression-discontinuity design may also be an option for PE within integrated settings. This design allows patients who score at or above a specific cutoff (e.g., Patient Health Questionnaire-9 [PHQ-9] ≥ 10; Kroenke, Spitzer, & Williams, 2001) to be assigned to the program, and those who score under the cutoff (PHQ-9 < 10) to be assigned to the comparison condition (Thistlethwaite & Campbell, 1960).
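The regression-discontinuity assignment rule described above is just a deterministic cutoff on a baseline score. A minimal sketch (the function name is our own; the PHQ-9 ≥ 10 cutoff follows Kroenke, Spitzer, & Williams, 2001):

```python
PHQ9_CUTOFF = 10  # cutoff for moderate depression (Kroenke et al., 2001)

def assign_condition(phq9_score: int, cutoff: int = PHQ9_CUTOFF) -> str:
    """Regression-discontinuity assignment: scores at or above the cutoff
    go to the program; scores below it go to the comparison condition."""
    if not 0 <= phq9_score <= 27:
        raise ValueError("PHQ-9 total scores range from 0 to 27")
    return "program" if phq9_score >= cutoff else "comparison"

# Example: assigning a small, hypothetical patient list.
scores = [4, 9, 10, 15, 22]
assignments = {s: assign_condition(s) for s in scores}
# {4: 'comparison', 9: 'comparison', 10: 'program', 15: 'program', 22: 'program'}
```

Because assignment depends only on the observed baseline score, the rule is transparent and auditable, which is part of what makes the design defensible when randomization is not feasible.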

Tips for achieving a well-designed program evaluation: Measurement

Method of data collection:
- What type of information to collect, and how?
- Consider the strengths and weaknesses of each approach
- Consider feasibility

Quality of measurement:
- Consider reliability, validity, and meaningful change (example: the PHQ-9)
- Supplement self-report data with objective data
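One common way to operationalize "meaningful change" on a measure like the PHQ-9 is the Jacobson-Truax reliable change index (RCI), which scales an observed pre-post difference by the measure's standard error of difference. A sketch under stated assumptions (the SD and reliability values below are illustrative placeholders, not published PHQ-9 norms):

```python
import math

def reliable_change_index(pre: float, post: float,
                          sd_pre: float, reliability: float) -> float:
    """Jacobson & Truax (1991) RCI: (post - pre) / SE of the difference.
    |RCI| > 1.96 suggests change beyond measurement error (p < .05)."""
    se_measurement = sd_pre * math.sqrt(1 - reliability)
    se_difference = math.sqrt(2) * se_measurement
    return (post - pre) / se_difference

# Illustrative example: a patient's PHQ-9 drops from 18 to 8.
rci = reliable_change_index(pre=18, post=8, sd_pre=5.0, reliability=0.86)
if abs(rci) > 1.96:
    print(f"Reliable change (RCI = {rci:.2f})")
```

Pairing an RCI with a clinical cutoff (e.g., crossing below PHQ-9 = 10) distinguishes change that is merely statistically reliable from change that is also clinically meaningful.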

Tips for achieving a well-designed program evaluation: Maintaining consistency in the program

Training is key:
- Orient clinic staff to the measures
- Ensure the program is being delivered as intended
- Make training materials thoroughly detailed so that training is delivered consistently

Monitor program adherence over time:
- Audiotapes or videotapes, fidelity checklists, or direct observations

Applying these tips to your PE

Break into four groups:
- I need help with strategic planning!
- I need help with the methodology/design!
- How can I measure it?
- I just need help (I have no idea where to start)!

Thank you!

Session Evaluation: Please complete and return the evaluation form to the classroom monitor before leaving this session. Thank you!