Improving the Jobs of Direct Care Workers in Long Term Care: Findings from the Better Jobs Better Care Demonstration
Penn State Evaluation Team: Peter Kemper, Diane Brannon, Brigitt Heier, Amy Stott, Monika Setia, Joseph Vasey, Jungyoon Kim, and Candy Warner
June 8, 2008
Presented at the annual meeting of AcademyHealth. We thank The Atlantic Philanthropies, The Robert Wood Johnson Foundation, and the Office of the Assistant Secretary for Planning and Evaluation for funding. (Contract No. HHSP EC)

BJBC Demonstration
Goal: Improve direct care workers’ job quality and reduce turnover
Direct care workers: provide help with personal care
Interventions: BJBC training and technical assistance intended to improve management practices
Where:
– Five state projects
– 124 providers
– Skilled nursing, assisted living, home care, and adult day services

BJBC’s Intended Effects: Basic Framework
BJBC Interventions → Providers (Implementation, Management Practices) → Direct Care Workers (Job Outcomes, Turnover)

BJBC Interventions: Technical Assistance and Training
(Offered across the five state projects: IA, NC, OR, PA, VT)
– Top Management Training: 2 state projects
– Supervisor Training: 3 state projects
– Team Building: 4 state projects
– DCW Career Development: 4 state projects
– Caregiving Skill Development: 3 state projects

Approach to Evaluation
Evaluation not designed with a control group
– Before-after evaluation design and data
– Sought methods of strengthening the design
"Let a thousand flowers bloom" demonstration → interventions not standardized or known in advance
– Measured a range of management practices
– Developed measures of the extent of implementation

Methods for Estimating Effects
Basic approach
– Before-after comparison of means
– Post-intervention trends compared with national trends
Difference-in-difference: compare changes in
– States with and without specific interventions
– Providers that did and did not implement
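A minimal sketch of the difference-in-difference comparison described above, using hypothetical provider-level panel data; the column names and values are illustrative assumptions, not the evaluation's actual code or data:

```python
import pandas as pd

# Hypothetical provider panel: one row per provider per wave, with a
# pre/post indicator and a flag for whether the provider implemented
# the BJBC intervention (e.g., above-median implementation).
df = pd.DataFrame({
    "provider":    [1, 1, 2, 2, 3, 3, 4, 4],
    "post":        [0, 1, 0, 1, 0, 1, 0, 1],   # 0 = Time 1, 1 = Time 2
    "implemented": [1, 1, 1, 1, 0, 0, 0, 0],
    "outcome":     [3.1, 3.6, 2.9, 3.5, 3.0, 3.1, 2.8, 2.9],
})

# Mean outcome in each (implemented, post) cell.
means = df.groupby(["implemented", "post"])["outcome"].mean()

# Difference-in-difference: change among implementers minus change
# among non-implementers.
did = (means.loc[(1, 1)] - means.loc[(1, 0)]) - (means.loc[(0, 1)] - means.loc[(0, 0)])
print(f"DiD estimate: {did:.2f}")
```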

Analyses Presented
BJBC Interventions
Providers:
1. Implementation
2. Management Practices
Direct Care Workers:
3. Job Outcomes
4. Turnover

Methods Used in Impact Analyses

                       Before-After   States with and        Providers that did
                       or Trend       without Intervention   and did not implement
Management Practices   √                                     √
Job Outcomes           √                                     √
Turnover               √                                     √

Data
– Telephone interviews with project managers
– Survey of clinical managers
– Survey of frontline supervisors
– Survey of direct care workers
– Hiring and termination information system

Data Used in Analyses

                       Practice   Clinical              Direct Care   Hiring and
                       Manager    Manager    Supervisor Worker        Termination
Implementation         √          √          √
Management Practices                                    √
Job Outcomes                                            √
Turnover                                                              √

Measuring Implementation in Formative Evaluations: Using Data from Multiple Perspectives
Peter Kemper, Brigitt Heier, Joe Vasey, Diane Brannon
June 8, 2008
Presented at the annual meeting of AcademyHealth. The authors are grateful for support from The Atlantic Philanthropies, The Robert Wood Johnson Foundation, and the Office of the Assistant Secretary for Planning and Evaluation (Contract No. HHSP EC)

Motivation
– Variation in implementation observed in early site visits
– Mid-course correction: add implementation measures
– Goal: develop a summary index of the extent of provider implementation for use in impact analyses

Measures from Three Perspectives
– Practice Manager (state project level)
– Clinical Manager (provider level)
– Frontline Supervisors (provider level)

Practice Manager Perspective
"Make a mark on the scale that best describes this provider's current degree of implementation"
– 0: Implementation of interventions has not yet started
– 10: Interventions are fully implemented and sustainable

Clinical Manager Perspective
"Indicate the level of progress your organization has made in implementing the most important intervention"
– 0: Implementation of the intervention has not yet started
– 10: The intervention is fully implemented and sustainable
"The programs that are part of BJBC have been well executed in your organization"
– Five-point scale from strongly disagree to strongly agree

Frontline Supervisor Perspective
"The programs that are part of BJBC have been well executed in your organization"
– Five-point scale from strongly disagree to strongly agree
– Averaged across supervisors within each provider

Methods
Exploratory factor analysis
– Principal components extraction method
– Extracted the component with an eigenvalue greater than 1
– Included items if the factor loading was 0.6 or greater
Imputed values when one or two items were missing, using a maximum likelihood procedure
Sample size: 92 providers
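A sketch of the extraction step under the stated rules (principal components from the correlation matrix, retain components with eigenvalue > 1, keep items with loadings of 0.6 or greater). The four-indicator data matrix here is simulated for illustration, and the imputation step is omitted:

```python
import numpy as np

# X: hypothetical (92 providers x 4 measures) matrix standing in for the
# four standardized implementation measures.
rng = np.random.default_rng(0)
common = rng.normal(size=(92, 1))              # shared "implementation" signal
X = common + 0.8 * rng.normal(size=(92, 4))    # four noisy indicators
X = (X - X.mean(axis=0)) / X.std(axis=0)       # standardize each measure

# Principal components extraction from the correlation matrix.
corr = np.corrcoef(X, rowvar=False)
eigvals, eigvecs = np.linalg.eigh(corr)        # ascending order
order = np.argsort(eigvals)[::-1]              # re-sort descending
eigvals, eigvecs = eigvals[order], eigvecs[:, order]

# Retain components with eigenvalue > 1; loadings = eigenvector * sqrt(eigenvalue).
keep = eigvals > 1
loadings = eigvecs[:, keep] * np.sqrt(eigvals[keep])
loadings *= np.sign(loadings[0])               # eigenvector sign is arbitrary; fix it

print("eigenvalues:", np.round(eigvals, 2))
print("variance explained by first component:", round(eigvals[0] / 4, 2))
print("loadings (retain items >= 0.6):", np.round(loadings[:, 0], 2))
```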

Factor Loadings

Measure                                              Loading
Implementation score (Practice Manager)              .70
Implementation score (Clinical Manager)              .76
Execution of BJBC programs (Clinical Manager)        .80
Execution of BJBC programs (Frontline Supervisor)    .76

The factor has an eigenvalue of 2.2 and explains 55% of the variance.

Distribution of Factor Scores
– Mean: .00
– Median: -.05
– Minimum:
– Maximum: 2.18
– Skewness: -.19

Factor Score Re-scaled to 0–1 Range
– Mean: 0.56
– Median: 0.56
– Minimum: 0.00
– Maximum: 1.00
– Skewness: -.23
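The 0–1 re-scaling is a standard min–max transformation; a minimal sketch (the `scores` array is hypothetical):

```python
import numpy as np

scores = np.array([-2.0, -0.05, 0.0, 1.1, 2.18])   # hypothetical factor scores
rescaled = (scores - scores.min()) / (scores.max() - scores.min())
print(np.round(rescaled, 2))                        # all values now lie in [0, 1]
```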

Implementation Index Is Related to Underlying Measures
[Figure: implementation index (Y) plotted against the underlying measures (X), with fitted regression line Y = … X]

How We'll Use the Index in Analyzing Effects
Difference-in-difference approach
– Divide providers into two groups: above and below median implementation
– Compare the difference between the two groups in the differences between Time 2 and Time 1
Extend to a continuous measure in regression (see the sketch below)
Note: The method does not identify effects but may identify the absence of effects
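A sketch of the regression extension with the continuous implementation index, expressed as a post-by-index interaction in `statsmodels` formula syntax; the data and variable names (`impl` for the 0–1 index) are hypothetical:

```python
import pandas as pd
import statsmodels.formula.api as smf

# Hypothetical provider-wave data: the post:impl interaction is the
# difference-in-difference term of interest.
df = pd.DataFrame({
    "outcome": [3.1, 3.6, 2.9, 3.5, 3.0, 3.1, 2.8, 2.9],
    "post":    [0, 1, 0, 1, 0, 1, 0, 1],
    "impl":    [0.9, 0.9, 0.8, 0.8, 0.3, 0.3, 0.2, 0.2],  # 0-1 implementation index
})

# outcome ~ post + impl + post:impl; the post:impl coefficient estimates
# how the Time 1 -> Time 2 change varies with the implementation index.
model = smf.ols("outcome ~ post * impl", data=df).fit()
print(model.params)
```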

Summary
– Assessments of implementation from the three perspectives are similar
– A summary index was developed successfully
Uses of the implementation index:
– Will be used to strengthen the analysis of BJBC effects
– Most useful in confirming an absence of effects
– Could be used to analyze factors affecting implementation