Measuring success in state home visiting Pew Home Visiting Campaign Data for Performance Initiative October 15, 2015

2 Data for Performance Initiative The Data for Performance Initiative was designed to support states in effectively using data to improve practice and to enhance states' capacity to measure the shared impact of home visiting programs in their states. The Initiative: –Identifies common home visiting indicators for performance monitoring and accountability purposes –Builds on and leverages existing data collection in states –Focuses on measuring progress toward shared outcomes

3 Agenda for today Recommended performance indicators and measurement approaches Strengthening the rigor of the measures – beyond performance measurement Some differences between the DPI approach and the MIECHV benchmark revisions

4 Value to the field Provides focus for collective action Improves understanding of what works for whom Highlights areas for innovation Provides pathway to assess future strategies Strengthens the evidence base

5 Criteria for selecting performance indicators Universally applicable across models and programs (with, perhaps, an exception that recognizes prenatal vs. postnatal enrollment) Achievable by the program rather than aspirational or heavily dependent on the performance of others Resonates with policymakers and the engaged public Important policy goal worthy of public investment Capitalizes on state administrative data, thereby reducing data collection burden for local programs

6 Indicators
Maternal health and achievement: Maternal depression screening/referral; Postpartum visit; Interbirth interval; Maternal educational achievement
Child health, development and safety: Child development screening/referral; Child development gains (data development agenda); Child maltreatment; Well-child visits; Maternal smoking or tobacco use
Parental skills and capacity: Breastfeeding; Parental capacity (data development agenda)

7 Maternal Depression Screening/Referral Indicator: Percent of mothers participating in home visiting who are referred for follow-up evaluation and intervention as indicated by depression screening with a validated tool. Target population: Mothers participating in a home visiting program (prenatally and following birth). Numerator: Number of mothers participating in home visiting who received a maternal depression screening using a validated tool that indicated the need for referral and who were referred for follow-up evaluation and intervention. Denominator: Number of mothers participating in home visiting who received a maternal depression screening with a validated tool and whose screening results indicated the need for a referral. Data sources: Program data - screening results.
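The numerator/denominator logic above can be computed directly from program screening records. A minimal sketch in Python, assuming hypothetical field names (validated_tool, screen_positive, referred) and illustrative records rather than any prescribed data format:

```python
# Illustrative calculation of the maternal depression screening/referral indicator.
# Field names and records are hypothetical examples, not a prescribed data format.

def depression_referral_rate(records):
    """Percent of mothers with a positive screen (validated tool) who were referred."""
    needing_referral = [r for r in records
                        if r["validated_tool"] and r["screen_positive"]]
    if not needing_referral:
        return None  # indicator undefined when no one screened positive
    referred = [r for r in needing_referral if r["referred"]]
    return 100.0 * len(referred) / len(needing_referral)

screens = [
    {"mother_id": 1, "validated_tool": True,  "screen_positive": True,  "referred": True},
    {"mother_id": 2, "validated_tool": True,  "screen_positive": True,  "referred": False},
    {"mother_id": 3, "validated_tool": True,  "screen_positive": False, "referred": False},
    {"mother_id": 4, "validated_tool": False, "screen_positive": True,  "referred": True},  # excluded: tool not validated
]

print(depression_referral_rate(screens))  # 50.0
```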

8 Suggestions for enhancing data quality and utility Collect actual scores from the screening whenever feasible. Consider measuring the percent with completed referrals and changes in depression status as part of focused quality improvement, research, and/or evaluation efforts. Consider recommending a common, validated, statewide depression screening tool for use across home visiting programs and/or models. Measure at 2 or more points in time (e.g., prenatal and postpartum periods; intake and discharge). For women already in depression treatment, a referral would not be indicated, but these women should be included in efforts to measure changes in depression status. Augment with quality improvement measures developed by the Home Visiting Collaborative Improvement and Innovation Network (coiin.edc.org) (e.g., percent of women referred to services with one or more evidence-based service contacts, percent of women with improvement of depressive symptoms).

9 Postpartum Health Care Visit Indicator: Percent of mothers enrolled in home visiting prenatally or within 30 days of giving birth who receive a postpartum visit with a health provider within 2 months (60 days) following birth. Target population: Mothers enrolled in home visiting prenatally or within 30 days of giving birth. Numerator: Number of mothers enrolled in home visiting prenatally or within 30 days of giving birth who completed a postpartum visit with a health provider within 2 months (60 days) following birth. Denominator: Number of mothers enrolled in home visiting prenatally or within 30 days of giving birth who are at least 2 months (60 days) postpartum. Data sources: Program data – participant self-report, confirmed by medical records when possible.
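A minimal sketch of this calculation, assuming hypothetical field names and dates and that the enrollment criterion (prenatal or within 30 days of birth) has already been applied to the records:

```python
from datetime import date

# Illustrative calculation of the postpartum health care visit indicator.
# Field names and dates are hypothetical.

def postpartum_visit_rate(mothers, as_of):
    """Percent of mothers at least 60 days postpartum with a visit within 60 days of birth."""
    eligible = [m for m in mothers if (as_of - m["birth_date"]).days >= 60]
    if not eligible:
        return None
    met = [m for m in eligible
           if m["postpartum_visit_date"] is not None
           and (m["postpartum_visit_date"] - m["birth_date"]).days <= 60]
    return 100.0 * len(met) / len(eligible)

sample = [
    {"birth_date": date(2015, 3, 1), "postpartum_visit_date": date(2015, 4, 10)},  # within 60 days
    {"birth_date": date(2015, 3, 1), "postpartum_visit_date": date(2015, 6, 1)},   # too late
    {"birth_date": date(2015, 9, 20), "postpartum_visit_date": None},              # not yet 60 days postpartum, excluded
]

print(postpartum_visit_rate(sample, as_of=date(2015, 10, 15)))  # 50.0
```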

10 Suggestions for enhancing data quality and utility Consider using a standardized question from the Pregnancy Risk Assessment Monitoring System (PRAMS) or another survey. Use opportunities to compare data on home visiting participants with those from Medicaid, health plans, PRAMS, or other sources. Consider measuring annual well-woman visits for home visiting programs that continue over a period of years.

11 Interbirth Interval Indicator: Percent of mothers participating in home visiting before the target child is 3 months old who have an interbirth interval of at least 18 months. Target population: Mothers participating in home visiting before the target child is 3 months old. Numerator: Number of mothers participating in a home visiting program before the target child is 3 months old who had an interbirth interval of at least 18 months. Denominator: Number of mothers participating in a home visiting program before the target child is 3 months old. Data sources: Administrative data—birth certificates
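A minimal sketch of the interbirth interval calculation from birth-certificate dates. Field names are hypothetical, and one possible convention is assumed for mothers with no recorded subsequent birth (counted as meeting the threshold); states may prefer to restrict the denominator to mothers with sufficient follow-up time:

```python
from datetime import date

# Illustrative calculation of the interbirth interval indicator from linked
# birth-certificate dates. Field names and the handling of mothers with no
# subsequent birth are assumptions, not a prescribed method.

MONTH_DAYS = 30.44  # average month length used to convert days to months

def interbirth_interval_rate(mothers, threshold_months=18):
    def meets_threshold(m):
        if m["next_birth_date"] is None:
            return True  # convention assumed here: no subsequent birth counts as meeting the threshold
        interval_days = (m["next_birth_date"] - m["target_child_birth_date"]).days
        return interval_days / MONTH_DAYS >= threshold_months
    if not mothers:
        return None
    return 100.0 * sum(meets_threshold(m) for m in mothers) / len(mothers)

sample = [
    {"target_child_birth_date": date(2013, 1, 15), "next_birth_date": date(2015, 3, 1)},  # ~24 months
    {"target_child_birth_date": date(2014, 2, 1),  "next_birth_date": date(2015, 1, 10)}, # ~11 months
    {"target_child_birth_date": date(2014, 6, 1),  "next_birth_date": None},
]

print(round(interbirth_interval_rate(sample), 1))  # 66.7
```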

12 Suggestions for enhancing data quality and utility Use opportunities to compare data on home visiting participants with those from Medicaid, health plans, PRAMS, vital statistics, or other sources. Measure longer intervals such as 24 or 36 months or consider measuring average interbirth intervals.

13 Maternal Educational Achievement Indicator: Percent of mothers who entered home visiting without high school or GED completion who have enrolled in or completed high school or the equivalent. Target population: Mothers participating in home visiting. Numerator: Number of mothers who enter the program without a high school diploma or GED certificate who are either still enrolled in school or a GED program or who have successfully completed high school or received a GED certificate. Denominator: Number of mothers who enter a home visiting program without high school or GED completion. Data sources: Program data – participant self-report
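A minimal sketch of this calculation from self-reported status; the status codes shown are hypothetical examples:

```python
# Illustrative calculation of the maternal educational achievement indicator.
# Status codes and records are hypothetical examples of self-reported program data.

def education_progress_rate(mothers):
    """Percent of mothers entering without HS/GED who are enrolled in or have completed
    high school or an equivalent program."""
    denominator = [m for m in mothers if not m["had_hs_or_ged_at_entry"]]
    if not denominator:
        return None
    numerator = [m for m in denominator
                 if m["current_status"] in {"enrolled_hs", "enrolled_ged",
                                            "completed_hs", "completed_ged"}]
    return 100.0 * len(numerator) / len(denominator)

sample = [
    {"had_hs_or_ged_at_entry": False, "current_status": "enrolled_ged"},
    {"had_hs_or_ged_at_entry": False, "current_status": "not_enrolled"},
    {"had_hs_or_ged_at_entry": True,  "current_status": "completed_hs"},  # excluded from denominator
]

print(education_progress_rate(sample))  # 50.0
```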

14 Suggestions for enhancing data quality and utility Consider separate analyses for GED completion and high school graduation. Consider the use of a validated question from the Behavioral Risk Factor Surveillance System (BRFSS). For participants who have already attained high school or GED completion, states may additionally choose to measure enrollment or retention in work training, 2- and 4-year college degree programs, and/or increases in employment (e.g., hours or wages).

15 Child Development Screening Indicator: Percent of children participating in home visiting who are referred for follow-up evaluation and intervention as indicated by developmental screening with the Ages and Stages Questionnaire (ASQ). Target population: Children participating in home visiting. Numerator: Number of children participating in home visiting who received developmental screening using the ASQ that indicated the need for referral and who were referred for follow-up evaluation and intervention as indicated. Denominator: Number of children participating in home visiting who received a developmental screening with the ASQ and whose screening results indicated the need for referral. Data sources: Program data – ASQ screening results

16 Suggestions for enhancing data quality and utility Consider measuring completed referrals or follow-up interventions as part of quality improvement, research, or evaluation. Collect actual ASQ scores (instead of adopting a pass or fail approach) and use the ASQ-recommended cutoff to determine whether referral is indicated, not a score set by the state or a program. Collect data at multiple points in time; however, because this is a screening and not a diagnostic evaluation, use caution in reporting change over time. Use opportunities to compare data on home visiting participants with those from Medicaid, health providers, early care and education, early intervention, child health surveys, or other sources. States may choose to collect data regarding the ASQ: Social Emotional as well as the ASQ, to screen for social-emotional risks and concerns. Augment with quality improvement measures developed by the Home Visiting Collaborative Improvement and Innovation Network (e.g., percent of children with parental concerns about development, percent of children referred to early intervention and deemed eligible).
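Building on the suggestion to collect raw ASQ scores and apply the ASQ-recommended cutoffs, a minimal sketch of deriving referral status and the referral rate. The cutoff values below are placeholders for illustration only; actual cutoffs vary by ASQ age interval and domain and should come from the ASQ scoring materials:

```python
# Illustrative referral flag from raw ASQ domain scores. The cutoff values below are
# placeholders, NOT the published ASQ cutoffs, which vary by questionnaire age
# interval and domain and must be taken from the ASQ scoring materials.

PLACEHOLDER_CUTOFFS = {  # hypothetical cutoffs for a single age interval
    "communication": 30.0,
    "gross_motor": 32.0,
    "fine_motor": 28.0,
    "problem_solving": 29.0,
    "personal_social": 27.0,
}

def referral_indicated(domain_scores, cutoffs=PLACEHOLDER_CUTOFFS):
    """Referral is indicated when any domain score falls below its cutoff."""
    return any(domain_scores[d] < cutoffs[d] for d in cutoffs)

def asq_referral_rate(children):
    """Percent of children whose screen indicated referral who were actually referred."""
    indicated = [c for c in children if referral_indicated(c["scores"])]
    if not indicated:
        return None
    referred = [c for c in indicated if c["referred"]]
    return 100.0 * len(referred) / len(indicated)

sample = [
    {"scores": {"communication": 25, "gross_motor": 45, "fine_motor": 40,
                "problem_solving": 35, "personal_social": 30}, "referred": True},
    {"scores": {"communication": 50, "gross_motor": 20, "fine_motor": 40,
                "problem_solving": 35, "personal_social": 30}, "referred": False},
    {"scores": {"communication": 50, "gross_motor": 45, "fine_motor": 40,
                "problem_solving": 35, "personal_social": 30}, "referred": False},  # no referral indicated
]

print(asq_referral_rate(sample))  # 50.0
```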

17 Child Maltreatment Indicator: Percent of children participating in a home visiting program reported for child abuse and neglect. Target population: Children participating in the home visiting program. Numerator: Number of children participating in home visiting with a reported case of child maltreatment following enrollment in the program. Denominator: Number of children participating in home visiting. Data sources: Linkage of home visiting program data to child protective services administrative data.

18 Suggestions for enhancing data quality and utility Use a uniform exposure period (e.g., number of children reported within 3 years following program enrollment). Also, aim to extend the follow-up period as long as possible (research indicates that positive impacts on child maltreatment rates may not be evident in the near term). Consider tracking the dates of all reports involving the target population, along with the type(s) of child maltreatment (e.g., abuse, neglect) reported, as research suggests that home visiting may reduce repeat reports, but not necessarily initial reports, and that it may be more effective at reducing some types of maltreatment than others. Although substantiated child maltreatment reports are limited as a stand-alone measure, states may also wish to report the percentage of children participating in home visiting who are the subjects of 1 or more substantiated child maltreatment reports. Consider using a comparison group to determine if the proportion of participants with subsequent child maltreatment reports is comparable to a similar group of parents of young children who were not enrolled in home visiting. Maltreatment rates may be inflated for participants because of better detection by home visitors. Control for this bias by tracking the number of reports filed by the home visitor.
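A minimal sketch of the uniform exposure-period approach suggested above, assuming hypothetical field names, a fixed evaluation date, and a 3-year window; real calculations would use linked child protective services records:

```python
from datetime import date

# Illustrative calculation of the child maltreatment indicator with a uniform
# exposure window. Field names and dates are hypothetical.

def maltreatment_report_rate(children, window_days=3 * 365):
    """Percent of children with at least one report within the exposure window,
    limited to children enrolled at least window_days ago."""
    today = date(2015, 10, 15)  # evaluation date, fixed here for reproducibility
    observed = [c for c in children if (today - c["enrollment_date"]).days >= window_days]
    if not observed:
        return None
    reported = [c for c in observed
                if any(0 <= (r - c["enrollment_date"]).days <= window_days
                       for r in c["report_dates"])]
    return 100.0 * len(reported) / len(observed)

sample = [
    {"enrollment_date": date(2011, 5, 1), "report_dates": [date(2012, 7, 1)]},  # report within window
    {"enrollment_date": date(2011, 5, 1), "report_dates": []},                  # no report
    {"enrollment_date": date(2014, 1, 1), "report_dates": []},                  # insufficient follow-up, excluded
]

print(round(maltreatment_report_rate(sample), 1))  # 50.0
```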

19 Well-child Visits Indicator: Percent of children participating in home visiting who received their last recommended visit based on the American Academy of Pediatrics’ Bright Futures schedule. Target population: Children participating in home visiting. Numerator: Number of children participating in home visiting who received their last recommended well-child visit since enrollment, based on the American Academy of Pediatrics’ Bright Futures schedule. Denominator: Number of children participating in home visiting. Data sources: Program data - parent report to home visitor confirmed by medical records when possible

20 Suggestions for enhancing data quality and utility The question is ideally asked at each home visit. Consider use of a validated, standardized question from the National Child Health Survey or another national survey. Use opportunities to compare data on home visiting participants with those from Medicaid, health plans, pediatricians, child health surveys, or other sources. Note that there are 6 infant visits outside the birth hospital. An additional 7 visits are recommended before the 5th birthday. The American Academy of Pediatrics recommends catching up at any point, so that the content of missed visits can be provided as soon as possible.
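A minimal sketch of determining whether a child has received the most recently recommended visit. The visit ages below only approximate the Bright Futures periodicity schedule (6 infant visits after the birth hospitalization, 7 more before age 5) and should be confirmed against current AAP/Bright Futures materials:

```python
# Illustrative check of whether a child has received the most recently recommended
# well-child visit. The APPROX_SCHEDULE_MONTHS list is an approximation for
# illustration, not the authoritative Bright Futures schedule.

APPROX_SCHEDULE_MONTHS = [0.25, 1, 2, 4, 6, 9, 12, 15, 18, 24, 30, 36, 48]

def last_recommended_visit(age_months, schedule=APPROX_SCHEDULE_MONTHS):
    """Age (in months) of the most recent recommended visit for a child of this age."""
    due = [m for m in schedule if m <= age_months]
    return max(due) if due else None

def received_last_visit(child):
    """True if the child has a completed visit at or after the last recommended visit age."""
    due_age = last_recommended_visit(child["age_months"])
    if due_age is None:
        return True  # no visit due yet
    return any(v >= due_age for v in child["completed_visit_ages_months"])

sample = {"age_months": 10, "completed_visit_ages_months": [1, 2, 4, 6, 9.5]}
print(received_last_visit(sample))  # True: the visit at 9.5 months covers the 9-month visit
```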

21 Maternal Smoking or Tobacco Use Indicator: Percent of mothers participating in home visiting who quit smoking or tobacco use following program enrollment. Target population: Mothers participating in home visiting who smoked or used tobacco at enrollment. Numerator: Number of mothers participating in home visiting who quit smoking or tobacco use. Denominator: Number of mothers participating in home visiting who smoked or used tobacco at enrollment. Data sources: Program data – participant self-report
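A minimal sketch of the quit-rate calculation from self-reported tobacco status at enrollment and at the most recent assessment; field names are hypothetical:

```python
# Illustrative quit-rate calculation from self-reported tobacco status.
# Field names and records are hypothetical examples.

def quit_rate(mothers):
    """Percent of mothers who used tobacco at enrollment and report no current use."""
    users_at_enrollment = [m for m in mothers if m["tobacco_at_enrollment"]]
    if not users_at_enrollment:
        return None
    quit = [m for m in users_at_enrollment if not m["tobacco_at_latest_assessment"]]
    return 100.0 * len(quit) / len(users_at_enrollment)

sample = [
    {"tobacco_at_enrollment": True,  "tobacco_at_latest_assessment": False},  # quit
    {"tobacco_at_enrollment": True,  "tobacco_at_latest_assessment": True},
    {"tobacco_at_enrollment": False, "tobacco_at_latest_assessment": False},  # excluded from denominator
]

print(quit_rate(sample))  # 50.0
```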

22 Suggestions for enhancing data quality and utility Measure current smoking and/or tobacco use at multiple points in time: the prenatal period, postpartum at 2 months (if applicable), and/or annually thereafter; or at enrollment and exit from home visiting. Consider collecting data at subsequent intervals. Consider using a validated question about smoking from PRAMS, the National Health Interview Survey, or the Behavioral Risk Factor Surveillance System (BRFSS). Consider counting the number of cigarettes smoked over a given period of time in order to measure reduced tobacco use in addition to quit rate. This measure does not include e-cigarettes because federal guidelines are pending.

23 Breastfeeding Indicator: Percent of mothers enrolled in home visiting during pregnancy who initiate and continue breastfeeding for at least 3 months. Target population: Mothers enrolled in home visiting during pregnancy who give birth to a live infant. Numerator: Number of mothers enrolled in home visiting during pregnancy who initiate and continue breastfeeding for at least 3 months. Denominator: Number of mothers enrolled in home visiting during pregnancy who give birth to a live infant. Data sources: Program data – participant self-report and home visitor observation.
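A minimal sketch of the initiation-plus-3-months calculation from self-reported start and stop dates, with "at least 3 months" approximated as 90 days; field names and that approximation are assumptions:

```python
from datetime import date

# Illustrative calculation of breastfeeding initiation and continuation for at
# least ~90 days. Field names, dates, and the 90-day approximation are assumptions.

def breastfeeding_3_month_rate(mothers, as_of=date(2015, 10, 15), min_days=90):
    """Percent of mothers with a live birth who initiated breastfeeding and continued
    for at least min_days (still breastfeeding counts once min_days have elapsed)."""
    if not mothers:
        return None
    def meets(m):
        if m["breastfeeding_start"] is None:
            return False  # never initiated
        end = m["breastfeeding_stop"] or as_of
        return (end - m["breastfeeding_start"]).days >= min_days
    return 100.0 * sum(meets(m) for m in mothers) / len(mothers)

sample = [
    {"breastfeeding_start": date(2015, 2, 1), "breastfeeding_stop": date(2015, 7, 1)},  # ~150 days
    {"breastfeeding_start": date(2015, 5, 1), "breastfeeding_stop": date(2015, 6, 1)},  # ~31 days
    {"breastfeeding_start": None, "breastfeeding_stop": None},
]

print(round(breastfeeding_3_month_rate(sample), 1))  # 33.3
```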

24 Suggestions for enhancing data quality and utility Consider measuring breastfeeding initiation using birth certificate or program data. For program data collection, consider use of a PRAMS survey question. Consider measuring exclusive breastfeeding. Consider measuring breastfeeding for duration(s) longer than 3 months (e.g., 6 months or 1 year) or consider measuring average duration of breastfeeding. Augment with quality improvement measures developed by the Home Visiting Collaborative Improvement and Innovation Network (e.g., percent of women who report intention to breastfeed, percent who initiate breastfeeding, percent of women exclusively breastfeeding at 3 or 6 months).

25 Summary of suggestions for enhancing data quality and utility Collect data at the participant level Collect data in the most basic, raw format Measure at least two time points for outcome indicators Determine dosage or threshold Use valid and reliable measurement tools Construct a comparison group of comparable families who did not receive home visiting
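A minimal sketch combining two of these suggestions, measuring an outcome at two time points and comparing against a comparison group; the records and the outcome field are hypothetical:

```python
# Illustrative two-time-point and comparison-group rate calculation.
# Records and the "positive_depression_screen" field are hypothetical examples.

def rate(records, field, timepoint):
    """Percent of records at a given time point where the outcome field is True."""
    matching = [r for r in records if r["timepoint"] == timepoint]
    if not matching:
        return None
    return 100.0 * sum(r[field] for r in matching) / len(matching)

participants = [
    {"timepoint": "intake", "positive_depression_screen": True},
    {"timepoint": "intake", "positive_depression_screen": True},
    {"timepoint": "exit",   "positive_depression_screen": True},
    {"timepoint": "exit",   "positive_depression_screen": False},
]
comparison = [
    {"timepoint": "exit", "positive_depression_screen": True},
    {"timepoint": "exit", "positive_depression_screen": True},
]

print(rate(participants, "positive_depression_screen", "intake"))  # 100.0
print(rate(participants, "positive_depression_screen", "exit"))    # 50.0
print(rate(comparison, "positive_depression_screen", "exit"))      # 100.0
```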

26 Descriptive Factors Candidate indicator topics and suggested examples:
Model or program type: Name/title, site
Child characteristics: Date of birth, full-term or preterm birth
Maternal characteristics: Date of birth, number of prior births, race, ethnicity, native language, whether mother was in contact with the father at time of program enrollment, residential address or ZIP code
Participant service characteristics: Date of first home visit, date of subsequent home visits, date and reason for termination of enrollment, including successful transitions and early terminations
Program data: Number of home visits, supervisor-to-home visitor ratio, average caseload
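A sketch of a participant-level record carrying these descriptive factors; field names and types are illustrative, not a prescribed state data standard, and program-level items (supervisor-to-home visitor ratio, average caseload) would typically sit in a separate program table:

```python
from dataclasses import dataclass, field
from datetime import date
from typing import Optional

# Illustrative participant-level record for the descriptive factors above.
# Field names and types are hypothetical, not a prescribed data standard.

@dataclass
class ParticipantRecord:
    model_name: str                      # model or program type: name/title
    site: str
    child_dob: date                      # child characteristics
    preterm_birth: bool
    maternal_dob: date                   # maternal characteristics
    prior_births: int
    race: str
    ethnicity: str
    native_language: str
    father_contact_at_enrollment: bool
    zip_code: str
    first_visit_date: date               # participant service characteristics
    visit_dates: list = field(default_factory=list)
    termination_date: Optional[date] = None
    termination_reason: Optional[str] = None

example = ParticipantRecord(
    model_name="Example model", site="Example site",
    child_dob=date(2015, 1, 10), preterm_birth=False,
    maternal_dob=date(1993, 4, 2), prior_births=1,
    race="Example", ethnicity="Example", native_language="English",
    father_contact_at_enrollment=True, zip_code="00000",
    first_visit_date=date(2015, 2, 1),
    visit_dates=[date(2015, 2, 1), date(2015, 3, 1)],
)
print(example.prior_births)  # 1
```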

27 Data Development Agenda Develop feasible and reliable measures in key domains –Parental capacity –Child development gains Test feasibility of processes and options for increasing the quality and utility of data