Building Capacity to Conduct Scientifically and Culturally Rigorous Evaluations in Tribal Communities through the Tribal Home Visiting Evaluation Institute.

Presentation transcript:

Building Capacity to Conduct Scientifically and Culturally Rigorous Evaluations in Tribal Communities through the Tribal Home Visiting Evaluation Institute. The Way Forward: ACF Research with American Indians and Alaska Natives, April 2014. Erin Geary, Kate Lyon, and Julie Morales, James Bell Associates, Inc.

Overview: What is the Tribal Home Visiting Evaluation Institute (TEI)? What is the Tribal Maternal, Infant, and Early Childhood Home Visiting program? How does TEI build capacity in Tribal communities? How does TEI's approach align with the Children's Bureau's (CB) Roadmap for Collaborative and Effective Evaluation in Tribal Communities? Questions. (Photo: Port Gamble S'Klallam Tribe)

TEI consists of: James Bell Associates, Inc.; Johns Hopkins Bloomberg School of Public Health, Center for American Indian Health; University of Colorado School of Public Health, Centers for American Indian and Alaska Native Mental Health; and MDRC.

TEI Federal partners: Office of Planning, Research and Evaluation; Administration for Children and Families, Office of the Assistant Secretary for Early Childhood Development; Office of Child Care.

Tribal Maternal, Infant, and Early Childhood Home Visiting Program
- Administered by ACF in collaboration with HRSA.
- Funded through the ACA; MIECHV includes a 3% set-aside for the tribal program.
- 25 cooperative agreements awarded to Tribes, Tribal consortia, Tribal organizations, and urban Indian organizations.
- 5-year grants that begin with a needs assessment and a planning year.
- Three cohorts: 13 awarded in FY 2010, 6 in FY 2011, and 6 in FY 2012.
- Grantees must report to ACF on performance measures, conduct rigorous evaluation, and engage in continuous quality improvement activities.
(Photo: Confederated Tribes of Siletz Indians)

Tribal MIECHV program goals
- Supporting the development of healthy, happy, and successful AIAN children and families.
- Implementing high-quality, culturally relevant, evidence-based home visiting programs in AIAN communities.
- Expanding the evidence base around home visiting interventions for Native populations.
- Supporting and strengthening cooperation and coordination and promoting linkages among various early childhood programs, resulting in coordinated, comprehensive early childhood systems.
(Photo: Wovoka Monument, Yerington Paiute Tribe)

TEI provides guidance on: tracking and reporting on benchmarks (i.e., performance measures); rigorous evaluation; data systems; continuous quality improvement; and ethical dissemination and knowledge translation. (Photo: Taos Pueblo)

Three Data Requirements
- Benchmarks: demonstrate performance improvement over time. Legislatively mandated; grantees develop their own performance measures and indicators; no client-level data are reported; 36 constructs. TEI helps grantees develop a benchmark plan, prepare for and conduct data collection, and analyze and report data to ACF; data systems and data management are also TA topics.
- Rigorous evaluation: answer a focused evaluation question using rigorous research methods. Grantees select a question using a CBPR approach and use a rigorous design to answer it, focusing on program impact, adaptations, or implementation strategy. TEI helps grantees develop an evaluation question and design using the PICO approach and provides TA on developing IRB protocols, analysis, and ethical dissemination of results.
- Continuous Quality Improvement (CQI): use data to identify and test changes to improve the program. Grantees select a CQI topic, such as screening rates, family retention, or breastfeeding initiation, and use benchmark or other program data in a collaborative process to make data-driven improvements. TEI assists grantees in preparing for and conducting Plan-Do-Study-Act cycles.

All three activities (benchmarks, local rigorous evaluation, and continuous quality improvement) are locally defined. Grantees make decisions about the data to collect based on community priorities, and the same data can be used for multiple purposes; the key difference is how the data are used and the comparisons made. TEI provides TA on all three activities.
Speaker notes: We've learned that we need to be really clear about explaining the distinctions among the three, because everyone uses terms like "evaluation" and "research" a little differently. To some people, "evaluation" sounds more like the benchmark component; when we describe the rigorous evaluation component, it sounds more like research to them.

TEI Collaborative Process: Federal Home Visiting Team; DOHVE (State MIECHV TA provider); consultants; TEI; Federal Project Officers; Tribal Early Childhood Research Center; model developers; VisTA.

A relational technical assistance process: bi-directional flow of information between the Grantee Home Visiting Team and the Tribal Evaluation Institute.
- Federal and TA input: Federal Team; TEI liaisons; Tribal Home VisTA (programmatic TA); Tribal Early Childhood Research Center.
- Community input: Community Advisory Board; evaluator(s); Tribal government or leadership; elders or cultural advisors; Project Director, Coordinator, and other staff.

TA modalities between the Grantee Home Visiting Team and the Tribal Evaluation Institute: conference calls, site visits, grantee meetings, toolkits, webinars, and feedback on plans.

Developing and implementing a performance monitoring system (benchmarks)
Benchmark plan cycle: draft the benchmark plan; send it to the TEI liaison; TEI develops feedback; receive written feedback; hold a review call; consult with the community.
- Grantees develop and operationalize performance measures that correspond to 36 Federally mandated benchmark constructs.
- Grantees select appropriate measures that correspond to community priorities and provide useful data for continuous quality improvement.
- This is an iterative process of collaborating on the benchmark plan as a program team, engaging with the community, and working with TEI.
Speaker notes: For bullet 1, provide an example of different ways to operationalize breastfeeding; for bullet 2, provide an example of selecting immunizations as a measure related to well-child visits (or some other example).
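The speaker notes above mention operationalizing breastfeeding as a locally defined benchmark measure. As an illustration only, the minimal Python sketch below shows one hypothetical way such a measure (percent of families still breastfeeding at six months) might be computed from visit-level records; the field names and data layout are invented for this example and do not represent TEI's or any grantee's actual data system.

```python
# Minimal sketch (hypothetical data layout): computing one locally defined
# benchmark measure -- the share of families still breastfeeding at 6 months --
# from visit-level records. Field names are illustrative only.

from dataclasses import dataclass

@dataclass
class VisitRecord:
    family_id: str
    child_age_months: int
    breastfeeding: bool  # home visitor's report at this visit

def pct_breastfeeding_at_6_months(visits: list[VisitRecord]) -> float:
    """Percent of families with a visit at >= 6 months where breastfeeding was reported."""
    eligible = {v.family_id for v in visits if v.child_age_months >= 6}
    meeting = {v.family_id for v in visits
               if v.child_age_months >= 6 and v.breastfeeding}
    return 100.0 * len(meeting) / len(eligible) if eligible else 0.0

if __name__ == "__main__":
    sample = [
        VisitRecord("A", 6, True),
        VisitRecord("B", 7, False),
        VisitRecord("C", 3, True),   # not yet 6 months -- excluded from the denominator
    ]
    print(f"Breastfeeding at 6 months: {pct_breastfeeding_at_6_months(sample):.1f}%")
```

The same pattern would extend to other locally defined measures, such as counting completed immunizations as an indicator related to well-child visits.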

Rigorous evaluation
- Grantees develop an evaluation question that is important to the community and will contribute to the knowledge base.
- Grantees select an evaluation design that is both rigorous and acceptable to the community.
- Grantees are encouraged to narrow the focus of their evaluations due to resource limitations: measure a small set of outcomes; examine a component of the home visiting program; focus on an implementation strategy (e.g., recruitment, retention); or evaluate an enhancement or adaptation.

PICO: Population, Intervention, Comparison, Outcomes (example outcomes: increased adherence to prenatal care; improved parent-child interaction).
- PICO is used to develop an evaluation question that connects the Population, Intervention, Comparison, and Outcomes.
- TEI facilitates a PICO discussion during the initial site visit to help define the home visiting intervention, and returns to PICO later when developing the evaluation plan.
Speaker notes: Give an example of a PICO question from a grantee, such as "Do families who receive home visiting demonstrate greater knowledge of child development than families who receive services as usual?" We use the PICO approach because it is a way of inserting the design into the evaluation question; this allows us to talk about how to establish causality and attribute outcomes to the intervention. PICO is a tool to help build capacity in communities to understand research design and what different designs can and cannot tell us about the impact of a program. We start there, but it is usually a lengthy, negotiated process to settle on a design that is the most rigorous possible while also acceptable to the community.
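As a purely illustrative sketch (not a TEI tool), the snippet below structures a PICO-framed question as a small Python data class and renders it as an evaluation question, showing how naming the comparison condition embeds the design in the question itself; all example values are hypothetical.

```python
# Illustrative only: a PICO-framed evaluation question as a small data structure.
# All example values are hypothetical.

from dataclasses import dataclass

@dataclass
class PICOQuestion:
    population: str    # who the evaluation is about
    intervention: str  # the home visiting service or enhancement
    comparison: str    # the comparison condition (this is where the design lives)
    outcome: str       # what is expected to change

    def as_question(self) -> str:
        return (f"Do {self.population} who receive {self.intervention} show "
                f"{self.outcome} compared with {self.comparison}?")

example = PICOQuestion(
    population="families enrolled in the Tribal home visiting program",
    intervention="home visiting",
    comparison="families who receive services as usual",
    outcome="greater knowledge of child development",
)
print(example.as_question())
```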

Traditional Ways of Knowing and Western scientific designs: experimental design, quasi-experimental design, pre-post design, and case studies alongside opinions and ideas, life experience, community values, and cultural norms.
Speaker notes: What is the purpose of this slide? To acknowledge the value of traditional knowledge and indigenous ways of learning, and to acknowledge that defining rigor is a contested process. We aim to blend these two approaches, Western scientific and Indigenous ways of knowing. Our job is to build capacity to understand rigor from the Western scientific perspective and to make informed decisions about which design will provide the desired information and be appropriate. This is a negotiated process.

Using data to improve programs (CQI)
- Grantees receive support to implement Plan-Do-Study-Act (PDSA) cycles to improve implementation fidelity/quality and outcomes.
- PDSA is an inclusive, transparent process that involves the entire program team.
- Teams use existing benchmark, evaluation, or implementation data.
- The emphasis is on discovery, learning, and testing new approaches.
Speaker notes: Evaluation is often regarded as a requirement rather than as a tool for addressing local questions and priorities and providing information of local use and value. Evaluation information has often been filed away without an attempt to share it or use it to improve program services. It is critical to shift from evaluation-as-judgment to evaluation-as-learning and to show that evaluation can be a tool that improves programs and finds better ways to serve Tribal children and families.
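To make the "Study" step of a PDSA cycle concrete, here is a minimal, hypothetical Python sketch comparing a process measure (the share of visits with a completed developmental screening) before and after a small tested change; the data and field names are invented for illustration and are not drawn from any grantee's program.

```python
# Minimal sketch (hypothetical data): comparing a CQI process measure -- e.g.,
# the share of visits where a developmental screening was completed -- before
# and after a small change tested in a Plan-Do-Study-Act cycle.

def screening_rate(visits: list[dict]) -> float:
    """Proportion of visits in which a screening was completed."""
    return sum(v["screening_done"] for v in visits) / len(visits) if visits else 0.0

baseline = [{"screening_done": done} for done in (True, False, False, True, False)]
post_change = [{"screening_done": done} for done in (True, True, False, True, True)]

print(f"Baseline rate:    {screening_rate(baseline):.0%}")
print(f"Post-change rate: {screening_rate(post_change):.0%}")
# The "Study" step compares these rates (often on a simple run chart) before the
# team decides whether to adopt, adapt, or abandon the change in the "Act" step.
```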

A few hot TA topics
- Identifying ethical and meaningful comparisons for rigorous evaluation.
- Evaluating cultural adaptations.
- Identifying measures of cultural knowledge, identity, and practices, and of community connectedness.
- Supporting intensive data collection efforts in Tribal communities: preparing home visitors to collect data in the home, and addressing concerns around collecting data in small, isolated, close-knit communities.
- Identifying appropriate measures of parenting: there is a lack of parenting measures that have been validated for Tribal communities and a need for strengths-based measures that inform practice, and observational measures and videotaping are often a cultural mismatch.
Speaker note on data collection: Data are collected by home visitors as opposed to evaluators or dedicated data collectors, and data collection is integrated into service provision.

Relationship building: what's worked
- Early site visits facilitate early relationship building.
- Resources are available for invaluable face-to-face time.
- The intensive benchmark TA process builds a foundation for evaluation TA.
- Continuity of relationships is key; continuity helps TEI build understanding of grantees.
- Trust is built by adding value.
(Photo: Choctaw Nation of Oklahoma)

Knowledge and skill building: what's worked
- A phased approach to developing benchmark and evaluation plans is more efficient and easier to digest.
- Be strategic about what TA to provide at the individual versus universal level.
- Peer learning opportunities.
- Provide concrete examples and help translate them into unique grantee settings.
- Modify the approach based on grantee capacity and needs.
- The target audience for TA extends well beyond evaluators.
- Successive cohorts allow room to improve.
(Photo: Lake County Tribal Health Consortium)

Support for community engagement in determining evaluation priorities
TEI supports community engagement by:
- Encouraging participation from a diverse group of program and evaluation team members on planning calls.
- Supporting changes to planning timelines to incorporate feedback and input.
- Emphasizing the requirement for, and necessity of, community engagement on site visits, webinars, phone calls, and plan reviews.
- Asking critical questions (Have you presented this design to your advisory committee? How do you think your home visitors will feel about collecting these data?).
- Sharing experiences across grantees and cohorts (for example, how some communities have reacted to certain observational measures).
- Better understanding the communities through face-to-face interaction with Tribal leadership, community partners, and families on site visits (these partners are often included in the initial PICO exercise).
- Emphasizing meaningful community input in all evaluation phases.
Evaluation should incorporate meaningful community input in all phases, including:
- Determination of key questions (What do you want to know about the program, intervention, or service being provided?)
- Design of the evaluation plan (How is information gathered to answer the key questions? How burdensome is it to participate?)
- Selection of appropriate measures (What tools, surveys, and interview questions will be used?)
- Interpretation of findings (What does the information that has been gathered mean? Does it answer the key questions?)
- Dissemination, returning the knowledge gained to the community (How is the information shared? What publications, journals, newspapers, community forums, and so forth will disseminate the information?)
Grants to Tribal nations and organizations that require evaluation provide opportunities for Tribes to exercise their sovereignty by identifying evaluation questions, engaging in evaluation design, and establishing indigenous evaluation protocols that can provide information to inform improvements in child welfare in their communities.
Example: Inter-Tribal Council of Michigan (ITCMI). Highlights of ITCMI community engagement:
- A very thorough needs assessment that informed the evaluation priority.
- Building a literacy intervention (the focus of the evaluation) with a team of evaluators (both internal and external), curriculum specialists, community partners, program staff, and advisory members.
- Monthly webinars with the 7 communities.
- Frequent in-person meetings with representatives from each of the 7 communities.
- A "de-centralized" model that leaves many staffing and implementation decisions up to the local communities, with oversight, close communication, and training provided by ITCMI.
TA with ITCMI: the TA liaison participates in intervention planning calls to answer evaluation questions; the internal evaluator provides initial document drafts, and the TA provider and other team members provide feedback; the TA liaison facilitates planning calls; the site visit included stops at several of the 7 communities.

Evaluations anchored in community and cultural context
- Federal guidance allows for an individualized rather than prescriptive approach, which respects diversity.
- Flexibility to define evaluation questions facilitates examination of program model fit: it has led many grantees to examine cultural enhancements to home visiting models and some grantees to develop theories of change regarding Native parenting practices.
- Flexibility to define performance measures results in benchmark plans that reflect community context. Example: in remote Native villages where subsistence patterns of living are the traditional way of life, changes in household income have little meaning; some grantees selected adapted measures of economic security.
Speaker notes: Individualization (as opposed to a prescriptive approach) addresses Tribe-specific history and contexts, beliefs, protocols, and program needs.
(Photo: Kodiak Area Native Association)

Evaluations that blend scientific and cultural rigor
Negotiating the balance: recognizing the value of indigenous ways of knowing and methods of inquiry while applying appropriate and rigorous research methods. Accommodating both will result in evaluation findings that are meaningful to the grantee community and to other tribal communities.
- Example 1: Southcentral Foundation's use of a historical comparison group and propensity score matching to evaluate the impact of its model, plus a qualitative study of the cultural fit of the model.
- Example 2: Pueblo of San Felipe's evaluation of a newly developed cultural parenting curriculum using an internal comparison and qualitative methods.
Speaker notes: Cultural rigor might include engagement of the entire community in the evaluation process or of selected tribal experts and elders on specific cultural practices. Rigorous evaluation in Tribal communities means that sound scientific methods need to be employed but that they must also be grounded in sound cultural methods. Ongoing negotiation of the understanding of indigenous ways of knowing and the concepts of scientific rigor validates both Native and Western perspectives. ACF and TEI's community-engaged approach results in evaluations that are more likely to have results that are meaningful and can be translated into practice. Importing methodology without regard to cultural context will result in non-rigorous evaluation.
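Example 1 above names propensity score matching with a historical comparison group. The sketch below shows the general idea of that technique in Python (estimate a propensity score, then match each program participant to the most similar historical record); it is a generic illustration with simulated data, not the Southcentral Foundation's actual analysis, and it assumes numpy and scikit-learn are available.

```python
# Minimal sketch of propensity score matching against a historical comparison
# group -- a generic illustration of the technique named on this slide, with
# simulated data and hypothetical covariates. Requires numpy and scikit-learn.

import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)

# Hypothetical covariates (e.g., maternal age, prior service use) and a group
# flag: 1 = current home visiting participants, 0 = historical comparison records.
X = rng.normal(size=(200, 2))
treated = (rng.random(200) < 0.4).astype(int)

# 1. Estimate propensity scores: P(treated | covariates).
scores = LogisticRegression().fit(X, treated).predict_proba(X)[:, 1]

# 2. Nearest-neighbor matching (with replacement): pair each treated record
#    with the comparison record whose propensity score is closest.
treated_idx = np.where(treated == 1)[0]
comparison_idx = np.where(treated == 0)[0]
matches = {
    int(t): int(comparison_idx[np.argmin(np.abs(scores[comparison_idx] - scores[t]))])
    for t in treated_idx
}

# 3. Outcomes would then be compared across the matched pairs; here we only
#    show that a matched pair has similar propensity scores.
t0 = int(treated_idx[0])
print(f"Treated score {scores[t0]:.3f} matched to comparison score {scores[matches[t0]]:.3f}")
```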

Impacts of capacity building
- Tribal communities are better prepared to oversee and conduct evaluation and to consume evaluation information.
- Improved services and outcomes through the integration of data into decision making.
- Communities are invested in, and take ownership of, their evaluation plans and benchmark data collection.
- Strengthened capacity and authority of communities to oversee evaluation.
(Photo: Confederated Salish and Kootenai Tribes)

Questions?
Aleta Meyer, PhD, OPRE, Contracting Officer's Representative, aleta.meyer@acf.hhs.gov
Moushumi Beltangady, MSW, MPP, ACF, Program Manager, moushumi.beltangady@acf.hhs.gov
Kate Lyon, MA, JBA, TEI Project Director and TA Liaison, lyon@jbassoc.com
Julie Morales, PhD, JBA, TEI TA Liaison, morales@jbassoc.com
Erin Geary, MSW, JBA, TEI TA Liaison, geary@jbassoc.com
This product was created by JBA, Inc., under Contract No. HHSP23320095644WC, funded by the Office of Planning, Research and Evaluation, Administration for Children and Families, U.S. Department of Health and Human Services. The content of this product does not necessarily reflect the official views of the Office of Planning, Research and Evaluation.

For more information on TEI, contact:
Nicole Denmark, Federal Project Officer, Office of Planning, Research and Evaluation, nicole.denmark@acf.hhs.gov
Kate Lyon, Project Director, James Bell Associates, Inc., lyon@jbassoc.com
The Tribal Home Visiting Evaluation Institute (TEI) is funded by the Office of Planning, Research and Evaluation, Administration for Children and Families, U.S. Department of Health and Human Services, under contract number HHSP23320095644WC. TEI is funded to provide technical assistance to Tribal Home Visiting grantees on rigorous evaluation, performance measurement, continuous quality improvement, data systems, and ethical dissemination and translation of evaluation findings. The first TEI contract (TEI1) was awarded to MDRC; James Bell Associates, Inc.; Johns Hopkins Bloomberg School of Public Health, Center for American Indian Health; and the University of Colorado School of Public Health, Centers for American Indian and Alaska Native Health.
Speaker notes: The Tribal Evaluation Institute is funded by the Office of Planning, Research and Evaluation within the Administration for Children and Families. TEI was awarded to James Bell Associates in partnership with the University of Colorado's Centers for American Indian and Alaska Native Health and the Michigan Public Health Institute. For more information, contact the individuals on this slide.