Building Processes for Conducting and Managing Data Collection

Presentation transcript:

Building Processes for Conducting and Managing Data Collection
Erin Geary
Tribal MIECHV Annual Grantee Meeting, May 2015

Objectives
- To introduce tools for improving timeliness and quality of data collection**
- To present best practices for institutionalizing data collection in your program
- To hear from you what strategies or tools are helping you improve data collection
**Tools come from TEI’s Data Collection Toolkit (coming soon!)

What data are collected?
Who collects the data?
When are data collected?
How are data entered?
How is data collection monitored?
How are data used?

Why “systematize” data collection?

Collecting data on time

Considerations
- The “on-time” window can look different for different measures (see the sketch after this list)
- Ticklers and alerts can be helpful, but…
- It is helpful to talk through scenarios with staff
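As a concrete illustration of how an “on-time” window (and the ticklers and alerts mentioned above) might be checked automatically, here is a minimal sketch in Python. It assumes each measure defines its own window in days relative to the due date; the measure names, window sizes, and function name are hypothetical examples, not part of the TEI toolkit.

```python
from datetime import date, timedelta

# Hypothetical on-time windows per measure: days relative to the due date.
ONTIME_WINDOWS = {
    "depression_screen": (0, 14),    # due date through two weeks after
    "child_development": (-15, 15),  # 15 days on either side of the due date
}

def collection_status(measure, due_date, collected_date=None, today=None):
    """Classify a scheduled collection as on time, out of window, overdue, or upcoming."""
    today = today or date.today()
    early, late = ONTIME_WINDOWS[measure]
    window_open = due_date + timedelta(days=early)
    window_close = due_date + timedelta(days=late)
    if collected_date is not None:
        return "on time" if window_open <= collected_date <= window_close else "out of window"
    if today > window_close:
        return "overdue"        # candidate for a tickler or alert to the home visitor
    if today >= window_open:
        return "window open"
    return "upcoming"

# Example: a depression screen due May 1 and collected May 10 falls within its window.
print(collection_status("depression_screen", date(2015, 5, 1), date(2015, 5, 10)))  # "on time"
```

A real tracking tool would also need to handle rescheduled visits and families who exit the program, which is one reason talking through scenarios with staff matters.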

Discussion
- How have you addressed data collection scheduling?
- What’s worked (tips, tools, etc.)?

Data Collection Tracking Tool: screenshot of the tool and time point entry screen

Data Collection Tracking Tool: screenshot of the scheduling template

Monitoring Data Quality

In many programs, quality happens (or doesn’t) at two points: 1) when data are collected and 2) when data are entered.

Monitoring data collection
- Training/role play
- Shadowing
- Collection while being observed
- Independent data collection

Monitoring data entry: knowing what to look for
Data entry “red flags” (see the sketch after this list):
- Lots of missing data
- Values outside the normal range
- Lack of variation
- Score patterns that seem out of the ordinary
When a red flag appears, ask:
- Is the error part of a larger issue or a one-time mistake?
- Are there patterns related to certain staff?
- What happens when you sort by date?
- Are certain measures or certain questions being skipped?
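To make the red flags above concrete, here is a rough sketch of how some of them could be checked automatically, assuming entered data sit in a pandas DataFrame. The column names (score, staff_id, entry_date), the expected range, and the function name are assumptions to adapt to your own measures and data system, not a prescribed TEI tool.

```python
import pandas as pd

def red_flag_report(df, score_col="score", expected_range=(0, 30)):
    """Summarize common data entry red flags: missing data, out-of-range values,
    lack of variation, and patterns by staff member or entry date."""
    report = {}

    # Lots of missing data: share of missing values in each column.
    report["pct_missing"] = df.isna().mean().round(2)

    # Values outside the normal range for the measure.
    low, high = expected_range
    report["out_of_range_rows"] = df[(df[score_col] < low) | (df[score_col] > high)]

    # Lack of variation: a near-zero standard deviation is suspicious.
    report["score_std"] = df[score_col].std()

    # Patterns related to certain staff: missingness rate by staff member.
    report["missing_by_staff"] = (
        df.assign(missing=df[score_col].isna())
          .groupby("staff_id")["missing"].mean().round(2)
    )

    # Sorting or grouping by date can reveal batch entry or gaps in entry.
    report["entries_by_month"] = (
        pd.to_datetime(df["entry_date"]).dt.to_period("M").value_counts().sort_index()
    )
    return report
```

A report like this does not replace supervisor judgment; it simply points to records and staff members worth a closer look.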

Discussion
- How have you monitored data quality?
- What issues are you seeing show up regularly?
- What’s worked for monitoring quality (tips, tools, etc.)?

Quality Assurance Form

Institutionalizing Data Collection

Best Practices: Write it down… all of it
Policies and procedures manuals are your friend!
Example: quality checks
- How frequently are data checked?
- By whom?
- A sample of the data or all of it? (see the sketch after this list)
- What will they look for?
- How does the review come back to staff?
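As a small, hypothetical illustration of the “a sample of the data or all of it?” question, the sketch below pulls a random sample of recently entered records for a supervisor’s periodic review; the column name, sample size, and function name are assumptions, not a prescribed procedure.

```python
import pandas as pd

def records_for_review(df, since, n=10, seed=0):
    """Draw a random sample of records entered since the last quality check.
    If only a few records were entered, review all of them."""
    recent = df[pd.to_datetime(df["entry_date"]) >= pd.Timestamp(since)]
    if len(recent) <= n:
        return recent                             # small batch: check everything
    return recent.sample(n=n, random_state=seed)  # larger batch: spot-check a random sample
```

Whatever sampling rule you choose, write it into the policies and procedures manual so every reviewer applies it the same way.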

Data Collection Protocol Outline

Best Practices: Consistent messaging
Training, in-services, weekly meetings, supervision, and written memos are all good platforms for communicating data collection procedures, BUT…
It is important to ensure that all staff are getting the same messages.

Sample Training Schedule

Best Practices: Be supportive
- Home visitors play a key role in whether data collection is institutionalized in a program
- Acknowledge that data collection can be challenging AND emphasize why it matters
- Highlight good work
- Ask for home visitors’ input and leadership (form redesign, issues with measures, training new staff, etc.)

Final thoughts?
- Has this generated other ideas or potential strategies?
- Are there tools you might take back and use?

Thank you for coming!

For more information on TEI, contact:
Nicole Denmark, Federal Project Officer, Office of Planning, Research and Evaluation, nicole.denmark@acf.hhs.gov
Kate Lyon, Project Director, James Bell Associates, Inc., lyon@jbassoc.com

The Tribal Home Visiting Evaluation Institute (TEI) is funded by the Office of Planning, Research and Evaluation, Administration for Children and Families, Department of Health and Human Services under contract number HHSP23320095644WC. TEI is funded to provide technical assistance to Tribal Home Visiting grantees on rigorous evaluation, performance measurement, continuous quality improvement, data systems, and ethical dissemination and translation of evaluation findings. TEI1 was awarded to MDRC; James Bell Associates, Inc.; Johns Hopkins Bloomberg School of Public Health, Center for American Indian Health; and University of Colorado School of Public Health, Centers for American Indian and Alaska Native Health. TEI was awarded to James Bell Associates in partnership with the University of Colorado’s Centers for American Indian and Alaska Native Health and the Michigan Public Health Institute.