Using Lessons Learned to Improve the Design, Methods and Data Quality of Outcome Monitoring Projects with Community-based Organizations (CBOs)
Alpa Patel-Larson, Behavioral Scientist
Evaluation Studies Team, Program Evaluation Branch, Division of HIV/AIDS Prevention
National Center for HIV/AIDS, Viral Hepatitis, STD & TB Prevention
Centers for Disease Control and Prevention
American Evaluation Association Annual Meeting, November 11, 2010
CBO HIV Prevention Programs with Outcome Monitoring Projects
[Timeline graphic, July 2004 to July 2015: funding announcements PA04-064 (139 CBOs), PA06-618 (29 CBOs), and PA10-1003 (133 CBOs); outcome monitoring projects with V/V (4 of 15 CBOs), HR (7 of 23), SISTA (5 of 25), 3MV (3 of 6), MPowerment (3 of 12), WILLOW (4), and RESPECT (4); other FOAs funded CBOs and health departments (HDs).]
Lessons Learned for Improving HIV Prevention Programs
VOICES
– One 45-minute session
– Group-level, same gender/ethnicity
– High-risk HIV-negative clients
Healthy Relationships
– Five 2-hour sessions
– Group-level, same gender if feasible
– Clients living with HIV
RESPECT
– Two 20-minute sessions
– Individual-level (can be done with HIV testing)
– High-risk HIV-negative clients
WILLOW
– Four 4-hour sessions
– Group-level with women
– Clients living with HIV
Considerations for Federal or National Evaluation Projects
– Government policies and procedures (e.g., OMB)
– Agency policies and procedures (e.g., CDC programs)
– National program evaluations (e.g., PEMS, NHM&E)
– Funding agreements (e.g., master cooperative agreements)
– Multi-site evaluation projects
– Capacity of service providers for evaluation
– "Real-time" evaluations
CBOP Evaluation Components Impacted by Lessons Learned
– Local/state level: Data Collection, Data Entry, Data Management and Cleaning, Data Submission
– National level: Data Management and Cleaning, Data Analysis and Utilization, Data Verification, Quality Assurance
Source: Centers for Disease Control and Prevention. Quality Assurance Standards for HIV Counseling, Testing, and Referral Data. Atlanta, GA: U.S. Department of Health and Human Services, Centers for Disease Control and Prevention; 2009. http://www.cdc.gov/hiv/testing/resources/guidelines/qas/index.htm
Data Collection
Original:
– List of national variables
– Data collection templates provided
– Recruit clients enrolled in the intervention
– Baseline interviews before the first intervention session
– Session logs and activities
Problems:
– Variables were not similar to the original research outcomes
– CBOs varied in their capacity to document process monitoring and to collect detailed client-level risk behaviors
Solutions:
– Develop and pilot standard data collection forms with outcomes close to the original research
– Increase the amount of training during orientation and site visits
– Provide technical assistance for data collection logistics
– Enhance process monitoring procedures and forms
Data Entry
Original:
– Use of a large, complex national data system with established variables
Problems:
– Data entry screens did not match the process of collecting variables
– The system was new to all project staff and was neither easy to navigate nor flexible
Solutions:
– Pilot test data entry systems before the start of data collection
– Collect and enter QDS surveys on hand-helds or laptops simultaneously
– Provide extensive training during orientation and site visits on data quality assurance processes and procedures
Data Management and Cleaning
Original:
– National data system managed by outside groups
– CDC staff converted shared data from quarterly text files into integrated SAS databases
– Error reports were provided to grantees 1-2 times during the project period
Problems:
– The complex data management system needed additional programming
– CBO staff had to wait for CDC to manage and clean the data
Solutions:
– Develop automated processes to convert text files to SAS databases for future projects (see the sketch below)
– Use separate data systems for real-time management, cleaning, and feedback so grantees can analyze and utilize their own data
– Increase training and capacity for quality assurance to minimize cleaning at the federal level
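The deck does not include the conversion code itself; the following is a minimal SAS sketch of how one quarterly, delimited text file could be read into a dataset and appended to an integrated project database. The library path, file name, delimiter, and variable names are hypothetical placeholders for illustration, not the CBOP project's actual files or variables.

libname proj "/data/cbop";                      /* hypothetical project library */

/* Read one quarterly pipe-delimited text file into a temporary SAS dataset. */
data work.cbop_q1;
    infile "cbop_grantee_2010q1.txt" dlm='|' dsd firstobs=2 missover;
    input client_id         :$12.
          intervention      :$20.
          session_date      :mmddyy10.
          sessions_attended;
    format session_date mmddyy10.;
run;

/* Append the quarterly file to the integrated project database. */
proc append base=proj.cbop_integrated data=work.cbop_q1 force;
run;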
Data Analysis & Utilization Original: Feedback through reports done by CDC after several months (error, agency, monthly status, etc.) Products to be drafted for use during project period Solutions: Use of different data systems for real-time reporting and use Transition planning and cross-training of staff via SOPs Ongoing feedback for program improvement throughout Multiple reviews to ensure data collected are used for evaluation questions Problems: High staff turnover and multiple steps and roles for project staff Lower than expected data quality and difficult-to-use measures
Data Security & Submission Original: Share data through national data system which would backup electronic data Submit other program data through secure data network Paper forms for evaluation were secured separately from program staff Solutions: Use alternative data system with project staff ability to ensure security and backup of data Data collected directly into electronic devices with paper print-out as backup and other electronic backups Additional staff trained to utilize SDN for all data submissions Problems: Submission procedures changed throughout projects Numerous paper forms with limited electronic backups
Data Verification & Quality Assurance Original: Sample strategy for QA was utilized by project staff Error reports generated by CDC from submitted or received data Conference calls and site visits were used primarily for QA and project management Solutions: All records to be QA’d throughout entire project at each step of data life cycle by CBO and CDC Routine site visits to review data quality Automated error reports to be generated routinely with opportunity for feedback QA logs by CBO are routinely submitted and reviewed by CDC Problems: Lower than expected data quality and difficult-to-use measurements Higher than expected data entry errors
Recommendations from CBOP
– To ensure data quality, evaluators should have easy, timely access to both the original and the reported data
– Data collection instruments should be standardized and pilot tested
– Evaluators should develop a technical assistance plan that accounts for variation in evaluation capacity among implementers
– Training and ongoing technical assistance are critical and should be continually updated to reflect project-specific issues
Why use Lessons Learned?
– Conducting high-quality evaluations is an essential component of CDC's work
– Real-time process and outcome monitoring can significantly improve program delivery and outcomes
– The lessons learned and experiences from CBOP can help improve future evaluation projects
Acknowledgements
CBOP grantees
Program Evaluation Branch CBOP Team:
– Gary Uhl, Team Lead
– Brenda Chen, CBOP Data Manager/Analyst
– Holly Fisher, MEM PO
– Adanze Eke, CBOP Analyst
– Tanisha Grimes
– Tamika Hoyte, CBOP QDA
– Renee Stein, CBOP-3MV PO
– Tanesha Griffin, CBOP PHA
– Eka Shapatava, CBOP QDA
– Tobey Sapiano, CBOP-SISTA PO
– Elizabeth Kalayil, CBOP QDA
– Andrea Moore, CBOP QDA
Former Data Managers/Analysts: Qian An, Linda Andes, Venkat Mannam, Susan Moss
Questions? Comments?