Data Quality Review: Best Practices Sarah Yue, Program Officer Jim Stone, Senior Program and Project Specialist

Session Overview
Why the emphasis on data quality?
What are the elements of high-quality data?
How should grantees and commissions assess data quality for their sites/subgrantees?
What are some common “red flags” to look for when reviewing programmatic data?
What corrective actions should grantees and commissions take if they encounter data quality issues?
What resources are available to help with data quality?

Why Data Quality is Important
Fundamental grant requirement
Trustworthy story of collective impact for stakeholders
Sound basis for programmatic and financial decision-making
Potential audit focus

Elements of Data Quality
Validity
Completeness
Consistency
Accuracy
Verifiability

Validity
Official definition: Whether the data collected and reported appropriately relate to the approved program model and whether the data collected correspond to the information provided in the grant application
Plain language definition: The data mean what they are supposed to mean
How grantees/commissions should assess data validity for sites/subgrantees:
–Review data collection tools; compare to objectives and PMs
–Ask about data collection protocols
–Request a completed data collection tool

Validity
Examples of potential problems:
–Invalid tools
  Mismatch between tool and type of outcome
  Incorrect approach to pre/post testing
  Failure to use tools required by the National Performance Measure Instructions
–Invalid data collection protocols
  Assessing the wrong population
  Implementing protocols in the wrong ways or at the wrong times

Completeness
Official definition: The grantee collects enough information to fully represent an activity, a population, and/or a sample
Plain language definition: Everyone is reporting a full set of data
How grantees/commissions should assess data completeness for sites/subgrantees:
–Keep a list of data submissions
–Check source documentation
–Request data at consistent intervals

Completeness
Examples of potential problems:
–Subgrantees/sites missing from totals
–Non-approved sampling
–Inconsistent reporting periods
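Where submissions arrive as files or database exports, the "keep a list of data submissions" check described above lends itself to a simple script. This is a minimal sketch, assuming a hypothetical in-memory record of which sites have reported for which periods; the site names, periods, and counts are invented for illustration:

```python
# Hypothetical sketch: flag expected site/period combinations with no submission on file.
EXPECTED_SITES = ["Site A", "Site B", "Site C"]
PERIODS = ["Q1", "Q2", "Q3", "Q4"]

# Hypothetical record of submissions received: {(site, period): value reported}
submissions = {
    ("Site A", "Q1"): 120,
    ("Site B", "Q1"): 95,
    ("Site A", "Q2"): 130,
}

def find_missing(expected_sites, periods, received):
    """Return (site, period) pairs that have not reported yet."""
    return [
        (site, period)
        for site in expected_sites
        for period in periods
        if (site, period) not in received
    ]

for site, period in find_missing(EXPECTED_SITES, PERIODS, submissions):
    print(f"Missing submission: {site}, {period}")
```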

Consistency
Official definition: The extent to which data are collected using the same procedures and definitions across collectors and sites over time
Plain language definition: Everyone is using the same data collection methods
How grantees/commissions should assess data consistency for sites/subgrantees:
–Request written data collection procedures; ask about dissemination/implementation
–Cross-compare definitions and protocols

Consistency
Examples of potential problems:
–No standard definitions/methodologies
–Lack of knowledge about required procedures and/or failure to follow them
–Staff turnover

Accuracy
Official definition: The extent to which data appear to be free from significant errors
Plain language definition: The math is correct
How grantees/commissions should assess data accuracy for sites/subgrantees:
–Check your addition
–Request data points from service locations
–Check for double-counting in source data
–Watch out for double-reporting

Accuracy
Examples of potential problems:
–Math errors
–Counting individuals multiple times
–Reporting the same individuals under different program streams
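Two of the accuracy checks described above, re-adding the site-level figures and scanning for double-counting, can also be scripted against exported data. This is a minimal sketch using hypothetical site counts, a hypothetical reported total, and invented participant IDs:

```python
from collections import Counter

# Hypothetical site-level counts and the total the grantee reported upward.
site_counts = {"Site A": 120, "Site B": 95, "Site C": 88}
reported_total = 310

# Hypothetical participant IDs pulled from source documentation.
participant_ids = ["P001", "P002", "P003", "P002", "P004"]

# Check the addition: site-level numbers should sum to the reported total.
computed_total = sum(site_counts.values())
if computed_total != reported_total:
    print(f"Math error: sites sum to {computed_total}, but {reported_total} was reported")

# Check for double-counting: the same individual should not appear twice.
duplicates = [pid for pid, n in Counter(participant_ids).items() if n > 1]
if duplicates:
    print(f"Possible double-counting: {duplicates}")
```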

Verifiability
Official definition: The extent to which recipients follow practices that govern data collection, aggregation, review, maintenance, and reporting
Plain language definition: There is proof that the data are correct
How grantees/commissions should assess data verifiability for sites/subgrantees:
–Require source documentation
–Review quality control plans

Verifiability
Examples of potential problems:
–Inexplicable data points
–Estimated values
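A grantee could also screen for reported values that lack any supporting documentation before they go into a report. This is a minimal sketch, assuming a hypothetical mapping of reported values to the source documents on file (names and figures are invented for illustration):

```python
# Hypothetical reported values and the source documents on file for each.
reported_values = {"individuals_assisted": 2000, "ongoing_volunteers": 68}
source_docs = {"ongoing_volunteers": "volunteer_sign_in_sheets.pdf"}

for name, value in reported_values.items():
    if name not in source_docs:
        print(f"No source documentation for '{name}' ({value}); verify before reporting")
```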

Common “Red Flags” in Reported Data
a) Large completion rates reported early in the program year
b) Actuals that are exactly the same as target values or consist solely of round numbers
c) Actuals that are substantially higher or lower than target values or are out of proportion to the MSY or members engaged in the activity
d) Substantial variation in actuals from one site to another
e) Substantial variation in actuals from one program year to the next
f) Outcome actuals that exceed outputs
g) Outcome actuals that are exactly the same as outputs
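Several of these red flags, such as (b) actuals that exactly match targets or are suspiciously round, and (f) outcomes that exceed outputs, are mechanical enough to screen for automatically. This is a minimal sketch over hypothetical output/outcome figures:

```python
# Hypothetical output/outcome records for one performance measure pair.
output_measure = {"id": "O1 output", "target": 200, "actual": 188}
outcome_measure = {"id": "O9 outcome", "target": 150, "actual": 2000}

def red_flags(measure, output_actual=None):
    """Return the red flags from (b), (f), and a simple round-number check."""
    flags = []
    if measure["actual"] == measure["target"]:
        flags.append("actual exactly equals target")
    if measure["actual"] >= 100 and measure["actual"] % 100 == 0:
        flags.append("actual is a suspiciously round number")
    if output_actual is not None and measure["actual"] > output_actual:
        flags.append("outcome actual exceeds the output actual")
    return flags

for measure, output in ((output_measure, None), (outcome_measure, output_measure["actual"])):
    for flag in red_flags(measure, output):
        print(f"{measure['id']}: {flag}")
```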

Example #1: Financial Literacy Education Program – Financial Literacy PM 1 (targets and actuals shown on slide)
Measure Type | Measure #
Output | O1: Number of economically disadvantaged individuals receiving financial literacy services
Outcome | O9: Individuals with improved financial knowledge

Example #1 Continued
The number of participants with increased knowledge (outcome) is exactly the same as the number of program participants (output).
Possible explanations:
–The program is extremely effective in achieving improved financial knowledge among program participants
–The number of individuals with improved financial knowledge is not being reported correctly

Example #1 Additional Context

This assessment is primarily a customer satisfaction survey that does not objectively measure changes in knowledge.
To ensure validity, the grantee should use a pre-post assessment tool that asks content-based questions directly related to the subject matter.

Example #2
Demographics Information | Value
Number of individuals who applied to be AmeriCorps members | 116
Number of episodic volunteers generated by AmeriCorps members | 389
Number of ongoing volunteers generated by AmeriCorps members | 68
Number of AmeriCorps members who participated in at least one disaster services project | 44
Number of disasters to which AmeriCorps members have responded | 10
Number of individuals affected by disasters receiving assistance from members | 2,000
Number of veterans serving as AmeriCorps members | 2

Example #2 Continued
The number of individuals affected by disasters receiving assistance from members is an unusually round number.
Possible explanations:
–Members served exactly 2,000 individuals over the course of the program year
–The number of individuals receiving assistance from members is not reported correctly

Example #2 Additional Context

The grantee estimated the value of the demographic indicator rather than measuring it.
To ensure verifiability, the grantee should ensure that they have specific data collection procedures for all reported values and are maintaining source documentation for each number.

Corrective Actions for Data Quality Issues
Notify your CNCS Program/Grants Officer and discuss the best way forward
For issues related to invalid tools, incorrect protocols, or wrong definitions:
–Switch to the correct tool/protocol/definition immediately if feasible; if not, switch in the next program year
–Do not report data collected using the incorrect tool/protocol/definition; document reasons for not reporting
For issues related to incomplete reporting, math errors, or double-counting/double-reporting:
–Correct the values
–Put together a quality control procedure

Corrective Actions for Data Quality Issues
For issues related to missing/incomplete source documentation:
–Develop a system for retaining source data
–Do not report values for which there is little/no evidence; document reasons for not reporting

Resources for Data Quality Review
Performance Measurement Core Curriculum (…/measurement/training-resources)
–Performance Measurement Basics
–Theory of Change
–Evidence
–Quality Performance Measures
–Data Collection and Instruments

More Resources for Data Quality Review
Evaluation Core Curriculum: Implementing an Evaluation (…/implementing-evaluation)
–Basic Steps of an Evaluation
–Data Collection
–Managing an Evaluation
CNCS Monitoring Tool: Data Quality Review Tab

CNCS Monitoring Tool

Q&A What questions do you have?

Thank you! If you have questions or feedback about this presentation, feel free to contact us: Sarah Yue: Jim Stone: