Overview of Data Quality Initiative and Reno Institute
Vickie Schray, Chief, Program Analysis Branch
Division of Vocational & Technical Education
Presentation Outline
- DQI Background
- Changes for Reno Institute
- Negotiation Process
- New Approach for Pilot-Testing
- Negotiation Process for Years 3, 4, & 5
- CAR and State Accountability Plans
- Team Sessions
- State Feedback on New Approach
DQI Background
Efforts to Date
- Core indicator framework
- State pilot projects
- Regional TA meetings
- State plan review process
Need for the DQI
- Stakeholder confidence and credibility
- Support state Perkins implementation
- Managing continuous improvement
DQI Background
Goals
- Improve student population definitions
- Improve selection and implementation of measures and measurement approaches
- Improve state systems for assuring data quality
Guiding Principles
- Federal/state collaboration
- Peer networking
- Innovative use of technology
DQI Background
New Orleans DQI (February 2001)
- Improve population definitions
- Improve data quality for Core Indicators 1-3
- Review quality criteria and scoring rubrics
- Review peer collaborative resource network
Reno DQI (May 2001)
- Improve data quality for special populations
- Improve data quality for Core Indicator 4
- Review state baseline and performance levels
DVTE Accountability Plan Negotiations (April-June 2001)
Changes for Reno Institute
- Accuracy of State Information
- More Time, Fewer States Per Group
- Improvements in Peer Evaluation Resource Guide
- More Information on State Systems
- Peer Review: Group Forms and Evaluation
- Written State Feedback
Negotiation Process
Phase I: Negotiating Initial Baseline and Performance Levels
- Round 1: Questions and Issues
- Round 2: Proposed Modifications and Levels
- Final: Interim Levels (#)
Phase II: Negotiating Final Baseline and Performance Levels
- Verification, updating, and providing missing information
New Process: Proposed State Changes
Setting Baseline Levels
- Greater state choice in approaches and years
Setting Performance Levels
- State-by-state negotiation of performance ceilings: state-negotiated benchmarks rather than a fixed 90%
- State-by-state negotiation of 3-year and annual levels rather than fixed improvement rates of 0.5% and 0.25%
New Approach for Pilot-Testing at Reno
Setting Baseline Levels
- Current: most recent year or averaging up to 3 years
- New: greater choice of years and use of alternative objective, replicable methods
Setting Performance Levels
- Current: fixed 90% ceilings and improvement rates of 0.5% and 0.25%
- New: state-negotiated performance excellence benchmarks and flexible improvement rates resulting in comparable overall improvement (see the sketch below)
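The arithmetic behind these two options can be illustrated with a short sketch. This is a hypothetical example only: the baseline figure, the per-year gain schedules, and the project_levels helper are assumptions chosen to mirror the rates named on the slide, not values or methods prescribed by DVTE.

```python
# Illustrative projection of annual performance levels from a baseline
# (in percentage points), a schedule of per-year gains, and a ceiling.
# All figures below are assumptions for illustration.

def project_levels(baseline, annual_gains, ceiling):
    """Apply each year's gain in turn, never exceeding the ceiling."""
    levels, level = [], baseline
    for gain in annual_gains:
        level = min(level + gain, ceiling)
        levels.append(round(level, 2))
    return levels

# Current fixed approach: 90% ceiling and the same 0.5-point gain each year.
print(project_levels(62.0, [0.5, 0.5, 0.5], 90.0))    # [62.5, 63.0, 63.5]

# Piloted approach: a state-negotiated benchmark as the ceiling and uneven
# gains that still add up to the same 1.5-point overall improvement.
print(project_levels(62.0, [0.25, 0.5, 0.75], 85.0))  # [62.25, 62.75, 63.5]
```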
Negotiation Process for Years 3, 4, and 5
Round 1: Certification of Accuracy and Completeness of State Accountability Plans
- Timeline: April 19-May 9, 2001
- Results used for the Reno Institute and for starting Rounds 2 and 3 of the negotiation process
Round 2: Verification and Negotiation of Baseline Levels
- Timeline: May 15-June 6, 2001
- Modified approach from the Reno Institute
Negotiation Process for Years 3, 4, and 5 (continued)
Round 3: Verification and Negotiation of 3-Year and Annual Performance Levels
- Timeline: May 10-June 29, 2001
- Modified approach from the Reno Institute
CAR and State Accountability Plans
CAR Database
- Annual reporting of actual performance for the most recent program year
- December 2001 ( ): first year for reporting against agreed-upon performance levels from the State Plan
- December 2000 ( ): used for the first Report to Congress
State Accountability Plan
- Negotiated baseline levels
- Negotiated performance levels
Team Session 1: Overview of State Systems
- Definition and Size of Student Populations
  - Implications of definitions for performance
- Definitions of Measures and Measurement Approaches
  - Student populations being addressed
- Data Quality Improvement Priorities
  - Implications for performance levels
Team Session 2: Non-Traditional Participation and Completion
- Identifying Non-Traditional Programs
  - Approach for linking to national/state occupational data
- Measure Construction (see the sketch after this list)
  - Student populations being addressed
- Data Quality Improvement Priorities
  - Student/program coverage: all non-trad programs
  - Implications for performance levels
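As a point of reference for the measure-construction discussion, here is a minimal sketch of one common way a non-traditional participation rate is built: students of the underrepresented gender enrolled in programs flagged as non-traditional for that gender, divided by all students enrolled in those programs. The crosswalk, program codes, and field names are illustrative assumptions, not the Institute's specification.

```python
# Hypothetical construction of a non-traditional participation rate.
# Assumes each enrollment record carries a program code and the student's
# gender, plus a crosswalk flagging which gender is underrepresented in the
# occupation the program prepares for (from national/state occupational data).

from dataclasses import dataclass

@dataclass
class Enrollment:
    program_code: str
    gender: str  # "F" or "M"

# Crosswalk: program code -> gender counted as non-traditional for that program.
# These mappings are illustrative only.
NONTRAD_GENDER = {
    "48.0508": "F",   # welding: female students counted as non-traditional
    "51.1613": "M",   # practical nursing: male students counted as non-traditional
}

def nontrad_participation_rate(enrollments):
    """Share of students in non-traditional programs who are of the underrepresented gender."""
    in_nontrad_programs = [e for e in enrollments if e.program_code in NONTRAD_GENDER]
    if not in_nontrad_programs:
        return 0.0
    nontrad_students = [e for e in in_nontrad_programs
                        if e.gender == NONTRAD_GENDER[e.program_code]]
    return 100.0 * len(nontrad_students) / len(in_nontrad_programs)

sample = [Enrollment("48.0508", "M"), Enrollment("48.0508", "F"), Enrollment("51.1613", "F")]
print(f"{nontrad_participation_rate(sample):.1f}%")  # 33.3%
```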
Team Session 3: Special Populations
- Statewide Definitions
- Procedures for Identifying Students
- Reliability of Classification
- Comparable Student Coverage of Special Populations on Performance Measures
Team Session 4: Setting Baseline Performance Levels
- Selection of Years
  - Use of most recent data
- Method for Calculating Baseline Levels (see the sketch after this list)
  - Formal, objective, and replicable by others
- Adjusting to Data Quality Improvements
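To make "formal, objective, and replicable by others" concrete, the sketch below shows two baseline calculations a state might document: the most recent year and an unweighted average of the most recent three years. The program years, rates, and the choice of simple averaging are assumptions for illustration only.

```python
# Illustrative baseline calculations from up to three years of actual performance.
# The years and rates below are hypothetical.

yearly_rates = {"1998-99": 61.2, "1999-00": 62.8, "2000-01": 63.4}

def baseline_most_recent(rates):
    """Baseline = rate from the most recent program year."""
    latest_year = max(rates)  # keys in YYYY-YY form sort chronologically
    return rates[latest_year]

def baseline_average(rates, n_years=3):
    """Baseline = unweighted average of the most recent n_years."""
    recent = sorted(rates)[-n_years:]
    return round(sum(rates[y] for y in recent) / len(recent), 2)

print(baseline_most_recent(yearly_rates))  # 63.4
print(baseline_average(yearly_rates))      # 62.47
```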
Team Session 5: Setting 3-Year and Annual Performance Targets
- Determining Performance Excellence Benchmarks
  - State approach based on national/state data
- Setting 3-Year and Annual Performance Levels
  - "Fact-based" decision-making: gap analysis and expected results from strategies
- Comparable Improvement (see the sketch after this list)
  - Comparing overall improvement
  - Justification of lower overall improvement
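One hypothetical way to frame the gap analysis and the comparable-improvement check: measure the distance from the baseline to the excellence benchmark, then compare the proposed 3-year improvement against what the fixed annual rate would have produced; a proposal that falls short would need justification. Using the 0.5-point rate as the yardstick and all of the figures below are assumptions, not the negotiated rule.

```python
# Hypothetical gap-analysis and comparability check for a proposed 3-year target.
# The 0.5-point annual yardstick and the example figures are assumptions.

def gap_to_benchmark(baseline: float, benchmark: float) -> float:
    """Percentage-point gap between the baseline and the excellence benchmark."""
    return round(benchmark - baseline, 2)

def comparable_improvement(baseline: float, proposed_year3: float,
                           fixed_annual_gain: float = 0.5, years: int = 3) -> bool:
    """True if the proposed overall improvement meets or exceeds what the
    fixed annual rate would have yielded; False means justification is needed."""
    fixed_overall = fixed_annual_gain * years
    proposed_overall = proposed_year3 - baseline
    return proposed_overall >= fixed_overall

baseline, benchmark, proposed = 62.0, 85.0, 63.2
print(gap_to_benchmark(baseline, benchmark))       # 23.0
print(comparable_improvement(baseline, proposed))  # False -> lower improvement must be justified
```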
State Feedback on New Approach
- Report Out on Peer Review Process and Approach
  - Determining performance ceilings
  - Setting baseline levels
  - Setting performance targets
- Additional State Feedback
  - Regional meetings
  - Peer Collaborative Resource Network
- Incorporating Changes into Round 2 and Round 3 Negotiations
Subindicator   Baseline Year(s)   Method
1S1                               Most Recent Year
1S2                               Most Recent Year
2S1                               3-Year Average
3S1                               Most Recent Year
4S1            6 Years ( )        Working Group Recommendation
4S2            6 Years ( )        Working Group Recommendation
1P1                               3-Year Average
1P2                               3-Year Average
2P1                               3-Year Average
3P1                               3-Year Average
4P1            6 Years ( )        Working Group Recommendation
4P2            6 Years ( )        Working Group Recommendation