The Value of Data: The Vital Importance of Accountability
American Institutes for Research, February 2005

Federal Uses of NRS Data
- Develop the report to Congress
- Determine national progress on Program Assessment Rating Tool (PART) and Government Performance and Results Act (GPRA) measures
- Assess national and state trends
- Monitor program outcomes and data quality
- Negotiate performance targets with states
- Determine whether states met prior performance targets

State Uses of NRS Data
- Evaluate local program performance
- Promote and evaluate local program improvement efforts
- Report to legislatures
- Negotiate state performance targets with the federal government

Importance of Data
- Critical to Federal accountability
- Supports funding
- Maintains the program's unique identity
- Performance standards: GPRA, Office of Management and Budget (OMB)

Government Performance and Results Act (GPRA)
- Requires annual performance targets tied to program goals for all Federal programs
- Adult education's targets are part of ED's Strategic Plan
- Targets: the percentage of students who
  - Acquire basic skills to complete a level (ABE, ESL)
  - Transition to postsecondary education
  - Obtain a GED
  - Enter employment
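The GPRA targets above are simple outcome rates: for each measure, the share of students who achieved it. A minimal sketch of that computation follows; the field names (level_completed, entered_postsecondary, etc.) are illustrative placeholders, not actual NRS reporting fields.

```python
# Hypothetical sketch of computing GPRA-style outcome percentages from
# student records. Field names are invented for illustration.

def outcome_rates(students):
    """Return the percentage of students achieving each outcome."""
    total = len(students)
    outcomes = ["level_completed", "entered_postsecondary",
                "obtained_ged", "entered_employment"]
    return {o: 100.0 * sum(1 for s in students if s.get(o)) / total
            for o in outcomes}

students = [
    {"level_completed": True,  "entered_employment": True},
    {"level_completed": False, "obtained_ged": True},
    {"level_completed": True},
    {"level_completed": False},
]
rates = outcome_rates(students)
print(rates["level_completed"])  # 50.0
```

In practice each measure is reported against its negotiated target; the sketch only shows the numerator/denominator arithmetic.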

GPRA Performance, 2000–2004
[Five chart slides showing GPRA performance trends, 2000–2004; the data are not captured in the transcript]

Program Assessment Rating Tool (PART)
- OMB review process to enforce GPRA
- Every program is reviewed and scored annually
- PART scores must be submitted to Congress with the budget request
- Secretaries use PART scores to increase or decrease program funding requests
- Congress uses PART scores to make appropriations decisions (e.g., to defund programs)

PART (FY05) Findings for Adult Education
Adult education's scores (out of 100%):
- Program purpose and design: 100%
- Strategic planning: 29%
- Program management: 67%
- Program results: 0%
Summary rating: "Results Not Demonstrated"
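The four section scores combine into a weighted composite. The 20/10/20/50 weights below are the commonly cited OMB defaults for PART sections; treat them as an assumption rather than a figure from this presentation.

```python
# Hedged sketch: combining PART section scores into a weighted composite.
# Section weights are the commonly cited OMB defaults (an assumption here).
WEIGHTS = {"purpose": 0.20, "planning": 0.10,
           "management": 0.20, "results": 0.50}
scores = {"purpose": 1.00, "planning": 0.29,
          "management": 0.67, "results": 0.00}  # FY05 adult ed. scores
composite = sum(WEIGHTS[k] * scores[k] for k in WEIGHTS)
print(round(100 * composite, 1))  # 36.3
```

Because results carry half the weight, the 0% results score dominates the composite, which is consistent with the "Results Not Demonstrated" rating.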

PART "Results" Criteria
The finding on each criterion was "No":
- Progress toward long-term goals
- Meets annual performance goals
- Demonstrates efficiency and cost effectiveness
- Compares favorably to other programs
- Independent evaluation of program effectiveness

PART: Example from an Appropriations Committee Report
"The Committee recommends no funding for [this program]. [ED] has not developed performance indicators consistent with the requirements of GPRA…. the Committee has chosen to focus its resources on higher priority programs." (p. 197)

From the President's 2006 Budget
"[Reduced funding for this program] … is consistent with the Administration's goal of decreasing funding for programs with limited impact or for which there is little or no evidence of effectiveness. A PART analysis of the program … produced a Results Not Demonstrated rating. The program was found to have a modest impact on adult literacy, skill attainment, and job placement, but data quality problems … made it difficult to assess the program's effectiveness."

Independent Program Evaluation of Effectiveness
- The one element of PART where we failed; an evaluation study may be coming
- To show impact, we need:
  - Good assessment data
  - Assessments administered correctly
  - Pre- and post-test scores
  - Program models: program goals, approach, student participation
  - A meaningful instructional approach, with standards or a framework
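Pre- and post-test scores demonstrate impact through educational gain: a student shows gain when the posttest places them in a higher functioning level than the pretest. The sketch below illustrates the mechanics; the cut scores are invented for illustration, not actual NRS level boundaries.

```python
# Illustrative sketch of NRS-style educational gain from pre/post scores.
# Cut scores below are invented, not real assessment level boundaries.

CUT_SCORES = [200, 210, 220, 235, 246]  # hypothetical level boundaries

def functioning_level(score):
    """Map a scale score to a 0-based educational functioning level."""
    level = 0
    for cut in CUT_SCORES:
        if score >= cut:
            level += 1
    return level

def made_gain(pre, post):
    """A student makes gain if the posttest level exceeds the pretest level."""
    return functioning_level(post) > functioning_level(pre)

print(made_gain(205, 222))  # True: posttest crosses two level boundaries
```

This is why correct administration matters: mis-administered pre- or post-tests shift scores across level boundaries and distort the gain measure itself.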

Setting State Performance Standards
- Key to improving national performance
- Standards promote continuous improvement
- Problems result from:
  - High intra-state variance across years
  - Wide variation among states
  - Performance that vastly exceeds the negotiated level

[Three chart slides illustrating these problems: intra-state variance across years, wide variation among states, and performance vastly exceeding negotiated levels; the data are not captured in the transcript]

Setting State Performance Standards: Negotiating Targets
- Compare with past years' performance
- Compare to the national median and range
- Equal or exceed actual performance
- Show continuous improvement
- Account for state factors:
  - Initiatives, policies, politics
  - Attendance and student variables
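The negotiation guidance above can be sketched as a simple heuristic: start from actual past performance, pull toward the national median when it is higher, and add a continuous-improvement increment. The blending rule and improvement amount are invented assumptions, not OMB or ED policy.

```python
# Hedged sketch of one target-negotiation heuristic. The halfway blend
# toward the national median and the +1 point improvement increment are
# assumptions for illustration only.

def propose_target(past_rates, national_median, improvement=1.0):
    """Propose a next-year target from past performance (percentages)."""
    last = past_rates[-1]
    # Equal or exceed actual performance: baseline at the better of the
    # most recent year and the multi-year average.
    baseline = max(last, sum(past_rates) / len(past_rates))
    # Pull halfway toward the national median only if it is higher.
    blended = max(baseline, (baseline + national_median) / 2)
    return round(blended + improvement, 1)  # require continuous improvement

print(propose_target([38.0, 40.0, 41.0], national_median=44.0))  # 43.5
```

A rule like this also addresses the problems named earlier: anchoring to multi-year performance damps intra-state variance, and the median pull keeps targets from sitting far below what a state actually achieves.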

Continued Vital Importance of Accountability Data
- The situation is similar to 1995
- Reauthorization: the program will be evaluated
- Funding must be defended in a time of huge deficits
- Renewed administration interest in block grants, with a workforce focus
- Need to demonstrate the value and identity of the adult education program

Discussion
- Need for valid and reliable data to counter the PART findings
- High quality data
- Improved GPRA measures for the program
- Continuous program improvement is essential
- Demonstrate program effectiveness
- What improvements are needed?