Delivering on the Promise of Data in Education Winter Forum February 2013 Jack Buckley Commissioner National Center for Education Statistics.



A Really Simple Theory of the Tradeoff Between Quality and Accountability
[Chart: Data Quality (y-axis) against Accountability Pressure (x-axis), contrasting "what we wish were true" with "how it is"]
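The tradeoff on this slide can be sketched as a toy model: quality rises with moderate accountability pressure, then falls as high stakes invite distortion, and "shifting the curve up" raises quality at every pressure level. The functional form and constants below are illustrative assumptions, not anything estimated from real data.

```python
# Toy model of the hypothesized inverted-U relationship between
# accountability pressure and data quality. The quadratic form and
# the 0-1 scales are illustrative assumptions only.

def data_quality(pressure: float, shift: float = 0.0) -> float:
    """Quality as a concave function of pressure, both on a 0-1 scale."""
    if not 0.0 <= pressure <= 1.0:
        raise ValueError("pressure must be in [0, 1]")
    # Peaks at pressure = 0.5; 'shift' moves the whole curve up or down.
    return max(0.0, min(1.0, 4 * pressure * (1 - pressure) + shift))

# Moderate pressure beats both extremes under this stylized model.
assert data_quality(0.5) > data_quality(0.0)
assert data_quality(0.5) > data_quality(1.0)
# Shifting the curve up improves quality at every pressure level.
assert data_quality(0.2, shift=0.1) > data_quality(0.2)
```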

The "Chowky Dar" Region: Increased Accountability Pressure Improves Quality
[Chart: Data Quality against Accountability Pressure, low-pressure region]

Chowky Dar?
"The government are very keen on amassing statistics. They collect them, add them, raise them to the nth power, take the cube root and prepare wonderful diagrams. But you must never forget that every one of these figures comes in the first instance from the chowky dar (village watchman in India), who just puts down what he damn pleases." –Josiah Stamp
- No accountability = no incentive for accurate reporting
- Some accountability (even "soft") can yield improvements in quality through reduction of "noise"
- Example: last year's U.S. News and World Report high school rankings and the quality of school-level CCD data

The High-Stakes Region: Too Much Pressure Can Distort the Data
[Chart: Data Quality against Accountability Pressure, high-pressure region]

High-Stakes
- Here the threat is usually not noise but intentional misreporting.
- As the stakes get higher, some percentage of the individuals involved in the system will subvert it.
- Example: the intentional cheating by some Atlanta Public Schools teachers and administrators on the Georgia state assessment.
- Accountability pressure can be reduced through sampling (NAEP versus state tests) or through confidentiality via aggregate reporting, but these are not options for every application.
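Confidentiality through aggregate reporting, mentioned above as one way to lower the stakes, is often implemented as small-cell suppression: counts below a threshold are withheld so no individual can be singled out. A minimal sketch, in which the threshold of 10 and the record layout are illustrative assumptions:

```python
# Minimal sketch of small-cell suppression for aggregate reporting.
# The threshold of 10 and the table layout are illustrative assumptions.

SUPPRESSION_THRESHOLD = 10

def suppress_small_cells(table: dict[str, int]) -> dict[str, object]:
    """Replace counts below the threshold with a suppression marker."""
    return {
        group: (count if count >= SUPPRESSION_THRESHOLD else "<suppressed>")
        for group, count in table.items()
    }

counts = {"School A": 132, "School B": 7, "School C": 45}
print(suppress_small_cells(counts))
# {'School A': 132, 'School B': '<suppressed>', 'School C': 45}
```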

Can We Get to the Middle? Only if We Control How the Data Are Used. Good Luck with That.
[Chart: Data Quality against Accountability Pressure]

Q: If policy makers set the level of accountability pressure, what can we do?
A: Find ways to shift the whole curve up.
[Chart: Data Quality against Accountability Pressure]

So How Can We Shift the Curve?
- Cooperation, training, technical assistance, and sharing best practices: in other words, the Forum and related efforts.
- Improvements in collection technology (broadly defined) can help: SLDSs, better edit checks, smarter tools, automation, integrated systems, methods for detecting distortion, common data standards.
- This is the basic strategy behind NCES's activities in administrative data: build a strong community, invest in better systems, develop common standards, and improve our tools and technology.
- In short, if we can shift the curve, data quality can theoretically be improved at all levels of accountability.
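One of the "better edit checks" named above can be as simple as flagging implausible year-over-year changes before a submission is accepted. A hedged sketch: the 25% tolerance and the record layout are assumptions for illustration, not an actual NCES rule.

```python
# Sketch of an automated edit check: flag schools whose reported
# enrollment changed implausibly between two collection years.
# The 25% tolerance and the data layout are illustrative assumptions.

def flag_implausible_changes(prior: dict[str, int],
                             current: dict[str, int],
                             tolerance: float = 0.25) -> list[str]:
    """Return IDs whose enrollment moved by more than `tolerance`."""
    flagged = []
    for school, old in prior.items():
        new = current.get(school)
        if new is None or old == 0:
            continue  # missing report or new school: handled elsewhere
        if abs(new - old) / old > tolerance:
            flagged.append(school)
    return flagged

prior = {"0001": 400, "0002": 650, "0003": 90}
current = {"0001": 410, "0002": 650, "0003": 300}  # 0003 tripled
print(flag_implausible_changes(prior, current))  # ['0003']
```

A flag like this does not prove distortion; it routes a record to human review, which is where "noise" gets separated from misreporting.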

How NCES Is Shifting the Curve Today and Tomorrow
- CEDS v3.0 at the end of January: on time and almost under budget
- Continued development work on the CEDS Align and Connect tools
- Assisting OCR with the redesign of the CRDC collection tool
- Working with EDFacts and state partners to integrate from CEDS to data groups
- Maintaining and improving our Forum, MIS/Summer Data, and SLDS conferences to work with the field
- Working with our partners to reexamine and improve the back end of collection systems like CCD, CRDC, EDFacts, and IPEDS
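The value of a common standard like CEDS is easiest to see in a crosswalk: each agency's local field names map onto one shared vocabulary, after which records from different systems become directly comparable. The element and field names below are hypothetical, not actual CEDS identifiers.

```python
# Hypothetical crosswalks from two agencies' local field names to a
# shared standard vocabulary. The element names are illustrative only,
# not actual CEDS identifiers.

STATE_A_TO_STANDARD = {"stu_bday": "BirthDate", "gr_lvl": "GradeLevel"}
STATE_B_TO_STANDARD = {"dob": "BirthDate", "grade": "GradeLevel"}

def to_standard(record: dict, crosswalk: dict) -> dict:
    """Rename a record's local fields to their standard equivalents."""
    return {crosswalk.get(field, field): value
            for field, value in record.items()}

a = to_standard({"stu_bday": "2004-09-01", "gr_lvl": "07"}, STATE_A_TO_STANDARD)
b = to_standard({"dob": "2004-09-01", "grade": "07"}, STATE_B_TO_STANDARD)
assert a == b  # the two agencies' records are now comparable
```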

But Data Quality Is Not the Only Challenge to Usefulness
- Some uses of the data are obvious and benefit obviously from quality improvement: simple descriptive statistics, monitoring and early warning systems, feedback reports.
- Other uses (prediction, causal inference, evaluation) can only be improved so much by shifting the curve. Even with perfectly measured data, there are many threats to valid inference.
- Large-scale IT systems, shiny technology, and "Big Data" are not substitutes for the principles of careful scientific research design. There are no easy answers to difficult questions.
- The surest way to end the new era of data in education is through overpromising and large-scale failure.
- "Those who ignore Statistics are condemned to reinvent it." –Brad Efron