Oklahoma Data Pipeline Project Needs Assessment Survey: Process and Findings
John Kraman, Oklahoma SDE
Nancy J. Smith, DataSmith Solutions
Thursday, 2/14/2013
26th Annual Management Information Systems (MIS) Conference

Oklahoma Landscape
- 670,000 overall enrollment
- 537 districts, ranging between 40 and 40,000+ in enrollment
- State law forced a switch in 2011 to centralized IT (OMES versus SDE)
- WAVE, EWIS, EDFacts, and upcoming projects

Needs Assessment Survey
- Electronic survey of all districts
- Disseminated by SDE and CCOSA
- Open September through December 2012
- Six sections:
  - Technology
  - Data standards and documentation
  - Data-related training
  - Data governance
  - Communication
  - Financial: staff, resources, costs

Follow-up Engagement
- Voluntary participation via:
  - Follow-up phone calls
  - Follow-up emails
  - In-person focus groups
- Two focus groups in late November
  - Included superintendents and technology staff
  - Held in person in Oklahoma City, not at the SEA

Participation in Survey
- Responses from 259 of 537 districts (48.2%)
- 184 respondents provided a district name
- 173 distinct districts (multiple responses from 11 districts)
- 108 superintendents (out of 175 who provided a role or title)
- District size of the 173 districts:
  - 500 or fewer: 72 respondents
  - 501-1,000: 44
  - 1,001-5,000: 44
  - 5,001-10,000: 7
  - >10,000: 6
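The participation figures above are internally consistent; a minimal sketch (using only the numbers from the slide) verifies the response rate and the duplicate-response count:

```python
# Figures taken from the slide; this is only an arithmetic sanity check.
total_districts = 537
responses = 259
named_responses = 184      # responses that included a district name
distinct_districts = 173   # distinct named districts

response_rate = responses / total_districts * 100
print(f"Response rate: {response_rate:.1f}%")        # matches the reported 48.2%

# 184 named responses covering 173 districts implies 11 districts responded more than once.
extra_responses = named_responses - distinct_districts
print(f"Districts with multiple responses: {extra_responses}")
```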

Technology
- Trouble uploading or submitting files:
  - WAVE: 62% sometimes or frequently (122)
  - EDFacts: 44% sometimes or frequently (108)
- Problems center on:
  - Confusing error messages (65%, n=122)
  - Delays with submission (51%, n=95)
  - Data or file correction process (65%, n=155)
- Areas that rarely or never cause problems:
  - Data element formatting (19%, n=36)
  - File size limits (15%, n=29)
  - Interoperability standards and processes (12%, n=23)

Data Standards and Documentation
- Clarity of documentation about:
  - File submission: 38% good or excellent (77)
  - File formatting: 36% good or excellent (73)
  - File due dates: 36% good or excellent (83)
- Usefulness of documentation about:
  - File submission: 53% useful or very useful (107)
  - File formatting: 52% useful or very useful (104)
  - File due dates: 53% useful or very useful (107)
- Ease of access to documentation about:
  - File submission: 34% easy or very easy (68)
  - File formatting: 34% easy or very easy (67)
  - File due dates: 33% easy or very easy (67)

Data-Related Training
Training topics rated as Sufficient / Not Sufficient / Want More Information:
- Data privacy and confidentiality: 55.9% / 28.0% / 23.7%
- Data security
- File creation
- File submission
- Data element format & definition
- Data access management
- Data reports and analysis
- Checking data quality or accuracy
- Data sharing processes and agreements
- Data/file correction process
- Data exchange with other districts

Data Governance
- 95% of respondents indicated that they don't participate in SDE data governance or advisory committees (190)
- 76% are unaware of SDE data governance activities (151)
- 63% indicated their districts have designated data stewards responsible for specific elements (124)
- 43% indicated their districts have a designated data coordinator (83)

Communication re: Data Requirements
Respondents who find current communication:
- Informative: 56% (107)
- Helpful: 52% (99)
- Disseminated to the right people: 51% (95)
- Clear: 43% (82)
- Frequent enough: 41% (78)
- Detailed enough: 37% (71)
- Timely: 34% (66)
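Each percentage above is paired with a raw count, so the respondent base for each item can be inferred (count divided by percentage). A short sketch, assuming the parenthesized numbers are counts of respondents agreeing with each statement:

```python
# (percentage, count) pairs taken from the slide.
items = {
    "informative": (0.56, 107),
    "helpful": (0.52, 99),
    "right people": (0.51, 95),
    "clear": (0.43, 82),
    "frequent enough": (0.41, 78),
    "detailed enough": (0.37, 71),
    "timely": (0.34, 66),
}

# Implied respondent base for each item: count / percentage.
for label, (pct, count) in items.items():
    print(f"{label}: ~{count / pct:.0f} respondents")
```

Running this shows each item was answered by roughly 190 respondents, consistent with the survey's participation numbers.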

Financial and Resources
Biggest concerns:
- Enough staff to manage collections: 82% (154)
- Time & resources for file creation, validation & submission: 68% (124)
- Data quality: 62% (115)
- Sustaining resources for district SIS: 61% (112)

Financial and Resources
"What services or resources do you wish SDE could provide to reduce your costs?"
- Improved access to SDE data and reports: 44% (n=83)
- A statewide SIS: 38% (n=71)
- Improved interface or portal for use with file uploads: 35% (n=65)

Synopsis
- Better management of existing processes and documentation
- Better communication, specifically about changes to data requirements, new tools, and upcoming plans
- Fewer last-minute changes to collections
- Do a few things well rather than attempting many big changes at once
- More transparency about processes and governance
- More engagement from the field to ensure process and communication management meets LEA needs and understanding
- Partnership and clear definition of roles and responsibilities among SDE, OMES, and LEAs

SDE Response and Plans
- Data governance committees
- Update roles and responsibilities between SDE and OMES
- More internal capacity around data at SDE
- ???