

Shiawassee County’s Data Project

October 25, 2007 AESA

Why a Data Project?
Student achievement is the goal of all school districts. Resources (time & money) are limited. NCLB and Education YES! have created a high-stakes situation for schools. The use of data to make informed decisions is more crucial than ever.

Student Achievement is the Goal

What Are Our Needs?
In 1995, Shiawassee County schools determined they had a need and a desire to enter into a countywide assessment project. As a result, a county assessment committee was formed and a search was conducted for a common assessment. In 1997, online testing began; seventy-five percent of the districts in the county participated in the project.

Since 1997…
An online testing system has been used countywide, multiple times a year, every year. K–5 testing requirements were instituted by the state. NCLB was created and implemented. Shiawassee County schools have expanded their assessment capabilities and activities, as well as their data use and analysis abilities.

As Our Expertise Has Grown…
– One year of Wahlstrom work/study provided a background in the multiple types of data (outcome, demographic, process, and perception).
– Three years of data packets have been used to make limited school improvement decisions.
– Trainings and experiences took place using a variety of data for decision-making purposes (Savy with SASI, Test Wiz, school improvement planning days, etc.).
– County assessment committees were formed and their results analyzed, which further stressed the need for data-based solutions.
– Local service planning results found county consensus that data organization and analysis was a need.
– The County Assessment Survey and CCIC surveys discovered that districts were looking for similar attributes in a data system.

The Question Now Is…
What do schools need to use data effectively:
– as a means to monitor a program’s impact on student achievement?
– to identify the most critical opportunity areas on which to focus school improvement efforts?

A Data Warehouse
A tool to help districts become data driven in order to meet the requirements of NCLB and Ed YES! A collection of various sets of data, found in a variety of unrelated locations and formats, brought into one relational database. A system that will allow districts to ask complex questions and find answers that uncover underlying problems, leading to the design of data-driven student achievement and school improvement strategies. A program that will incorporate data into a fully relational data warehouse that includes:
– Financial data
– Personnel data
– Building infrastructure data
– Student demographic data
– Student achievement data
– Assessment data
and answers a variety of diverse and interactive questions easily.
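The kind of cross-source question such a warehouse enables can be sketched with a toy relational store. This is a minimal illustration only; the table and column names below are hypothetical assumptions, not the actual warehouse schema:

```python
import sqlite3

# Minimal sketch: table and column names are illustrative assumptions,
# not the actual Shiawassee warehouse schema.
conn = sqlite3.connect(":memory:")
cur = conn.cursor()

# Two formerly unrelated extracts loaded into one relational store.
cur.execute("CREATE TABLE demographics (student_id INTEGER, grade INTEGER, lunch_status TEXT)")
cur.execute("CREATE TABLE assessments (student_id INTEGER, test TEXT, score INTEGER)")
cur.executemany("INSERT INTO demographics VALUES (?, ?, ?)",
                [(1, 4, "free"), (2, 4, "paid")])
cur.executemany("INSERT INTO assessments VALUES (?, ?, ?)",
                [(1, "MEAP-math", 410), (2, "MEAP-math", 388)])

# A question no single flat file can answer on its own:
# average math score broken out by lunch status.
cur.execute("""
    SELECT d.lunch_status, AVG(a.score)
    FROM demographics d
    JOIN assessments a ON d.student_id = a.student_id
    WHERE a.test = 'MEAP-math'
    GROUP BY d.lunch_status
""")
print(cur.fetchall())
```

The join is the point: once demographic and assessment records share a key, a disaggregation question becomes one query instead of a manual cross-referencing exercise.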

Many Programs Are Data-Mining Tools
They address the following in isolation:
– Assessments: most of these tools do not contain data from sources beyond student demographics.
– Student information systems: these packages were not designed to link data from multiple years with assessment and special-program data, or even teacher data.
– Document storage: there is no association with student, teacher, and assessment data in order to identify areas to target for school improvement.

The Answer…
We have experience with data mining (through MAPS, data packets, etc.). We are ready to move to the next level. Our local districts all agree that an interactive data warehouse is what would best meet their needs.

Data Warehouse Timeline
Academic year: SRESD staff attended multiple vendor demonstrations throughout the state.
September 2005: RFI requirements and criteria established.
September – October 2005: The opportunity to submit an RFI was given to:
Achieve! Data Solutions; Chancery SMS; Compass; CREST; CRM; dataMetrics Software, Inc.; Edmin.com, Inc.; Edsmart; Enterprise Computing Service, Inc.; eScholar LLC; Executive Intelligence, Inc.; IBM; Just 5 Clicks; Kent ISD; MI Tracker; Midwest Educational Group; National Study of School Evaluation; Pearson School Systems; Performance Matters; Plato; QSP; Regional Data Services; Riverdeep, Inc.; Sagebrush Corp.; SCHOLARinc; SchoolCity, Inc.; Schoolnet, Inc.; School Interoperability Framework; Skyward; Swiftknowledge, Inc.; TetraData Corp.; TurnLeaf.

Data Warehouse Timeline (continued)
September – October 2005: RFIs received from:
Achieve! Data Solutions, LLC; Edmin.com, Inc.; Edsmart; eScholar LLC; Kent ISD; MidWest Educational Group; Pearson School Systems; Sagebrush Corp.; SchoolCity, Inc.; TetraData Corp.
November 1–3, 2005: Prescreening occurred; eliminated at the RFI stage:
Edmin.com, Inc.; Kent ISD; MidWest Educational Group; Sagebrush Corp.
November 10, 2005: RFI committee review of the remaining candidates by the county data project committee, consisting of curriculum directors, technology experts, principals, and teachers:
Achieve! Data Solutions; Edsmart; eScholar LLC; Pearson School Systems; SchoolCity, Inc.; TetraData Corp.

RFI Committee Review Criteria
Each criterion was weighted as Critical, Important, or Bonus:
– Training requirements
– Speed and efficiency of data
– Easy to query
– Drill-down capabilities
– Longitudinal data capabilities (consider the number of years as well as the ability)
– Formats (charts, graphs, etc.; graphs in the system, including longitudinal, that do not require export)
– Pre-formatted reports
– Export capabilities
– Web based
– Multiple levels
– Importability
– Proven data inputs (data elements cited that are supported by research)
– Customized fields
– Customized reports/flexibility
– Student work tracked
– Michigan Curriculum Frameworks/GLCEs addressed
– SIF compliant
– Testing capabilities
– Achievement data
– Demographic data
– Process data
– Perception data
– Support

Narrowing the Field
November 10, 2005: The final two vendors were chosen for in-house demonstrations: Achieve! Data Solutions and TetraData Corp. Demonstration requirements and scoring criteria were determined by the county data project committee.
December 6, 2005: In-house demonstrations by Achieve! Data Solutions and TetraData Corp.

The Process
Vendors received the demonstration requirements:
– Demonstration for users at six different levels: ISD, superintendent, curriculum director, principal, teacher, parent.
– Samples of queries, pre-formatted reports (drilled down to the individual student level), charts, and graphs; creation of non-pre-formatted queries and reports; different types of data “interaction”; a “live” data import.
– Examples of the training model, including content and timing (timeline for setup, data upload, and implementation), as well as an explanation of the technical and user support available.
– Explanation of all costs (including components needed, technology costs, start-up costs, annual costs, consultation costs, and training/support costs).
– Explanation of the frequency of updates (how often new data is added to the warehouse and when that data will “show up” in reports/queries).

Demonstration Criteria
SRESD district participants saw the vendor demonstrations and rated the products on the following criteria:
– Ease of use
– Charts and graphs
– Drill-down ability
– Pre-formatted reports
– Pre-formatted queries
– Creation of customized reports
– Creation of customized queries
– Interaction of process and perception data with achievement and demographic data
– Data upload speed and efficiency
– Training
– Initial setup timeline
– Technical support
– User support
– Frequency of updates

The Final Selection

Data Framework for Continuous Improvement
Gaining active insight by analyzing data to improve learning for all students.
– Demographics: enrollment, mobility, attendance, dropout/graduation rate, ethnicity, gender, grade level, teachers, language proficiency
– Perceptions: perceptions of learning environments, values and beliefs, attitudes, questionnaires, observations
– Student Learning: standardized tests, norm/criterion-referenced tests, grade point, formative assessments
– School Processes: programs, instructional strategies, classroom practices, assessment strategies, summer school, finance, transportation
Copyright © Education for the Future Initiative, Chico, CA. 2005 TetraData Confidential.

Analyzer and DASH
The warehouse has two components. Analyzer allows for flexible, complex reports: comparing variables, longitudinal reports, looking at trends, etc. DASH provides a snapshot of an identified issue.
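A longitudinal trend report of the kind Analyzer produces reduces, at its core, to grouping scores by year. A minimal sketch, with made-up data values (not actual district results):

```python
# Hypothetical sketch of an Analyzer-style longitudinal query:
# average score per year, to expose a trend. Values are illustrative.
records = [
    {"year": 2005, "student_id": 1, "score": 395},
    {"year": 2005, "student_id": 2, "score": 401},
    {"year": 2006, "student_id": 1, "score": 402},
    {"year": 2006, "student_id": 2, "score": 410},
]

def trend(rows):
    """Group scores by year and return the yearly averages in order."""
    by_year = {}
    for r in rows:
        by_year.setdefault(r["year"], []).append(r["score"])
    return {y: sum(s) / len(s) for y, s in sorted(by_year.items())}

print(trend(records))  # {2005: 398.0, 2006: 406.0}
```

The value of the warehouse is that the grouping key can be anything stored relationally (year, subgroup, program), not just what one source file happens to contain.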


Building a Warehouse
Step 1 – Data Discovery: defining and acquiring the data to be included in the data warehouse.
Step 2 – Mapping: mapping data from the source system(s) to the data warehouse; aligning the various data elements into common folders.
Step 3 – Engineering: building the warehouse and adding the data.
Step 4 – Quality Assurance: querying the warehouse to determine whether the data is mapped and loaded accurately.
Step 5 – Implementation: using the warehouse to make data-driven decisions.
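Step 2 (Mapping) can be pictured as a rename table from each source system's field names to the warehouse's common names. The source names and fields below are hypothetical, for illustration only:

```python
# Hypothetical sketch of Step 2 (Mapping): different source systems name
# the same element differently; a mapping table aligns them to one
# common warehouse field. All field names here are illustrative assumptions.
FIELD_MAP = {
    "sis_export": {"StuID": "student_id", "EthnicCode": "ethnicity"},
    "meap_file":  {"StudentNumber": "student_id", "EthGrp": "ethnicity"},
}

def to_warehouse(source: str, record: dict) -> dict:
    """Rename a source record's fields to warehouse names, dropping unmapped ones."""
    mapping = FIELD_MAP[source]
    return {mapping[k]: v for k, v in record.items() if k in mapping}

print(to_warehouse("meap_file", {"StudentNumber": 17, "EthGrp": "H", "Junk": 1}))
# {'student_id': 17, 'ethnicity': 'H'}
```

Step 4 (Quality Assurance) then amounts to querying the mapped output and checking that counts and values match the source extracts.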

Data Available in the Warehouse Right Now
– Achievement data: MEAP, grades, GPA, courses, credits, teachers
– Student data: subgroups, lunch status, special education, language, ethnicity, discipline, attendance
– Process data: Title One programs, extracurricular activities, programming

Two Distinct Uses
– Summary reports: annual or semi-annual long-term results, after instruction
– Monitoring reports: ongoing checks of progress
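The distinction can be made concrete with a toy example: a summary report aggregates the whole year after instruction, while a monitoring report checks the latest reading against a goal. All names, values, and the goal threshold are illustrative assumptions:

```python
# Illustrative only: assessment windows, scores, and the goal of 70
# are made-up values, not actual district data.
scores = {"fall": 62, "winter": 68, "spring": 74}

def summary_report(by_window):
    """Annual, after-instruction: one overall figure for the year."""
    return sum(by_window.values()) / len(by_window)

def monitoring_report(by_window, goal=70):
    """Ongoing: is the most recent reading on track toward the goal?"""
    latest = list(by_window.values())[-1]
    return {"latest": latest, "on_track": latest >= goal}

print(summary_report(scores))     # 68.0
print(monitoring_report(scores))  # {'latest': 74, 'on_track': True}
```

The same warehouse data serves both: the summary answers "how did we do?" once instruction is over, while the monitor answers "are we on track?" while there is still time to adjust.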

Welcome to Our Warehouse

Using Our Warehouses
– Summer school intervention identification
– Annual report
– Achievement trends
– Professional development planning, regionally based on student achievement (Ds and Fs)
– District and building profiles
– Program evaluation
– NCA goal identification and monitoring
– Responses to board questions
– Grant applications (special education, writing, math, CASM, etc.)

Professional Development Then, Now, and in the Future (based on a Zoomerang survey)
– Five-day Analyzer trainings (all eight districts have been involved)
– Local district and board overview presentations
– Dabbling with Data sessions
– RtI and program evaluation
– Office professional training (2 days)
– Report of the Month
– Specialized group trainings
– Data Ambassador training
– Half-day specialized-skill ongoing trainings
– Annual report, NCA goal, and profile update topic work sessions
– Etc.


Questions?