Hudson Area Schools - Lincoln Elementary
School Improvement Professional Development
Friday, February 26, 2010
School Improvement at Lincoln Elementary
Information and background on school improvement at Lincoln Elementary:
~ where we’ve been
~ where we want to be
School Improvement Process… One Common Voice – One Plan
School Improvement Planning Process (a cycle centered on student achievement):
- Gather: Getting Ready, Collect Data, Build Profile
- Study: Analyze Data, Set Goals and Measurable Objectives, Research Best Practice
- Plan: Develop Action Plan
- Do: Implement Plan, Monitor Plan, Evaluate Plan
One Common Voice – One Plan School Improvement Process
- Gather: Getting Ready; Collect Data (School Data Profile, School Process Profile); Build Profile
- Study: Analyze Data (School Data Analysis, School Process Analysis, Summary Report); Set Goals & Measurable Objectives; Research Best Practice
- Plan: Develop Action Plans
- Do: Implement Plan, Monitor Plan, Evaluate Plan
The Gather and Study stages together make up the Comprehensive Needs Assessment, which feeds the School Improvement Plan.
One Common Voice – One Plan: Creating a Common Vocabulary for School Improvement
A comprehensive needs assessment* includes five components:
1. School Data Profile
2. School Data Analysis
3. School Process Profile
4. School Process Analysis
5. Summary Report
*Comprehensive needs assessments may vary; however, in Michigan, the School Process Profile must include one of four designated options.
One Common Voice – One Plan: Michigan Comprehensive Needs Assessment
In Michigan, all schools must complete one of these four School Process Profile options:
- School Process Rubrics (90), or
- EdYes! Subset (40), or
- Standards Assessment Report (SAR), or
- Self Assessment (SA)
(Sources: MDE and NCA.)
The Profile
The profile summarizes your school and students through clear and compelling data. It:
- Provides a rich and accurate description of reality
- Is a living document that continuously unfolds
- Influences decisions, efforts, and actions
Use the Profile to…
- Engage in data-driven decision-making
- Identify & validate strengths
- Identify & validate challenges
- Illustrate patterns and trends
- Archive information
One Common Voice – One Plan
Gather: Collect Data
- Demographic Data: student gender, grade, age, ethnicity, race, conduct, attendance, socio-economic status, free/reduced lunch, parent involvement; teacher degree(s), certification, years of experience, transience, professional affiliation
- Student Learning Data: assessments for and of learning, local classroom assessments, statewide assessments, national assessments, interim assessments, dropout rates, college acceptance rates
- School Process Data: school programs, current offerings, instructional practices, professional development, professional affiliations, extra-curricular activities, budget allocations, facility usage, technology distribution, staffing patterns, supervision practices, policies and procedures, school process rubrics (40 or 90) or SA/SAR (NCA), teachers & staff, course offerings
- Perceptions Data: school culture, success, collective efficacy, norms, school quality, teacher quality, staff commitment
Guiding questions: What do you already know? What data do you need to know? What additional information/data do you need to know? Where can the information/data be found?
One Common Voice – One Plan
Gather: Collect Data – Types of Data
- Demographic Data: describes our students, staff, district, and community
- Student Learning Data: how our students perform academically on federal, state, and local assessments
- School Process Data: disciplinary information, policies and procedures, school process rubrics
- Perception Data: survey data, opinion
Data Mining
Hudson Lincoln Elementary
February 26, 2010
The “veins” we are mining:
- Statewide – student learning on a limited number of the GLCEs
- Interim – student learning on a limited number of learning targets
- Classroom – student learning on the taught curriculum
Levels of Student Learning Assessments (“X” represents an assessment opportunity)
Fall 2008 / Fall 2009 / Fall 2010:
- Statewide Assessment (MEAP): X / X / X
- Interim Assessment (DIBELS, DRA, STAR): XXX / XXX / XXX
- Classroom Assessment (unit tests, common writings with rubrics): XXXXXX XXXXXX XXXXXX / XXXXXX XXXXXX XXXXXX / XXXXXX XXXXXX XXXXXX
Words of encouragement from other assessment miners…
- All assessments are finite resources
- Having a variety of tools makes your mining experience more worthwhile
- It is better to work together while in the mines!
FERPA Questions and Answers
Lenawee Data Camps, June and August 2009
True or false: To be considered an “education record,” information must be maintained in the student’s cumulative or permanent folder.
False: any record that includes a student’s name is an education record, regardless of where it is kept.
You are in charge of a staff meeting to study student achievement on school improvement goals. As part of the meeting, you show the entire staff a report of student scores on a statewide assessment; the report displays student names, and you have also handed out paper copies. It is a violation of FERPA to display the results of the assessment to the entire staff. The exception would be a group of teachers working on strategies for specific students, since that group has a “legitimate educational interest” in the information.
What are we looking for in the mines today?
Statewide – MEAP:
- Overview of each grade level’s performance
- Individual student performance
- Grade-level strand and GLCE analysis
- Review of 08-09 goals with Fall 2009 data
Interim – DIBELS, DRA, STAR:
- Scores
- Benchmark targets
Classroom – Unit Tests, Common Writings:
- Scores
- GLCE alignment
Goal Teams – Data Review & Discussion…
Organize into K-5 Goal Teams: Math, Writing, Reading
Goal Teams – Data Review & Discussion: Guiding Questions…
- What subgroups are not performing successfully, according to our data?
- What factors may lead to differences between successful and unsuccessful students? (taught curriculum, program participation, transition / “school of choice” students, demographics)
- How effective are the school improvement strategies in assisting students to become successful?
- What are the relationships between the levels of student learning assessments (statewide, interim, and classroom)?
Time for Lunch!
Grade Level Teams – “Jigsaw” Report Out
- Organize into grade level teams
- Content area team members report out on the data discussion for their specific grade level
- Use the guiding questions to help frame your sharing and discussion
Grade Level Teams – “Jigsaw” Report Out: Guiding Questions…
- Who are the successful and unsuccessful students at our grade level? What traits do they each have in the content areas?
- How might we develop strategies and assessments to create more success at this grade level?
- When and how often should we study student achievement using assessment data in the future?
Whole Staff – Report Out & Sharing: “3 – 2 – 1” (for each grade level)
3 – Three critical findings from the data to add to the school data profile
2 – Two potential improvement efforts to increase success
1 – One data source that should be collected and analyzed in the future
Developing a Process Which Supports Ongoing School Improvement “Building a Calendar”
Wrap-Up
Next Steps
Reflection & Feedback on the Day!