Published by Andrew Dean. Modified over 9 years ago.
Leadership Academy “Data Analysis” June 2010
Some thoughts about data analysis… We have moved from a system where hunches and “cardiac data” were followed to a system that is data driven. We have moved from assuming good things are happening to insisting that good things are happening, and proving this is so. Teaching is an art… and it is becoming a science.
Learning Outcomes
Participants will identify:
- Key ideas about effective data-based decision making
- Four types of data
- Challenges within our district's ITBS/ITED results
- Findings & results from last year's Leadership Academy
- Effective strategies to help improve student achievement
- Ideas to include within our revised Strategic Plan
8 Key Points
1. Data provide insight.
2. Data are nothing without analysis.
3. Interpret graphs accurately.
4. Improvement means more success, less failure, and less unwanted variation!
8 Key Points (cont.)
5. Real-time, frequent data are necessary to monitor improvement.
6. Data must be accessible to users.
7. To improve results, work on both products and processes.
8. Take action!
Leaders Ask Good Questions
- What does the graph look like?
- What patterns or trends do the data show?
- Special or common cause?
- Over-reacting or under-reacting?
“Is this the complete picture?” Are there other assessments we need to look at? What other data might we need to confirm, support, and/or clarify what we are seeing now?
How do I know if what I'm seeing is real? Try to use multiple data sources. Triangulation occurs when three different data sources all indicate the same thing:
- ITBS scores
- Unit test scores
- District test scores
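The triangulation idea above can be sketched in code. This is a minimal, illustrative sketch: the three source names come from the slide, but the proficiency flags and the decision rule are invented for the example, not any district system.

```python
# Triangulation: three independent data sources pointing the same way
# give far more confidence than any single score.

def triangulate(itbs_proficient, unit_test_proficient, district_test_proficient):
    """Return a judgment based on whether all three sources agree."""
    flags = [itbs_proficient, unit_test_proficient, district_test_proficient]
    if flags.count(flags[0]) == len(flags):
        return "triangulated: all three sources agree"
    return "mixed signals: gather more data before acting"

# ITBS, unit tests, and the district test all say "proficient":
print(triangulate(True, True, True))   # triangulated: all three sources agree
# Sources disagree, so the finding is not yet real:
print(triangulate(True, False, True))  # mixed signals: gather more data before acting
```

The point of the sketch is the decision rule: act on a finding only when independent sources confirm it.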
Why do some people dislike (fear?) data analysis? Share your thoughts with someone at your table.
Why do some people dislike (fear?) data analysis? Data analysis…
- unlocks the door to the traditionally private domain of the classroom,
- inevitably results in a mandate for change, and
- increases pressure for accountability, especially if results are poor.
What did we learn from last year’s Leadership Academy?
Global Positioning System (GPS) Mapping the Way
Student GPS Timeline: write data in each arrow indicating your student's successes and/or challenges at each grade level, from “Start here” (Pre-K) through “Commencement Day!” (12th grade), and estimate the time of arrival. Sample entries: good attendance, start of poor attendance, few credits earned, proficient math scores, reading scores were low, did not attend, student dropped out.
Data Drills
- Dropouts
- Struggling Students
- Walkthroughs
- Perception Surveys
Four Types of Data
- Classroom Data
- ___________________
- ___________________
- ___________________
Findings & Results of Last Year's Leadership Academy
1. Struggling students & dropouts are a K-12 concern.
   - “Intervention Record” piloted in four elementary schools.
   - Pinnacle Analytics set up to red-flag struggling students & potential dropouts.
2. Our data are inconsistent, poorly organized, and not effectively used to guide our instruction.
   - Pinnacle Analytics has been designed to:
     * manage, organize & store our data,
     * visually help us correlate & analyze our data, and
     * help us work smarter, not harder.
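Red-flagging of the kind described above can be pictured with a small rule-based sketch. Everything here is hypothetical: the field names and thresholds are invented for illustration and are not Pinnacle Analytics' actual rules.

```python
def red_flags(student):
    """Return early-warning flags for one student record.
    Thresholds below are illustrative only, not real district policy."""
    flags = []
    # Attendance: the GPS timelines showed poor attendance preceding dropout.
    if student.get("attendance_rate", 1.0) < 0.90:
        flags.append("attendance below 90%")
    # Credits: "few credits earned" was a common dropout marker.
    if student.get("credits_earned", 0) < student.get("credits_expected", 0):
        flags.append("behind on credits")
    # Achievement: low reading scores appeared on the timelines as well.
    if student.get("reading_proficient") is False:
        flags.append("not proficient in reading")
    return flags

at_risk = {"attendance_rate": 0.85, "credits_earned": 3,
           "credits_expected": 6, "reading_proficient": False}
print(red_flags(at_risk))
# ['attendance below 90%', 'behind on credits', 'not proficient in reading']
```

A real system would tune these rules against historical dropout data rather than fixed cutoffs.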
Informing Our Strategic Plan Last year’s Leadership Academy findings were incorporated into our Strategic Plan. This year’s Leadership Academy findings will be as well.
Strategic Plan Goals
Data types: Achievement, Perception, Demographic, Classroom.
Needs and goal(s) for the Strategic Plan: throughout the Leadership Academy, record relevant data along with corresponding needs and goals of our district that you believe should be included in our next revision of the Strategic Plan.
Student Achievement Data
1. What is our current reality?
2. Is it “good enough”?
3. What are we doing to cause this?
4. What will we do to maintain or improve?
Initial 2009-2010 ITBS/ITED Results
[Chart: district FAY score by year, with trend line and trajectory]
[Chart: district FAY score by year, with trend line and trajectory]
1. What is our current reality?
2. Is it “good enough”?
3. What are we doing to cause this?
4. What will we do to maintain or improve?
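A trend line like the one on the charted FAY scores is just a least-squares fit over the yearly results, and the trajectory is that line projected forward. A minimal sketch, using made-up yearly scores (the numbers below are hypothetical, not the district's actual results):

```python
def trend_line(years, scores):
    """Ordinary least-squares slope and intercept for a yearly score trend."""
    n = len(years)
    mean_x = sum(years) / n
    mean_y = sum(scores) / n
    slope = (sum((x - mean_x) * (y - mean_y) for x, y in zip(years, scores))
             / sum((x - mean_x) ** 2 for x in years))
    intercept = mean_y - slope * mean_x
    return slope, intercept

# Hypothetical district scores, 2006-2010
years = [2006, 2007, 2008, 2009, 2010]
scores = [68.0, 69.5, 70.0, 72.5, 73.0]
slope, intercept = trend_line(years, scores)
print(f"gaining about {slope:.2f} points per year")
# Trajectory: project the same line forward.
print(f"projected 2012 score: {slope * 2012 + intercept:.1f}")
```

The slope answers “what is our current reality?” in one number; the projection shows where we land if we change nothing.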
[Chart: % proficient by year, with trajectory]
1. What is our current reality?
2. Is it “good enough”?
3. What are we doing to cause this?
4. What will we do to maintain or improve?
FINAL District AYP Results of 2008-09 Iowa Tests
(Results by subgroup across Elementary Math, Elementary Reading, Middle Math, Middle Reading, High Math, High Reading)
- All: Missed; Met - Safe Harbor; Missed
- Low SES: Missed
- IEP: Missed; Met - Safe Harbor; Missed; Met - Safe Harbor; Missed
- ELL: Missed; Met - Safe Harbor; Missed; Met - Safe Harbor; Missed
- African American: Met - GHS; Missed; Met - GSH; Met - Safe Harbor
- Asian: Missed; Met - Safe Harbor; Met Triennium; Met - Trien; Met; Met Triennium
- Hispanic: Missed; Met - Safe Harbor; Missed
- Native American: Missed; Met - Safe Harbor; Missed
- White: Met; Met - Trien; Missed
1. What is our current reality?
2. Is it “good enough”?
3. What are we doing to cause this?
4. What will we do to maintain or improve?
We are moving to spring ITBS/ITED testing next year. How might this change help our current reality?
Since we will continue to be evaluated by the ITBS/ITED annual “snapshot”, how can we use our transition to spring testing to help us better prepare for the tests?
How can we provide a better picture of learning over a particular school year using the spring ITBS/ITED? How can we better address our dip in transition year scores?
How can we better address the concerns of summer learning loss? Can we avoid having to estimate the effects of one grade level from another, since we would no longer take the tests only three months into the school year?
Moving to Spring ITBS/ITED
1. How might this change help our current reality?
2. How can we use our transition to spring testing to better prepare for the tests?
3. How can we provide a better picture of learning over a particular school year?
4. How can we better address our dip in transition year scores?
5. How can we better address the concerns of summer learning loss?
6. Can we avoid having to estimate the effects of one grade level from another?
Will simply changing to spring testing, with four additional months to prepare, automatically improve our scores? What about norms?
Norms compare a student’s raw score, the number of problems the student got correct, with the scores from other students who took the same test at approximately the same time of the school year.
How much a student “knows” is inferred from their standing or rank within that comparison or norm group.
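That norm-referenced standing can be made concrete: a percentile rank is simply the share of the norm group scoring at or below the student's raw score. A minimal sketch with an invented norm group (the scores below are made up for illustration):

```python
def percentile_rank(raw_score, norm_group_scores):
    """Percent of the norm group scoring at or below raw_score."""
    at_or_below = sum(1 for s in norm_group_scores if s <= raw_score)
    return 100.0 * at_or_below / len(norm_group_scores)

# Hypothetical norm group: 20 raw scores from students who took the
# same test at approximately the same time of the school year.
norm_group = [8, 10, 11, 12, 13, 13, 14, 15, 15, 16,
              17, 17, 18, 19, 20, 21, 22, 24, 26, 28]
print(percentile_rank(17, norm_group))  # 60.0
```

The same raw score yields a different percentile against a different norm group, which is exactly why the fall-versus-spring norming question matters.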
ITBS & ITED: Number of Correct Items Required for Proficiency (3rd Grade, Level 9)

Form  Code  Content                        Fall  Midyear  Spring  # of items
A     RC    Reading Comprehension           15     17       19       37
A     M1    Math Concepts & Estimation      16     17       19       31
A     M2    Math P.S.* & Data Interpret.    10     12       13       22
A     SC    Science                         13     14       15       30
B     RC    Reading Comprehension           17     19       21       37
B     M1    Math Concepts & Estimation      15     16       18       31
B     M2    Math P.S. & Data Interpret.     11     12       14       22
B     SC    Science                         13     14       16       30
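The cut-score table above shows why extra preparation time is no guarantee: the spring cut is higher than the fall cut, so the same raw score can clear one and miss the other. A small sketch encoding the Form A Reading Comprehension row from the table:

```python
# Form A, 3rd grade (Level 9), Reading Comprehension cut scores
# from the proficiency table: fall 15, midyear 17, spring 19 (of 37 items).
CUTS = {"fall": 15, "midyear": 17, "spring": 19}

def proficient(raw_score, season):
    """A raw score is proficient if it reaches that season's cut score."""
    return raw_score >= CUTS[season]

# 17 correct answers clears the fall cut but misses the spring cut,
# so unchanged performance can look like a drop after moving to spring.
print(proficient(17, "fall"))    # True
print(proficient(17, "spring"))  # False
```

To hold steady on the spring test, this student would need four more correct answers than the fall cut required.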
Simply changing to spring testing with four additional months to prepare will not automatically improve our scores.
We may even experience a dip in scores.
Spring ITBS/ITED Testing: We Are Confident
- Improved curriculum, instruction & assessments
- Increased ownership and motivation
- Improved student achievement and proficiency on the ITBS/ITED
Spring ITBS/ITED Testing
- Beginning with the 2010-11 school year, we will be moving to spring testing.
- Consensus from all stakeholder groups
- Board acknowledgement and support
Preparing for Spring Testing
- Testing window: March 14 – April 1; results will arrive mid-May
- District Assessment Calendar adjustments (ELDA in February, DRA-2 moved to 4th quarter)
- Improve curriculum, instruction & assessment
- Increase formative assessment & differentiation
- Use Pinnacle to analyze and triangulate data throughout the year
- Increase ownership and motivation
- Learn from each other
What other achievement data should we be looking at?