1
PACTA PIL October 18, 2010
2
Agenda
11:15 – 12:45  Overview of the School Improvement Process; Data, Data, Everywhere
12:45 – 1:30   LUNCH
1:30 – 3:00    Data Analysis
3:00 – 3:15    Break
3:15 – 5:00    Root Cause Analysis Procedures
5:00 – 5:45    Sharing & Reporting Out
5:45 – 6:00    Wrap-up and Evaluation
3
What impacts (has an effect on) heart health?
- Family History
- Diet
- Exercise
- Smoking
4
What are indicators of heart health?
- Family History
- Diet – Weight
- Exercise – Resting heart rate
- Cholesterol
- Triglycerides
- Blood pressure
5
What are the impacts and indicators of heart disease? If we are to improve the health of our hearts, we need to be aware of both what impacts heart health and what indicates it.
Another example: STEELERS FOOTBALL!
6
What impacts Student Achievement?
7
What are indicators of Student Achievement?
9
School Improvement Cycle
1. Collect or Gather Data
2. Analyze Data
3. Root Cause Analysis
4. Action Plan Developed
5. Implement Action Plan & Monitor Results
10
Data Informed Decision Making Cycle for __________ IMPROVEMENT
[Cycle diagram with stages: The Hunt for Evidence → A HA! A Theory!! → A Plan! → Now Do It → Did it work? Supporting labels: Data, Data analysis, Strategic Planning, Resources]
Remember: Numbers are our friends
11
Using Data to Improve Learning for All: A Collaborative Inquiry Approach by Nancy Love et al. Added by Shula
12
PDE’s Getting Results
13
Focused and Unfocused Improvement Cycles a la Bernhardt
14
Impacts and Indicators of Student Achievement
16
Types of Data a la Bernhardt – Indicators & Impacts
- Demographics
- School Processes
- Student Learning
- Perceptions
17
What are your Impacts and Indicators? Identify your Impacts and Indicators by their type:
- Blue Dots – Student Learning
- Yellow Dots – Demographics
- Red Dots – School Processes and Programs
- Green Dots – Perceptions
18
Multiple Measures of Student Learning (Indicators)
- Summative Assessments: PSSA, NOCTI, NAEP
- Formative Assessments: Informal teacher observations
- Interim Assessments: 4Sight, Grades
- Diagnostic Assessments: CDT
19
Multiple Measures of Student Learning 4Sight PSSA NOCTI
20
Multiple Measures of Student Learning – over time (Longitudinal Data)
- Analysis of annual performance
- Analysis across the years
- Analysis of cohort groups across the years (8th grade vs. 11th grade) – see the sketch below
- PVAAS
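The cohort idea in the list above can be hard to picture, so here is a minimal sketch, assuming a hypothetical student-level file with made-up IDs and scores (PVAAS provides its own growth reports; this only illustrates the matching logic):

```python
# Hypothetical cohort sketch: compare a class's 8th-grade PSSA results with the same
# students' 11th-grade results three years later. Data and column names are illustrative.
import pandas as pd

grade8 = pd.DataFrame({"student_id": [1, 2, 3, 4],
                       "score_grade8": [1250, 1100, 1400, 1180]})
grade11 = pd.DataFrame({"student_id": [1, 2, 3, 5],   # student 4 left; student 5 moved in
                        "score_grade11": [1320, 1150, 1380, 1200]})

# Keep only students present in both years so the comparison is a true cohort.
cohort = grade8.merge(grade11, on="student_id", how="inner")
cohort["change"] = cohort["score_grade11"] - cohort["score_grade8"]
print(cohort)
print("Mean change for the matched cohort:", cohort["change"].mean())
```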
21
What are YOUR measures of Student Learning Data? (summative, formative, interim, diagnostic) PSSA NOCTI Grades
22
Multiple Measures of Demographics (Impacts)
Typical Data:
- Ethnicity
- IEP
- Economically Disadvantaged
- Gender
- Mobility
- Enrollment
- Attendance
- Teacher Demographics?
23
Demographics to Disaggregate
Disaggregation is not a problem-solving strategy… it's a problem-finding strategy.
24
Student Learning AND Demographics
- Are all students performing at the same level? IEP students? LEP students? Economically disadvantaged students?
- Is the achievement gap (between high- and low-poverty students) decreasing or increasing?
- Do students who attend school every day get better grades?
- Are achievement levels higher for those students who stay in a school building for two or more years?
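One hedged way to turn two of these questions into numbers, assuming a hypothetical student-level export with attendance, GPA, and an economically-disadvantaged flag (all values made up):

```python
# Hypothetical sketch for two questions above: attendance vs. grades, and the
# economically disadvantaged achievement gap. Columns and values are illustrative only.
import pandas as pd

records = pd.DataFrame({
    "attendance_rate": [0.99, 0.97, 0.92, 0.88, 0.95, 0.80],  # share of days attended
    "gpa":             [3.6,  3.4,  2.9,  2.5,  3.2,  2.1],
    "econ_disadv":     [False, True, False, True, False, True],
})

# Do students who attend school (nearly) every day get better grades?
high_attendance = records["attendance_rate"] >= 0.95
print(records.groupby(high_attendance)["gpa"].mean().round(2))

# Is there an achievement gap by economic status? (Track this by year to see if it is closing.)
print(records.groupby("econ_disadv")["gpa"].mean().round(2))
```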
25
What are your demographic measures?
- Students
- Community
- Teachers
26
Multiple Measures of School Processes (Programs) – Impacts and Indicators
Typical Data:
- Description of school programs and processes
- How are students identified for programs and services?
27
Student Learning AND Demographics AND School Processes Are there differences in achievement scores (or in rates of progress) for 11th grade females and males by the type of career program in which they are enrolled?
28
What are your programs or procedures/processes?
- Tutoring
- Title I
- Grading policy
- Enrollment into a CTC
- Part-time CTC transportation issues
29
Multiple Measures of Perceptions (Impacts)
Typical Data:
- Perceptions of Learning Environment
- School Climate
- Values and Beliefs
- Observations
30
Student Learning AND Demographics AND Perceptions Do students of different ethnicities perceive the learning environment differently, and do they score differently on standardized achievement tests consistent with these perceptions?
31
What are your measures of perception?
- Teachers'
- Students'
- Parents'
- Sending Districts'
32
Identify your Impacts and Indicators by their type:
- Blue Dots – Student Learning
- Yellow Dots – Demographics
- Red Dots – School Processes and Programs
- Green Dots – Perceptions
What's missing? What additional data should be collected? Examined? Considered?
33
Now that the data is gathered (or it is on the to-be-gathered list), it's time to analyze the data. Remember… "Numbers are our friends."
34
Data Analysis – "Gathering" your PSSA data using the Feeder Report from eMetric
- What percent of 11th graders (in 2010) scored Below Basic, Basic, Proficient, and Advanced in Reading? In Math? (These are this year's 12th graders.)
- What about the class of 2010 (PSSA grade 11 in 2009)? The class of 2009 (PSSA grade 11 in 2008)?
- Examine the three-year trend of 11th grade performance in reading and math. Observations – just the facts! Are more students reaching proficiency? Are fewer students below basic?
- Repeat the above looking at:
  - Current 9th graders (8th graders in 2010)
  - Current 10th graders' 8th grade PSSA scores (from 2009)
  - Current 11th graders' 8th grade PSSA scores (from 2008)
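A minimal sketch of the trend step, assuming a hypothetical tabulation of the eMetric Feeder Report (counts and column names are made up; only the percent-by-year arithmetic is the point):

```python
# Hypothetical three-year trend of grade-11 reading performance levels.
# Counts and column names are illustrative, not actual Feeder Report fields.
import pandas as pd

scores = pd.DataFrame({
    "year":       [2008]*4 + [2009]*4 + [2010]*4,
    "perf_level": ["Below Basic", "Basic", "Proficient", "Advanced"] * 3,
    "students":   [40, 60, 80, 20,   35, 55, 90, 25,   30, 50, 95, 30],
})

# Percent of tested students at each performance level, by year.
totals = scores.groupby("year")["students"].transform("sum")
scores["percent"] = (scores["students"] / totals * 100).round(1)

trend = scores.pivot(index="perf_level", columns="year", values="percent")
print(trend)  # Just the facts: is Proficient + Advanced rising? Is Below Basic falling?
```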
35
Data Analysis – Disaggregation is a problem-finding strategy!
11th Grade: by Program, by Gender, by Sending District, by Reporting Categories
8th Grade: by Program, by Gender, by Sending District, by Reporting Categories
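A hedged sketch of the disaggregation step, with hypothetical columns for program, gender, and sending district (real exports will use different field names):

```python
# Hypothetical disaggregation: percent proficient broken out by each dimension.
# Rows, columns, and values are illustrative only.
import pandas as pd

students = pd.DataFrame({
    "program":          ["Welding", "Welding", "Culinary", "Culinary", "Health", "Health"],
    "gender":           ["M", "F", "F", "M", "F", "M"],
    "sending_district": ["District A", "District B", "District A", "District B", "District A", "District B"],
    "proficient":       [True, False, True, True, False, True],
})

# Large differences between groups are problems found, not problems solved.
for dimension in ["program", "gender", "sending_district"]:
    rates = students.groupby(dimension)["proficient"].mean().mul(100).round(1)
    print(f"\nPercent proficient by {dimension}:")
    print(rates)
```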
36
“Root Cause Analysis”
37
Rule #1 – no blaming others The Blame Poem
38
Observation and Reflection
What are you seeing? JUST THE FACTS!
- More females are proficient than males.
- Over the past three years, the percent of students reaching proficiency has increased.
- The percent of students below basic has remained constant over three years.
What are you thinking about the results? What's 'causing' these results?
- Students don't arrive at 'my grade level' as prepared as they should be.
- Support programs are lacking.
39
Root Cause Analysis (Paul Preuss)
Definition – the deepest underlying cause, or causes, of positive or negative symptoms within any process that, if dissolved, would result in elimination, or substantial reduction, of the symptom.
- Root cause analysis eliminates patching and wasted effort.
- Root cause analysis conserves scarce resources.
- Root cause analysis induces discussion and reflection.
40
How do you know you've 'found' the root cause?
- You run into a dead end asking what caused the proposed root cause.
- Everyone agrees that this is a root cause.
- The cause is logical, makes sense, and provides clarity to the problem.
- The cause is something that you can influence and control.
- If the cause is dissolved, there is realistic hope that the problem can be reduced or prevented in the future.
41
“School improvement teams and others using root cause analysis often wonder when to stop seeking cause and make the decision that sufficient data and effort have been used to arrive at a reasonable root. This is often a judgment call that will improve with experience. Often, the lack of data and the pressures of time frustrate the effort and force it to halt at a level below the surface symptom, but perhaps not as deep as it must ultimately go.” (Preuss 2003)
42
Root Cause Analysis – prerequisites
- Key Indicators of Student Success
- Measures of each indicator
- Desired Ideal Condition of the indicator (e.g., 56% proficient or better)
- Gap between the desired ideal condition and the present condition
- Is this gap a priority issue?
- Goal statement
- Search for Root Cause
- Possible strategies for improvement
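As a small worked example of the gap prerequisite, assuming one made-up present condition against the 56% ideal named above:

```python
# Hypothetical gap between the desired ideal condition and the present condition
# for one indicator (percent proficient or better). Current value is made up.
desired_condition = 56.0   # desired ideal condition from the slide example
current_condition = 41.5   # present condition from the most recent data analysis (illustrative)

gap = desired_condition - current_condition
print(f"Gap to close: {gap:.1f} percentage points")

# Simple priority screen: large gaps on key indicators are candidates for root cause analysis.
if gap >= 10:
    print("Treat as a priority issue: write a goal statement, then search for root cause.")
```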
43
Root Cause Processes
- Questioning the Data
- The Diagnostic Tree
- The Five Whys
- Force Field Analysis
Throughout each process, reflect back on your list of impacts and indicators.
44
Questioning the Data
1. "What do you see?"
2. "What questions do you have about what you see?"

Questioning the Data a la Dr. Shula:
1. What do you see? – JUST THE FACTS
2. What are you thinking/feeling/believing about what you see?
3. What other data or data analysis might shed more light on the issue?
45
The Diagnostic Tree
- The "Red Flag" event or priority issue
- Location Level
- Hypotheses Level
Example:
- Red Flag – PSSA Math scores are below the AYP target
- Location – incoming 9th graders from X Middle Schools
- Hypotheses – Is this related to Student Demographics? Curriculum? Instruction? System Processes? Organizational Culture? External Factors?
[Tree diagram: the Red Flag branches into Reading (Hyp 1, Hyp 2) and Math (Hyp 3)]
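If a team wants to keep its diagnostic tree in a shareable form, one possible sketch is a small nested structure built from the slide's example (the layout is an assumption, not a prescribed format):

```python
# Illustrative data structure for a diagnostic tree: red flag -> location level -> hypotheses level.
diagnostic_tree = {
    "red_flag": "PSSA Math scores are below the AYP target",
    "locations": [
        {
            "location": "Incoming 9th graders from X Middle Schools",
            "hypotheses": [
                "Student Demographics",
                "Curriculum",
                "Instruction",
                "System Processes",
                "Organizational Culture",
                "External Factors",
            ],
        },
    ],
}

# Reporting out: every hypothesis becomes a question to test against data.
for loc in diagnostic_tree["locations"]:
    print(diagnostic_tree["red_flag"], "->", loc["location"])
    for hypothesis in loc["hypotheses"]:
        print("   Hypothesis to test:", hypothesis)
```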
46
The Five Whys
Team: Why do we have so many class tardies?
Students: Because we do not have enough time.
Team: Why don't you have enough time to get from one class to another?
Students: Because 4 minutes isn't enough time to get from one end of the building to the other and stop at a locker or restroom.
Team: Why only 4 minutes?
Principal: Because we wanted to reduce the time that students were in the halls.
Team: Why did we want to reduce the hall time?
Principal: Because we wanted to reduce disciplinary problems.
Team: Why did we want to reduce disciplinary problems?
Principal: We wanted to improve school safety and climate.
47
Force Field Analysis – Driving Forces and Restraining Forces
- Driving Forces apply pressure to move in a direction of change.
- Restraining Forces apply pressure to remain in place.
- Either the driving forces have to be increased or the restraining forces have to be decreased.
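A toy tally of the last point, with made-up forces and strengths (force field analysis is normally done on chart paper; this only illustrates the arithmetic of increasing drivers or decreasing restraints):

```python
# Hypothetical force field tally: change happens only when driving pressure
# outweighs restraining pressure. Forces and strengths (1-5) are illustrative.
driving = {"Administrator support": 3, "New assessment data": 2}
restraining = {"Schedule constraints": 4, "Staff turnover": 2}

net = sum(driving.values()) - sum(restraining.values())
print("Net pressure toward change:", net)  # <= 0 means the restraining forces currently win
```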
49
The Final Report – PACTA PIL Program
- Brief introduction
- Three-year analysis of reading and math scores
  - Strengths
  - Deficiencies
- Root Causes for each CTE Program
- Action Plans to address the Root Cause
- Timeline for Implementing & Monitoring
50
New insights? Additional data/information to be gathered and examined? New theories? Next steps