Presentation on theme: "Consider the Evidence Evidence-driven decision making"— Presentation transcript:

1 Consider the Evidence Evidence-driven decision making
for secondary schools
A resource to assist schools to review their use of data and other evidence
8: Collated examples

2 Evidence-driven teaching
I had a hunch that Ana wasn’t doing as well as she could in her research assignments, a major part of the history course.
What made me think this? Ana’s general work (especially her writing) was fine. She made perceptive comments in class, contributed well in groups and had good results overall last year, especially in English.
How did I decide what to do about it? I looked more closely at her other work. I watched her working in the library one day to see if it was her reading, her use of resources, her note taking, her planning, or what. At morning tea I asked one of Ana’s other teachers about Ana’s approach to similar tasks. I asked Ana if she knew why her research results weren’t as good as her other results, and what her plans were for the next assignment. I thought about all of this and planned a course of action. I gave her help with using indexes, searching, note taking and planning and linking the various stages of her research.
The aim here is to demonstrate (and acknowledge) that teachers look for and use a variety of evidence as a normal part of effective and reflective teaching. The conclusions the teacher reaches are not as important here as the investigative approach he uses. Teachers continually consider what they know about students. This story told by Ana’s history teacher is typical. It’s not a story about a formal investigation. It’s just the sort of thing good teachers do all the time. It might have taken place over just a week or two and taken very little of the teacher’s time. This teacher had a ‘hunch’ based on his general professional observations. He informally compared a range of evidence to see if his hunch was correct. It was. He wanted to find a way to improve this one aspect of Ana’s achievement. He considered other evidence and analysed it. This enabled him to pinpoint the problem and plan a course of action designed to improve Ana’s achievement. This teacher was thinking about the data and other evidence he had right there in front of him - and then he acted on his conclusions. The teacher used evidence-driven decision making, using data and other evidence to inform his actions. In this session we want to see how to expand (and systematise) that sort of thinking to drive improvement across the school.

3 What can we do with evidence?
Shane’s story A history HOD wants to see whether history students are performing to their potential. She prints the latest internally assessed NCEA records for history students across all of their subjects. As a group, history students seem to be doing as well in history as they are in other subjects. Then she notices that Shane is doing very well in English and only reasonably well in history. She wonders why, especially as both are language-rich subjects with many similarities. The HOD speaks with the history teacher, who says Shane is attentive, catches on quickly and usually does all work required. He mentions that Shane is regularly late for class, especially on Monday and Thursday. So he often misses important information or takes time to settle in. He has heard there are ‘problems at home’ so has overlooked it, especially as the student is doing reasonably well in history. If you choose to use this scenario, you might like to print paper copies for distribution. It is shown here on two slides. You might prefer to replace Shane’s story with a scenario from your own school. Teachers are engaging every day in decision making that considers data and other evidence. This is a typical scenario - a teacher notices something interesting in a student’s achievement data and wonders if there is an explanation. Continued on next page of notes.

4 Shane’s story (cont...) The HOD looks at the timetable and discovers that history is period 1 on Monday and Thursday. She speaks to Shane’s form teacher who says that she suspects Shane is actually late to school virtually every day. They look at centralised records. Shane has excellent attendance but frequent lateness to period 1 classes. The HOD speaks to the dean who explains that Shane has to take his younger sister to school each morning. The dean had raised the issue with Shane, but Shane said this was helping the household get over a difficult period and claimed he could handle it. The staff involved agree that Shane’s regular lateness is having a demonstrable impact on his achievement, probably beyond history but not so obviously. The dean undertakes to speak to the student, history teacher, and possibly the parents to find a remedy for the situation. Continued from previous page of notes. The next slide suggests that you consider the key factors in this scenario.

5 Shane’s story - keys to success
The history HOD looked at achievement data in English and history. She looked for something significant across the two data sets, not just low achievement. Then she asked a simple question: Why is there such a disparity between these two subjects for that student? She sought information and comments (perceptions evidence and data) from all relevant staff. The school had centralised attendance and punctuality records (demographic data) that the form teacher could access easily. The action was based on all available evidence and designed to achieve a clear aim. A suggested list from the previous slide.

6 Evidence-driven strategic planning
INDICATORS FROM DATA: asTTle scores show a high proportion of Yr 9 achieving below curriculum level; NCEA results show high non-achievement in transactional writing; poor results in other language NCEA standards, etc.
STRATEGIC GOAL: To raise the levels of writing across the school.
Strategic action: Develop a writing development plan which addresses writing across subjects and levels, including targets, professional development and other resourcing needs.
ANNUAL PLAN: Develop and implement a plan to raise levels of writing at Year 9. Development plan to be based on an analysis of all available data and to include a range of shared strategies.
YEAR TARGET: Raise writing asTTle results for Yr 9 boys from 3B to 3A.
EVALUATION DATA: asTTle writing results improve by …; perception data from Yr 9 staff indicates …; evaluation of effectiveness of range of shared strategies, barriers and enablers …
(Related elements in the diagram: Appraisal, PD, Self review, School charter.)
If you choose to use this chart, you might like to print paper copies for distribution. The diagram indicates one way to think of evidence-driven strategic planning. In this model, data from a range of sources provide ‘indicators’ of a problem - one aspect of student achievement (in this case, writing) stands out across the school as one that could be improved. This leads the Board to establish a strategic goal. They then add an appropriate aim and a target (with measurable outcomes) to the school’s annual plan. School leaders would then create a development plan. To do this they will need to go back to the indicator data and analyse these data alongside other data and evidence. Then the development plan is implemented. At the end of the year other data and evidence are analysed to evaluate the success of the development plan as a whole and the various strategies that were used. The data used for evaluation will probably be different from those used to identify the problem and develop the action plan. (In this case, for example, the current NCEA results were not relevant for Year 9 students, and other data was collected to evaluate some of the actions taken.)

7 The evidence-driven decision making cycle
Trigger: Significant numbers not achieving well in writing (from analysing NQF/NCEA results by standard).
A teacher has a hunch - poor writers might spend little time on homework.
Explore data: Survey of students shows that this is only partially true.
Question: What are the characteristics of students who are poor at writing?
Assemble more data & other evidence: asTTle reading, homework, extracurric, attendance, etc.
Analyse non NQF/NCEA data and evidence.
Interpret information: Poor writers likely to play sport, speak well, read less, do little HW.
Intervene: Create multiple opportunities for writing; include topics that can use sport as context; connect speaking and writing. PD for staff.
Evaluate: Has writing improved?
Reflect: How will we teach writing in the future?
A sample scenario for the activity in the previous slide. You might prefer to print handout copies.

8 More purposeful questions
How do year 11 attendance rates compare with other year levels? Do any identifiable groups of year 11 students attend less regularly than average? Is the new 6-day x 50-min period structure having any positive effect on student engagement levels? Is it influencing attendance patterns? What do students say? Should we be concerned about boys’ writing? If so, what action should we be taking to improve the writing of boys in terms of the literacy requirements for NCEA Level 1? The new timing of the lunch break was intended to improve student engagement levels after lunch. Did it achieve this? If so, did improvements in student engagement improve student achievement? Do the benefits outweigh any disadvantages? Some more purposeful questions from the previous slide.

9 Assembling the evidence
We want to know if our senior students are doing better in one area of NCEA biology than another. So … we need NCEA results for our cohort. It could be that all biology students do better in this area than others. So … we also need data about national differences across the two areas. Once we have a very good question, we need to consider what data and other evidence will help us answer it. Often the data we need will be obvious. But we need to make sure that we have looked at all angles, to ensure we have all the data we need to draw valid conclusions.

10 Are our data any good? A school found that a set of asTTle scores indicated that almost all students were achieving at lower levels than earlier in the year. Then they discovered that the first test had been conducted in the morning, but the later test was in the afternoon and soon after the students had sat a two-hour exam. We should always think critically about the available data and other evidence before we decide to analyse it. If we find the results of a test a bit surprising, we should look closely at the test itself - was it set at an appropriate level for that group of students? Ask questions about how tests were administered. This might seem an unlikely scenario, but it apparently happened.

11 Think critically about data
Was the assessment that created this data assessing exactly what we are looking for? Was the assessment set at an appropriate level for this group of students? Was the assessment properly administered? Are we comparing data for matched groups? These questions are really about validity and reliability. We can make commonsense judgements about whether data is valid for our purposes and whether it was created under conditions we can rely on.

12 Cautionary tale 1 You want to look at changes in a cohort’s asTTle writing levels over 12 months. Was the assessment conducted at the same time both years? Was it administered under the same conditions? Has there been high turnover in the cohort? If so, will it be valid to compare results? We should always ask if there are data-related factors that might have a bearing on the issue in question.
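Where the two sets of results are held electronically, the matching step can be made explicit before any comparison is attempted. The sketch below is illustrative only: the file names and the student_id and writing_score columns are assumptions, and the same check can be done by hand or in a spreadsheet.

```python
# Illustrative sketch: compare a cohort's writing results across two years,
# restricted to students who sat both assessments, so cohort turnover does
# not distort the comparison. File and column names are hypothetical, and
# scores are assumed to be stored as numbers.
import pandas as pd

year1 = pd.read_csv("writing_2023.csv")   # assumed columns: student_id, writing_score
year2 = pd.read_csv("writing_2024.csv")   # same columns, one year later

# Keep only students who appear in both data sets
matched = year1.merge(year2, on="student_id", suffixes=("_before", "_after"))

print(f"Matched students: {len(matched)} of {len(year1)} / {len(year2)}")
print("Mean change for matched students:",
      (matched["writing_score_after"] - matched["writing_score_before"]).mean())
```

If only a small fraction of the cohort can be matched across the two years, any conclusion about change over the 12 months should be treated with caution.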

13 What could have caused this?
Cautionary tale 2 You have data that show two classes have comparable mathematics ability. But end-of-year assessments show one class achieved far better than the other. What could have caused this? Was the original data flawed? How did teaching methods differ? Was the timetable a factor? Did you survey student views? Are the classes comparable in terms of attendance, etc? We need to take care when we think we have matched groups.

14 Hunches from raw data This slide is simply to point out that teachers routinely look at raw data and start to get hunches. We return to this table in the Analyse section of the presentation. This table is a set of results for year 12 students in one subject. Most of us would get hunches about trends and issues from a quick read of raw data like this. What trends and issues can we get from just scanning this table? Until careful analysis is done, it’s best to pose these hunches as questions. Some questions are suggested in the next slide.

15 Hunches from raw data Is the class as a whole doing better in internally assessed standards than in externally assessed standards? If so, why? Are the better students (with many Excellence results) not doing as well in external assessments as in internal? If so, why? Is there any relationship between absences and achievement levels? It seems not, but it’s worth analysing the data to be sure. Hunches (expressed as questions) from the previous slide. We return to this table in the Analyse section of the presentation.
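For those who want to go beyond eyeballing the table, hunches like these can be given a first rough check with a short script. This is a sketch under assumptions: the file name, the columns (standard_type, result, absences) and the grade-to-points weighting are all hypothetical, not part of the original resource.

```python
# Illustrative sketch: rough checks of the hunches above. The table layout,
# column names and the grade-to-points mapping are all assumptions.
import pandas as pd

results = pd.read_csv("year12_results.csv")
# assumed columns: student, standard_type ("internal"/"external"),
#                  result ("N", "A", "M", "E"), absences

points = {"N": 0, "A": 2, "M": 3, "E": 4}
results["points"] = results["result"].map(points)

# Hunch 1: does the class do better in internally assessed standards?
print(results.groupby("standard_type")["points"].mean())

# Hunch 3: is there any relationship between absences and achievement?
per_student = results.groupby("student").agg(points=("points", "mean"),
                                             absences=("absences", "first"))
print(per_student["points"].corr(per_student["absences"]))
```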

16 Reading levels – terms 1 and 4
This table is a simple example of an application of an analysis tool that any teacher could use - it shows how reading levels have changed within the school year. Any teacher could interpret the information in the table. But (as we will see later) we need to be careful about the conclusions we reach. This example uses asTTle results – level 5B is higher than level 4B. It should not be a problem if teachers are not familiar with asTTle.

17 Basic analysis All teachers do some data analysis.
This set of results for year 12 students in one subject can be ‘cut’ in many ways, even without access to statistical tools. Suggest some ways we could analyse these data without using statistical tools. Some ways to cut this data are suggested in the next slide.

18 Basic analysis Divide the class into three groups on the basis of overall achievement Identify students who are doing so well at level 2 that they could be working at a higher level Find trends for males and females, those who are absent often, or have many detentions Compare this group’s external assessment success rate with the national cohort. Some ways to cut the data from the previous slide.
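A sketch of how two of these cuts might look if the results table were held as a spreadsheet. The file name and the columns (gender, absences, total_credits) are assumptions; the same cuts can be made with a sort and a highlighter.

```python
# Illustrative sketch: two simple 'cuts' of a year 12 results table.
# File and column names are hypothetical.
import pandas as pd

results = pd.read_csv("year12_results_summary.csv")
# assumed columns: student, gender, absences, detentions, total_credits

# Cut 1: divide the class into three groups on overall achievement
results["band"] = pd.qcut(results["total_credits"], q=3,
                          labels=["lower third", "middle third", "upper third"])
print(results.groupby("band")["total_credits"].agg(["count", "mean"]))

# Cut 2: trends for males and females, and for students who are often absent
print(results.groupby("gender")["total_credits"].mean())
print(results.groupby(results["absences"] > 10)["total_credits"].mean())
```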

19 Making sense of the results
This table shows that reading levels overall were higher in term 4 than in term 1. Scores improved for most students. 20% of students moved into Level 5. But the median score is still 4A. You might need to flick back to the previous slide to discuss this slide. This is useful summative information for reporting against targets and as general feedback to teachers. But it’s not information we could act on. In fact, is it actually information? In a sense, this is still data - a graphical representation of two sets of related data. Is this information? Can we act on it?
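Where the reading results sit in a spreadsheet, summary figures of this kind can be reproduced quickly. A minimal sketch, assuming columns term1_level and term4_level holding asTTle sub-levels such as "4B", "4P", "4A", "5B"; the file name and the list of levels are assumptions.

```python
# Illustrative sketch: summarise movement in reading levels between term 1
# and term 4. File name, column names and the level ordering are assumptions.
import pandas as pd

order = ["3B", "3P", "3A", "4B", "4P", "4A", "5B", "5P", "5A"]
reading = pd.read_csv("reading_levels.csv")   # columns: student, term1_level, term4_level

t1 = pd.Categorical(reading["term1_level"], categories=order, ordered=True)
t4 = pd.Categorical(reading["term4_level"], categories=order, ordered=True)

print("Proportion of students who moved up:", (t4.codes > t1.codes).mean())
print("Proportion now at Level 5:",
      reading["term4_level"].str.startswith("5").mean())
```

As the notes above point out, figures like these are summative: they show where the cohort ended up, but not why, so they are a starting point for further questions rather than something to act on directly.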

20 Interrogate the information
What effect is the new 6-day x 50-min period structure having on student engagement levels? Do student views align with staff views? Do positive effects outweigh negative effects? Is there justification for reviewing the policy? Does the information imply changes need to be made to teaching practices or techniques? Does the information offer any hint about what sort of changes might work? Some suggestions for the activity in the previous slide.

21 Professionals making decisions
You asked what factors are related to poor student performance in formal writing. The analysis suggested that poor homework habits have a significant impact on student writing. You make some professional judgements and decide:
Students who do little homework don’t write enough
You could take action to improve homework habits - but you’ve tried that before and the success rate is low
You have more control over other factors – like how much time you give students to write in class
So you conclude – the real need is to get students to write more often
Teachers decided the poor performance in writing was not due to homework habits but to the total amount of writing students do. Professional judgements lead to the conclusion that the action required is to ensure that students do more writing. NOTE – This resource does not provide advice on writing an action plan – see the Appendix for resources on writing action plans.

22 Evaluate the impact of our action
A school created a new year 13 art programme. In the past students had been offered standard design and painting programmes, internally and externally assessed against the full range of achievement standards. Some students had to produce two folios for assessment and were unsure of where to take their art after leaving school. The new programme blended drawing, design and painting concepts and focused on electronic media. Assessment was against internally assessed standards only. Where we simply change our teaching approach, we’ll compare student achievement data to evaluate the impact of the change. But more radical changes are more difficult to evaluate - a wider range of evaluation approaches will be needed. How could the school evaluate the impact of this change? Suggestions on the next slide.

23 Evaluate the impact of our action
Did students complete more assessments? Did students gain more national assessment credits? How did student perceptions of workload and satisfaction compare with teacher perceptions from the previous year? Did students leave school with clearer intentions about where to go next with their art than the previous cohort? How did teachers and parents feel about the change? Some suggestions from the previous slide. Bullet 2: If the school had decided on the change earlier they could have collected student perceptions evidence from the previous cohort. Bullet 3: For this measure the school was able to collect evidence from the previous cohort.

