
Consider the Evidence Evidence-driven decision making




1 Consider the Evidence Evidence-driven decision making
for secondary schools A resource to assist schools to review their use of data and other evidence 6 Getting to Information

2 Evidence-driven decision making
This module is part of a resource about how we use data and other evidence to improve teaching, learning and student achievement Today we are looking at the second stage of this process – collecting and analysing evidence, and getting to information we can use This session will help us think about analysing ‘data and other evidence’ and how we consider the information we get from analysis. We can apply today’s material to student achievement at all secondary levels - not just senior – and to all curriculum learning areas and school processes. This material is part of a resource explaining evidence-driven decision making. What is evidence-driven decision making? This resource does not justify the use of data analysis in schools – the assumption is that teachers and schools already understand the benefits and to some extent already do it. This resource does not provide data processing tools or directions on how to analyse data. These are readily available.

3 Evidence Any facts, circumstances or perceptions that can be used as an input for an analysis or decision how classes are compiled, how classes are allocated to teachers, test results, teachers’ observations, attendance data, portfolios of work, student opinions … Data are one form of evidence It might be worth recapping what we mean by ‘evidence’ – you could omit this slide if the idea is well understood. Like all schools, we have access to a lot of ‘data’ about student achievement and student behaviour – test results, attendance patterns, etc. But we have access to a lot more information than what is normally thought of as ‘data’. In this session we want to be aware of all the ‘evidence’ we have access to. Some of this evidence is ‘data’ - but some (like student opinions, teachers’ observations) can’t be easily processed in the way we process ‘data’ – so it’s best called ‘evidence’. If participants have concerns about the use of jargon, here’s a way to discuss the issue: Whenever people come to grips with new ideas, they might have to learn new terms or give special meaning to existing words. This happened with curriculum and assessment developments – but most teachers (and parents) are now familiar with terms and concepts like strands, levels and credits. The language of computing is another good example.

4 Data Known facts or measurements, probably expressed in some systematic or symbolic way (eg as numbers) assessment results, gender, attendance, ethnicity … Data are one form of evidence It might be worth recapping what we mean by ‘evidence’ – you could omit this slide if the idea is well understood. This resource treats the word ‘data’ as a plural noun – hence ‘data are …’. There’s nothing new here - but for some of you, this will be quite a narrow definition of data. We could have a discussion about what constitutes data – for example, do all data have to be coded in some way (eg as a number)? But I suggest we simply accept this distinction for the purposes of this session. The main point is: If we want to improve student achievement, we can look at a lot more than what we traditionally think of as data.

5 What evidence does a school have?
Demographics Student achievement Perceptions School processes Other practice It might be worth recapping what we mean by ‘evidence’ – you could omit this slide if the idea is well understood. This is an introductory slide – the next five slides deal with each category in turn. All schools have data about student achievement. To make the most of these data, we need to be aware of many other factors - evidence that describes our students’ wider learning environment. We have so much evidence that it’s useful to categorise it in some way. The approach we are taking today separates all data and other evidence into these five categories. If you have already done an exercise that uses data, you could use these categories to discuss what sort of evidence you used in the exercise – and what other evidence could be used to extend the example.

6 The evidence-driven decision making cycle
Explore Check data and evidence to explore the issue Question Clarify the issue and ask a question Assemble Decide what data and evidence might be useful Analyse data and evidence Trigger Data indicate a possible issue that could impact on student achievement Evaluate the impact of the intervention Intervene Plan an action aimed at improving student achievement Interpret Insights that answer your question Speculate A teacher has a hunch about a problem or a possible action Act Carry out the intervention Reflect on what has been learned, how practice will change This is the full evidence-driven decision making cycle. Today we are looking at the boxes labelled Assemble, Analyse and Interpret. If you used Presentation 5 - Getting Started, you will have covered the first three stages.

7 The evidence-driven decision making cycle
Explore data Survey of students shows that this is only partially true Question What are the characteristics of students who are poor at writing? Assemble more data & other evidence: asTTle reading, homework, extracurric, attendance, etc Analyse NQF/NCEA results by standard Trigger Significant numbers not achieving well in writing Analyse non NQF/NCEA data and evidence Intervene Create multiple opportunities for writing; include topics that can use sport as context; connect speaking and writing. PD for staff. Interpret information Poor writers likely to play sport, speak well, read less, do little HW A teacher has a hunch - poor writers might spend little time on homework Evaluate Has writing improved? Reflect How will we teach writing in the future? This is the evidence-driven decision making model outlined in the resource. It looks like a standard improvement cycle, but the approaches recommended within the separate stages make it especially useful for use in schools.

8 Evidence-driven decision making Getting to information
Trigger Clues found in data, hunches Explore Is there really an issue? Question What do you want to know? Assemble Get all useful evidence together Analyse Process data and other evidence Interpret What information do you have? Intervene Design and carry out action Evaluate What was the impact? Reflect What will we change? This slide lists the stages in the cycle and gives simple explanations. You might prefer to print handout copies of the text below or the cycle diagram in a later slide. The cycle has these sequential stages – today we are going to look at the three middle stages: Assemble Get together all the data and evidence you might need - some will already exist and some will have to be generated for the occasion. Analyse Process sets of data and relate them to other evidence. You are looking for trends and results that will answer your questions (but watch out for unexpected results that might suggest a new question). Interpret Think about the results of the analysis and clarify the knowledge and insights you think you have gained. Interrogate the information. It’s important to look at the information critically. Were the data valid and reliable enough to lead you to firm conclusions? Do the results really mean what they seem to mean? How sure are you about the outcome? What aspects of the information lead to possible action?

9 The evidence-driven decision making cycle
Trigger Clues found in data, hunches Explore Is there really an issue? Question What do you want to know? > Assemble Get all useful evidence together Analyse Process data and other evidence Interpret What information do you have? Intervene Design and carry out action Evaluate What was the impact? Reflect What will we change? Once we have a very good question, we need to consider what data and other evidence will help us answer it.

10 Assembling the evidence
We want to know if our senior students are doing better in one area of NCEA Biology than another. So … we need NCEA results for our cohort. It could be that all Biology students do better in this area than others. So … we also need data about national differences across the two areas. Often the data we need will be obvious. But we need to make sure that we have looked at all angles, to ensure we have all the data we need to draw valid conclusions.

11 Are our data any good? A school found that a set of asTTle scores indicated that almost all students were achieving at lower levels than earlier in the year. Then they discovered that the first test had been conducted in the morning, but the later test was in the afternoon and soon after the students had sat a two-hour exam. We should always think critically about the available data and other evidence before we decide to analyse it. If we find the results of a test a bit surprising, we should look closely at the test itself - was it set at an appropriate level for that group of students? Ask questions about how tests were administered. This might seem an unlikely scenario, but it apparently happened.

12 Think critically about data
Was the assessment that created these data assessing exactly what we are looking for? Was the assessment set at an appropriate level for this group of students? Was the assessment properly administered? Are we comparing data for matched groups? These questions are really about validity and reliability. We can make commonsense judgements about whether data are valid for our purposes and whether they were created under conditions we can rely on.

13 Cautionary tale 1 You want to look at changes in a cohort’s asTTle writing levels over 12 months. Was the assessment conducted at the same time both years? Was it administered under the same conditions? Has there been high turnover in the cohort? If so, will it be valid to compare results? We should always ask if there are data-related factors that might have a bearing on the issue in question.

14 Cautionary tale 2 You have data that show two classes have comparable mathematics ability. But end-of-year assessments show one class achieved far better than the other. What could have caused this? Were the original data flawed? How did teaching methods differ? Was the timetable a factor? Did you survey student views? Are the classes comparable in terms of attendance, etc? We need to take care when we think we have matched groups.

15 The evidence-driven decision making cycle
Trigger Clues found in data, hunches Explore Is there really an issue? Question What do you want to know? Assemble Get all useful evidence together > Analyse Process data and other evidence Interpret What information do you have? Intervene Design and carry out action Evaluate What was the impact? Reflect What will we change? We now move on to what we do with data and other evidence – what we normally think of as ‘data analysis’.

16 Analysing data and other evidence
Schools need some staff members who are responsible for leading data analysis Schools have access to electronic tools to process data into graphs and tables All teachers do data analysis Data analysis is not an end in itself - it’s one of the many stages along the way to evidence-driven decision making The intent here is to stress that data analysis is just one step in a process (not even the most crucial one) and that every teacher carries out some data analysis. All schools have access to tools that will generate, for example, gender and ethnicity comparisons, intra-subject analyses, comparisons with national results.

17 Types of analysis We can compare achievement data by subject or across subjects for an individual student groups of students whole cohorts The type of analysis we use depends on the question we want to answer The following slides explain three of the most common types of analysis used in schools: Inter-subject analysis Intra-subject analysis Longitudinal analysis The aim here is to make a start on thinking about what you can do with data. It’s not a theory lesson. Teachers will be familiar with these three types of data analysis – we introduce them through discussing questions because asking the right questions is a major theme of this resource. These examples use only student achievement data in order to highlight the different approaches. The way we analyse data depends on the question we are trying to answer … let’s look at three examples.

18 Inter-subject analysis
Have my students not achieved a particular History standard because they have poor formal writing skills, rather than poor history knowledge? We can explore this question by cross-referencing the History results of individual students with their formal writing results in English. If the trend is for your students to do less well in formal writing than in other aspects of English and/or other aspects of History, the answer could be Yes.
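As a rough sketch of that cross-referencing step - the student names, result labels and helper function below are all invented for illustration, not taken from any real dataset:

```python
# Hypothetical per-student results: History standard outcome and
# formal writing level in English. All names and values are invented.
history = {"Aroha": "Not Achieved", "Ben": "Achieved",
           "Caitlin": "Not Achieved", "Dev": "Merit"}
writing = {"Aroha": "Below", "Ben": "At",
           "Caitlin": "Below", "Dev": "Above"}

def poor_writers_failing_history(history, writing):
    """Students who missed the History standard AND sit below
    expectation in formal writing - candidates for the 'Yes' answer."""
    return sorted(student for student, result in history.items()
                  if result == "Not Achieved"
                  and writing.get(student) == "Below")

print(poor_writers_failing_history(history, writing))  # -> ['Aroha', 'Caitlin']
```

If nearly every student who missed the standard also writes below expectation, the hunch is worth pursuing; if the two lists barely overlap, the answer is probably No.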

19 Intra-subject analysis
What are the areas of strength and weakness in my own teaching of this class? We could compare the externally assessed NCEA results of our students with results from appropriately matched schools using published national data. Where these differences are greater or less than average, areas of strength/weakness may exist. For example, we might find that, on average, your students have gained credits at a rate 5 percentage points better than the comparison schools. But in one standard the difference is 15 points, indicating a possible area of strength. In another standard, there is zero difference, indicating a possible area of weakness. This may be a good time to discuss NCEA data. It has been described as ‘both rich and subtle’. Your school’s NCEA data give you access to a huge amount of fine-grained information. You can also aggregate NCEA data to show trends, etc – but you need to be careful that in aggregating the data you don’t lose the subtlety or even produce misleading information.
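A minimal sketch of that comparison, assuming we already have credit-gain rates (%) per externally assessed standard for our class and for the matched comparison schools. The standard numbers, the rates and the 5-point margin are all invented for illustration:

```python
# Hypothetical credit-gain rates (%) per standard; all numbers invented.
ours = {"90191": 78.0, "90192": 88.0, "90193": 73.0}
comparison = {"90191": 73.0, "90192": 73.0, "90193": 73.0}

# Percentage-point difference per standard, and the average gap overall.
diffs = {std: ours[std] - comparison[std] for std in ours}
average_gap = sum(diffs.values()) / len(diffs)

# Flag standards well above / below the average gap. The 5-point
# margin is arbitrary - a judgement call, not a statistical test.
strengths = [s for s, d in diffs.items() if d >= average_gap + 5]
weaknesses = [s for s, d in diffs.items() if d <= average_gap - 5]

print(strengths, weaknesses)  # -> ['90192'] ['90193']
```

This mirrors the slide's example: a 15-point gap against a roughly 5-point average gap flags a possible strength, and a zero gap flags a possible weakness.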

20 Longitudinal analysis
Are we producing better results over time in year 11 biology? We can compare NCEA Biology results for successive year 11 cohorts at this school with the national cohorts for successive years. But the results for the national cohorts might be improving too. So we need to work out how the school’s cohort differs from the national cohort in each year. If the school’s rate of change is better than the changes for the national cohort, the answer could be Yes. To be sure that our teaching is producing this improvement, we should look at the overall levels of achievement of each cohort – otherwise, any improvement could be a result of more able students, rather than better teaching. You could discuss longitudinal analysis of student performance. That would require comparable assessments over successive years – NCEA results at successive levels probably would not suffice for this.
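The rate-of-change idea above can be sketched as follows. The years and pass rates are invented, and a real comparison would also need the caveat about cohort ability noted in the notes:

```python
# Hypothetical Year 11 Biology pass rates (%) for successive cohorts;
# all figures invented for illustration.
school = {2004: 61.0, 2005: 64.5, 2006: 68.0}
national = {2004: 63.0, 2005: 64.0, 2006: 65.5}

def annual_changes(series):
    """Year-on-year change in pass rate, oldest interval first."""
    years = sorted(series)
    return [series[b] - series[a] for a, b in zip(years, years[1:])]

school_delta = annual_changes(school)      # [3.5, 3.5]
national_delta = annual_changes(national)  # [1.0, 1.5]

# If the school's rate of change beats the national cohort's in every
# interval, the answer to the question could be Yes.
outpacing = all(s > n for s, n in zip(school_delta, national_delta))
print(outpacing)  # -> True
```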

21 Basic analysis All teachers do some data analysis.
This set of results for year 12 students in one subject can be ‘cut’ in many ways, even without access to statistical tools. Suggest some ways we could analyse these data without using statistical tools. Some ways to cut this data are suggested in the next slide.

22 Basic analysis Divide the class into three groups on the basis of overall achievement Identify students who are doing so well at level 2 that they could be working at a higher level Find trends for males and females, those who are absent often, or have many detentions Compare this group’s external assessment success rate with the national cohort. Some ways to cut the data from the previous slide.
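The first of those cuts - splitting the class into three groups by overall achievement - needs nothing more than a sort. The class records below are invented for illustration:

```python
# Invented Year 12 records: total credits gained plus a couple of the
# other variables mentioned above (gender, absences).
students = [
    {"name": "A", "credits": 72, "gender": "F", "absences": 3},
    {"name": "B", "credits": 55, "gender": "M", "absences": 12},
    {"name": "C", "credits": 90, "gender": "F", "absences": 1},
    {"name": "D", "credits": 40, "gender": "M", "absences": 20},
    {"name": "E", "credits": 63, "gender": "F", "absences": 7},
    {"name": "F", "credits": 81, "gender": "M", "absences": 2},
]

def achievement_groups(students):
    """Split the class into top, middle and bottom thirds by credits."""
    ranked = sorted(students, key=lambda s: s["credits"], reverse=True)
    n = len(ranked)
    return ranked[:n // 3], ranked[n // 3:2 * n // 3], ranked[2 * n // 3:]

top, middle, bottom = achievement_groups(students)
print([s["name"] for s in top])  # -> ['C', 'F']
```

The same records support the other cuts - filtering on absences, or comparing results by gender - still without any statistical tooling.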

23 Reading levels – terms 1 and 4
This table is a simple example of an application of an analysis tool that any teacher could use - it shows how reading levels have changed within the school year. Any teacher could interpret the table. But (as we will see later) we need to be careful about the conclusions we reach. This example uses asTTle results – level 5B is higher than level 4B. It should not be a problem if teachers are not familiar with asTTle.

24 Making sense of the results
Think about significance and confidence How significant are any apparent trends? How much confidence can we have in the information? Even if we don’t fully understand how the analysis was carried out, we should still think critically about the results. Ask the person who did the analysis what has been done, what they think the results show, what limitations they would place on the results. Significance and confidence are key issues.

25 Making sense of the results
This table shows that reading levels overall were higher in term 4 than in term 1. Scores improved for most students. 20% of students moved into Level 5. But the median score is still 4A. You might need to flick back to the previous slide to discuss this slide. This is useful summative information for reporting against targets and as general feedback to teachers. But it’s not information we could act on. In fact, is it actually information? In a sense, this is still data - a graphical representation of two sets of related data. Is this information? Can we act on it?

26 The evidence-driven decision making cycle
Trigger Clues found in data, hunches Explore Is there really an issue? Question What do you want to know? Assemble Get all useful evidence together Analyse Process data and other evidence > Interpret What information do we have? Intervene Design and carry out action Evaluate What was the impact? Reflect What will we change? The analysis has been completed – but what information do we have? What do we mean by ‘information’?

27 Making sense of information
Data becomes information when it is categorised, analysed, summarised and placed in context. Information therefore is data endowed with relevance and purpose. Information is developed into knowledge when it is used to make comparisons, assess consequences, establish connections and engage in dialogue. Knowledge … can be seen as information that comes laden with experience, judgment, intuition and values. Empson (1999) cited in Mason (2003)

28 Information Knowledge gained from analysing data and making meaning from evidence Information is knowledge (or understanding) that can inform your decisions. How certain you will be about this knowledge depends on a number of factors: where your data came from, how reliable they were, how rigorous your analysis was. So the information you get from analysing data could be a conclusion, a trend, a possibility. This is a slide from the Terminology section in the Appendix. You need to decide whether this is worth discussing. Let’s think about what counts as information.

29 Information Summative information is useful for reporting against targets and as general feedback to teachers. Formative information is information we can act on – it informs decision-making that can improve learning. This slide looks at the same distinctions made earlier about evaluation (slide 32) and questions.

30 Reading levels – terms 1 and 4
To get formative information from the asTTle graph to inform decision-making we’d need to go further. We need to ask more questions. We’d need to disaggregate the data. Then we’ll have information we can act on. (In fact, this graph has been created by aggregating separate asTTle results - so you’d probably go back to the original data and aggregate them in different ways.)

31 Questions to elicit information
Did the more able students make significant progress, but not the lower quartile? How have the scores of individual students changed? How many remain on the same level? How much have our teaching approaches contributed to this result? How much of this shift in scores is due to students’ predictable progress? Are there any data that will enable us to compare our students with a national cohort? How does this shift compare with previous Year 9 cohorts? This slide refers to the asTTle graph shown earlier. To get information out of our data analysis, we need to ask penetrating questions.

32 Words, words, words … Information can … establish, indicate, confirm, reinforce, back up, stress, highlight, state, imply, suggest, hint at, cast doubt on, refute … Does this confirm that …? What does this suggest? What are the implications of …? How confident are we about this conclusion? It’s worth thinking about precisely how we express information, especially in relation to the question that triggered the analysis. The verbs we choose reflect the confidence we have in the information. You can even see a hierarchy. When we interrogate the information we use even more questions. The answers will often come from our professional judgement.

33 Interrogate the information
Is this the sort of result we envisaged? If not, why? How does this information compare with the results of other research or the experiences of other schools? Are there other variables that could account for this result? Should we set this information alongside other data or evidence to give us richer information? What new questions arise from this information? There are two slides showing questions we could ask of the information. Then there is a slide containing an activity. You might prefer to do the activity first. We need to interrogate the information generated by analysing data and evidence. First, we think about how confident we are about the result, by asking questions like this.

34 Interrogate the information
Does this relate to student achievement - or does it actually tell us something about our teaching practices? Does this information suggest that the school’s strategic goals and targets are realistic and achievable? If not, how should they change, or should we change? Does the information suggest we need to modify programmes or design different programmes? Does the information suggest changes need to be made to school systems? There are two slides showing questions we could ask of the information. Then there is a slide containing an activity. You might prefer to do the activity first. We should ask questions like this.

35 Interrogate the information
What effect is the new 6-day x 50-min period structure having on student engagement levels? Imagine we asked this question. When we see the information resulting from data analysis, what questions might we ask? This is an activity to demonstrate the sort of questioning covered in the two previous slides. Some suggestions are offered in the next slide. You could use just the next slide and omit the activity.

36 Interrogate the information
What effect is the new 6-day x 50-min period structure having on student engagement levels? Do student views align with staff views? Do positive effects outweigh negative effects? Is there justification for reviewing the policy? Does the information imply changes need to be made to teaching practices or techniques? Does the information offer any hint about what sort of changes might work? Some suggestions for the activity in the previous slide.

37 The evidence-driven decision making cycle
Trigger Clues found in data, hunches Explore Is there really an issue? Question What do you want to know? Assemble Get all useful evidence together Analyse Process data and other evidence Interpret What information do you have? Intervene Design and carry out action Evaluate What was the impact? Reflect What will we change? End of presentation. The next step could be for the group to discuss how the school currently analyses data and other evidence. Beware of collecting data and other evidence that you might not need. If the school thinks ahead about how to make evidence-based decisions, you will know what data and other evidence you should collect. Remember not to make analysis an end in itself – remind the group of where this fits in the full improvement cycle.

