1
Consider the Evidence: Evidence-driven decision making for secondary schools
A resource to assist schools to review their use of data and other evidence
2
Evidence-driven decision making
Today we aim to:
- think about how we use data and other evidence to improve teaching, learning and student achievement
- improve our understanding, confidence and capability in using data to improve practice
- discuss how we make decisions
- think about our needs and start to plan our own evidence-based projects

This session will help us identify issues and make decisions by analysing ‘data and other evidence’ – we will see how to do this in a more structured and informed way. In a moment we’ll discuss what is meant by ‘data and other evidence’. We can apply today’s material to student achievement at all secondary levels - not just senior – and to all curriculum learning areas and school processes. First, we need to look at some of the terms being used here – so we are all speaking the same language. First: What is evidence-driven decision making?

This resource does not justify the use of data analysis in schools – the assumption is that teachers and schools already understand the benefits and to some extent already do it. This resource does not provide data processing tools or directions on how to analyse data. These are readily available.

After these sessions, the next step should be for the group to discuss how this model can be applied in this school, department or faculty. This should include a discussion about what evidence already exists in the school, how this is collected and recorded, and how well equipped the school is to analyse and use it in the interests of improving student achievement. Finally, the group should consider what evidence-driven projects the school could undertake. Beware of collecting data and other evidence that you might not need. If the school thinks ahead about how to make evidence-based decisions, you will know what data and other evidence you should collect.
3
Evidence-driven eating
You need to buy lunch. Before you decide what to buy you consider a number of factors:
- how much money do you have?
- what do you feel like eating?
- what will you be having for dinner?
- how far do you need to go to buy food?
- how much time do you have?
- where are you going to eat it?

This is a frivolous, non-educational scenario to start you thinking about how schools use evidence to make decisions. Don’t labour it. With many groups (especially experienced teachers) it will be inappropriate to use it at all. This scenario is just to get us started. There’s nothing mysterious about evidence-driven decision making. We all make decisions every day based on an analysis of a number of factors. In this scenario you’d analyse the factors and make a decision in seconds (or you’d go hungry). What other factors might you consider before buying lunch? For example: Who are you eating with? How much do you want to spend? What did you have for breakfast? How hungry are you? Are you on a special diet? What else do you need to do this lunchtime? Who do you want to avoid this lunchtime?
4
Evidence-driven teaching
I had a hunch that Ana wasn’t doing as well as she could in her research assignments, a major part of the history course. What made me think this? Ana’s general work (especially her writing) was fine. She made perceptive comments in class, contributed well in groups and had good results overall last year, especially in English. How did I decide what to do about it? I looked more closely at her other work. I watched her working in the library one day to see if it was her reading, her use of resources, her note taking, her planning, or what. At morning tea I asked one of Ana’s other teachers about Ana’s approach to similar tasks. I asked Ana if she knew why her research results weren’t as good as her other results, and what her plans were for the next assignment. I thought about all of this and planned a course of action. I gave her help with using indexes, searching, note taking and planning and linking the various stages of her research.

The aim here is to demonstrate (and acknowledge) that teachers look for and use a variety of evidence as a normal part of effective and reflective teaching. The conclusions the teacher reaches are not as important here as the investigative approach he uses. Teachers continually consider what they know about students. This story told by Ana’s history teacher is typical. It’s not a story about a formal investigation. It’s just the sort of thing good teachers do all the time. It might have taken place over just a week or two and taken very little of the teacher’s time. This teacher had a ‘hunch’ based on his general professional observations. He informally compared a range of evidence to see if his hunch was correct. It was. He wanted to find a way to improve this one aspect of Ana’s achievement. He considered other evidence and analysed it. This enabled him to pinpoint the problem and plan a course of action designed to improve Ana’s achievement. This teacher was thinking about the data and other evidence he had right there in front of him - and then he acted on his conclusions. The teacher used evidence-driven decision making, using data and other evidence to inform his actions. In this session we want to see how to expand (and systematise) that sort of thinking to drive improvement across the school.
5
What is meant by ‘data and other evidence’?
Consider the Evidence A resource to assist schools to review their use of data and other evidence What is meant by ‘data and other evidence’? This is an introductory slide – the next two slides discuss and define ‘evidence’ and ‘data’. This resource uses terms commonly used in schools - the usage and meaning intended is that generally applied in educational circles. A full discussion of terminology used in this resource is in the Appendix. The resource we are using is called ‘Consider the Evidence’ – it deals with ‘evidence-driven decision making’. So we need to think about what constitutes ‘evidence’ in schools. It’s useful to think in terms of ‘data and other evidence’.
6
Evidence: any facts, circumstances or perceptions that can be used as an input for an analysis or decision – how classes are compiled, how classes are allocated to teachers, test results, teachers’ observations, attendance data, portfolios of work, student opinions … Data are one form of evidence.

‘Evidence’ is used here in the same way that it’s used in courts of law and in standards-based assessment. Like all schools, we have access to a lot of ‘data’ about student achievement and student behaviour – test results, attendance patterns, etc. But we have access to a lot more information than what is normally thought of as ‘data’. In this session we want to be aware of all the ‘evidence’ we have access to. Some of this evidence is ‘data’ - but some (like student opinions, teachers’ observations) can’t be easily processed in the way we process ‘data’ – so it’s best called ‘evidence’. If participants have concerns about the use of jargon, here’s a way to discuss the issue: Whenever people come to grips with new ideas, they might have to learn new terms or give special meaning to existing words. This happened with curriculum and assessment developments – but most teachers (and parents) are now familiar with terms and concepts like strands, levels and credits. The language of computing is another good example.
7
Data: known facts or measurements, probably expressed in some systematic or symbolic way (e.g. as numbers) – assessment results, gender, attendance, ethnicity … Data are one form of evidence.

This resource treats the word ‘data’ as a plural noun – hence ‘data are …’. There’s nothing new here - but for some of you, this will be quite a narrow definition of data. We could have a discussion about what constitutes data – for example, do all data have to be coded in some way (eg as a number)? But I suggest you simply accept this distinction for the purposes of this session. We can discuss other terminology later. The main point is: If we want to improve student achievement, we can look at a lot more than what we traditionally think of as data.
8
Which factors are data? Evidence to consider before buying lunch
- how much money you have
- what you feel like eating
- what you’ll be having for dinner
- how far you need to go to buy food
- how much time you have
- where you’re going to eat
- what your diet allows

If you used the lunch decision scenario (slide 2) it’s worth thinking about what constitutes data. You should also consider any other factors you added earlier. Looking back at the lunch decision scenario – in terms of how we’ve defined ‘data’ and ‘other evidence’, which of these factors would we categorise as ‘data’?
9
Evidence-driven decision making
We have more evidence about what students know and can do than ever before – their achievements, behaviours, environmental factors that influence learning. We should draw on all our knowledge about the learning environment to:
- improve student achievement
- explore what lies behind patterns of achievement
- decide what changes will make a difference

This is a general introduction to evidence-driven decision making. It may not be necessary for groups that are already committed to this sort of approach. But groups intent on change should pause to consider how to minimise effort and risk. We all know that we can make significant improvements to teaching and learning by analysing data and applying what we learn from it. But an issue for many schools is that they have too much data in some areas – too much evidence – and not enough time and resources to use it all effectively. In other areas, we have too little evidence. Any change in teaching practice is a risk - you can never be entirely sure of the consequences. The approach we are looking at today shows us how to decide what we should change to improve student achievement. Evidence-driven decision making can help by reducing or assessing risk, and maybe by pointing out changes that have lesser risk.
10
What evidence does a school have?
Demographics, Student achievement, Perceptions, School processes, Other practice

This is an introductory slide – the next five slides deal with each category in turn. All schools have data about student achievement. To make the most of these data, we need to be aware of many other factors - evidence that describes our students’ wider learning environment. We have so much evidence that it’s useful to categorise it in some way. The approach we are taking today separates all data and other evidence into these five categories. If you have already done an exercise that uses data, you could use these categories to discuss what sort of evidence you used in the exercise – and what other evidence could be used to extend the example.
11
Demographics
What data do we have now to provide a profile of our school? What other data could we create?
- School
- Students
- Staff
- Parents/caregivers and community

Demographics – also known as Profile data – objective data that describe our school and its students, staff and community – decile, gender, suspensions, etc. The next slide provides examples for each bullet point – but first you could do this exercise: Divide into four groups. Take one category each and list our school’s existing data. Then list some other potential sources of data and evidence. That is, how could we collect or generate evidence that our school doesn’t have? You should make the point that data and other evidence should be generated only if it’s for a purpose.
12
Demographics Data that provide a profile of our school
School – decile, roll size, urban/rural, single sex or co-educational, teaching spaces … Students – ethnicity, gender, age, year level, attendance, lateness, suspension and other disciplinary data, previous school, part-time employment … Staff – gender, age, years of experience, qualifications, teaching areas, involvement in national curriculum and assessment, turnover rate … Parents/caregivers and community – socio-economic factors, breadth of school catchment, occupations … A suggested list for each bullet point in the previous slide.
13
Student achievement
What evidence do we have now about student achievement? What other evidence could we collect?
- National assessment results
- Standardised assessment results administered internally
- Other in-school assessments
- Student work

Student achievement data and other evidence - much of this is readily available – from national assessments, standardised testing we carry out in the school, portfolios of student work, etc. The next slide provides examples for each bullet point – but first you could do this exercise: Divide into four groups. Take one category each and list our school’s existing data. Then list some other potential sources of data and evidence. That is, how could we collect or generate evidence that our school doesn’t have? The next slide provides a suggested list.
14
Student achievement Evidence about student achievement
National assessment results - NCEA, NZ Scholarship - details like credits above and below year levels, breadth of subjects entered… Standardised assessment results administered internally - PAT, asTTle … Other in-school assessments - most non-standardised but some, especially within departments, will be consistent across classes - includes data from previous schools, primary/intermediate Student work - work completion rates, internal assessment completion patterns, exercise books, notes, drafts of material - these can provide useful supplementary evidence A suggested list for each bullet point in the previous slide.
15
Perceptions
What evidence do we have now about what students, staff and others think about the school? Are there other potential sources?
- Self appraisal
- Formal and informal observations made by teachers
- Structured interactions
- Externally generated reports
- Student voice
- Other informal sources

In many schools there will be little of this sort of evidence, so you might spend more time on this. Perceptions - evidence of what staff, students and others think about the school - probably the most subjective evidence but much of it will be factual and collected in formal ways - student self appraisal, formal and informal observations made by teachers, etc. Divide into four groups. Take one category each and list our school’s existing data. Then list some other potential sources of data and evidence. That is, how could we collect or generate evidence that our school doesn’t have? The next slide provides a suggested list.
16
Perceptions Evidence about what students, staff, parents and the community think about the school Self appraisal - student perceptions of their own abilities, potential, achievements, attitudes … Formal and informal observations made by teachers - peer interactions, behaviour, attitudes, engagement, student-teacher relationships, learning styles, classroom dynamics … Structured interactions - records from student interviews, parent interviews, staff conferences on students … Externally generated reports - from ERO and NZQA (these contain data but also perceptions) … Student voice - student surveys, student council submissions … Other informal sources – views about the school environment, staff and student morale, board perceptions, conversations among teachers … A suggested list for each bullet point in the previous slide.
17
School processes
What evidence do we have about how our school is organised and operates?
- Timetable
- Classes
- Resources
- Finance
- Staffing

Some teachers may not think of school processes as evidence that can be used in decision making – you might need to skip forward to later slides that provide examples. School processes - how our school is organised and operates – the timetable, resources, etc. Divide into four groups. Take one category each and list our school’s existing data. Then list some other potential sources of data and evidence. That is, how could we collect or generate evidence that our school doesn’t have? The next slide provides a suggested list.
18
School processes Evidence about how our school is organised and operates School processes - evidence and data about how your school is organised and operates, including: Timetable – structure, period length, placement of breaks, subjects offered, student choices, tertiary and workforce factors, etc Classes - how they are compiled, their characteristics, effect of timetable choices, etc Resources - access to libraries, text books, ICT, special equipment, etc Finance - how the school budget is allocated, how funds are used within departments, expenditure on professional development Staffing - policies and procedures for employing staff, allocating responsibility, special roles, workload, subjects and classes A suggested list for each bullet point in the previous slide.
19
Other practice How can we find out about what has worked (or not) in other schools? Other Practice – we should look at the experiences of others - documented academic research, the experiences of other schools, etc. It’s important that a school’s evidence-driven decision making benefits from the results of research and the experiences of other schools. Discuss sources of research evidence, how you can find out what other schools have done, etc. The next slide provides a suggested list.
20
Other practice How can we find out about what has worked in other schools? Documented research – university and other publications, Ministry of Education’s Best Evidence Syntheses, NZCER, NZARE, overseas equivalents … Experiences of other schools – informal contacts, local clusters, advisory services, TKI LeadSpace … A suggested list from the previous slide.
21
What can we do with evidence?
Shane’s story A history HOD wants to see whether history students are performing to their potential. She prints the latest internally assessed NCEA records for history students across all of their subjects. As a group, history students seem to be doing as well in history as they are in other subjects. Then she notices that Shane is doing very well in English and only reasonably well in history. She wonders why, especially as both are language-rich subjects with many similarities. The HOD speaks with the history teacher, who says Shane is attentive, catches on quickly and usually does all work required. He mentions that Shane is regularly late for class, especially on Monday and Thursday. So he often misses important information or takes time to settle in. He has heard there are ‘problems at home’ so has overlooked it, especially as the student is doing reasonably well in history. If you choose to use this scenario, you might like to print paper copies for distribution. It is shown here on two slides. You might prefer to replace Shane’s story with a scenario from your own school. Teachers are engaging every day in decision making that considers data and other evidence. This is a typical scenario - a teacher notices something interesting in a student’s achievement data and wonders if there is an explanation. Continued on next page of notes.
22
Shane’s story (cont...) The HOD looks at the timetable and discovers that history is Period 1 on Monday and Thursday. She speaks to Shane’s form teacher who says that she suspects Shane is actually late to school virtually every day. They look at centralised records. Shane has excellent attendance but frequent lateness to period 1 classes. The HOD speaks to the dean who explains that Shane has to take his younger sister to school each morning. He had raised the issue with Shane but he said this was helping the household get over a difficult period and claimed he could handle it. The staff involved agree that Shane’s regular lateness is having a demonstrable impact on his achievement, probably beyond history but not so obviously. The dean undertakes to speak to the student, history teacher, and possibly the parents to find a remedy for the situation. Continued from previous page of notes. The next slide suggests that you consider the key factors in this scenario.
23
Thinking about Shane’s story
What were the key factors in the scenario about Shane? What types of data and other evidence were used? What questions did the HOD ask? What happened in this case that wouldn’t necessarily happen in some schools? The main aim here is to get the group thinking about the categories of evidence discussed earlier: Demographics Student Achievement Perceptions School Processes Other Practice Key factors are suggested in the next slide.
24
Shane’s story - keys to success
The history HOD looked at achievement data in English and history. She looked for something significant across the two data sets, not just low achievement. Then she asked a simple question: Why is there such a disparity between these two subjects for that student? She sought information and comments (perceptions evidence and data) from all relevant staff. The school had centralised attendance and punctuality records (demographic data) that the form teacher could access easily. The action was based on all available evidence and designed to achieve a clear aim. A suggested list from the previous slide.
25
Evidence-driven strategic planning
If we use evidence-driven decision making to improve student achievement and enhance teaching practice … … it follows that strategic planning across the school should also be evidence-driven. If we use evidence-driven decision making to improve student achievement and enhance teaching practice, it follows that the school’s strategic planning should also be evidence-driven. An example follows on the next slide.
26
Evidence-driven strategic planning
INDICATORS FROM DATA
- asTTle scores show a high proportion of year 9 achieving below curriculum level
- NCEA results show high non-achievement in transactional writing
- Poor results in other language NCEA standards, etc.

STRATEGIC GOAL
To raise the levels of writing across the school
Strategic action: Develop a writing development plan which addresses writing across subjects and levels, including targets, professional development and other resourcing needs

ANNUAL PLAN
Develop and implement a plan to raise levels of writing at year 9. Development plan to be based on an analysis of all available data and to include a range of shared strategies.
YEAR TARGET: Raise writing asTTle results for year 9 boys from 3B to 3A
(Linked in the diagram to appraisal, PD, self review and the school charter)

EVALUATION DATA
- asTTle writing results improve by …
- Perception data from Yr 9 staff indicates …
- Evaluation of effectiveness of range of shared strategies, barriers and enablers … etc.

If you choose to use this chart, you might like to print paper copies for distribution. The diagram indicates one way to think of evidence-driven strategic planning. In this model, data from a range of sources provide ‘indicators’ of a problem - one aspect of student achievement (in this case, writing) stands out across the school as one that could be improved. This leads the Board to establish a strategic goal. They then add an appropriate aim and a target (with measurable outcomes) to the school’s annual plan. School leaders would then create a development plan. To do this they will need to go back to the indicator data and analyse these data alongside other data and evidence. Then the development plan is implemented. At the end of the year other data and evidence are analysed to evaluate the success of the development plan as a whole and the various strategies that were used. The data used for evaluation will probably be different from those used to identify the problem and develop the action plan. (In this case, for example, the current NCEA results were not relevant for Year 9 students, and other data was collected to evaluate some of the actions taken.)
27
The evidence-driven decision making cycle
Trigger – Explore – Question – Assemble – Analyse – Interpret – Intervene – Evaluate – Reflect

This slide introduces the stages of the evidence-driven decision making cycle. The next slide lists the headings again and gives simple explanations. The following two slides show these stages in cyclic diagrams. There is a slide for each heading in the Terminology section of this presentation, with the full explanation as in the script below. You might prefer to print handout copies of the text below or the cycle diagram in a later slide. Presentations 5, 6 and 7 deal with these nine stages in three groups.

It’s useful to think of an evidence-driven decision-making cycle as having sequential stages:

Trigger – Data, ideas, hunches, etc set a process in action. The trigger is whatever it is that makes you think there could be an opportunity to improve student achievement. You can routinely scan available data looking for inconsistencies, etc. It can be useful to speculate about possible causes or effects - and then explore data and other evidence to see if there are any grounds for the speculation.

Explore – Initial data, ideas or hunches usually need some preliminary exploration to pinpoint the issue and suggest good questions to ask.

Question – This is the key point: what question/s do you want answered. Questions can raise an issue and/or propose a possible solution.

Assemble – Get together all the data and evidence you might need - some will already exist and some will have to be generated for the occasion.

Analyse – Process sets of data and relate them to other evidence. You are looking for trends and results that will answer your questions (but watch out for unexpected results that might suggest a new question).

Interpret – Think about the results of the analysis and clarify the knowledge and insights you think you have gained. Interrogate the information. It’s important to look at the information critically. Were the data valid and reliable enough to lead you to firm conclusions? Do the results really mean what they seem to mean? How sure are you about the outcome? What aspects of the information lead to possible action?

Intervene – Design and implement a plan of action designed to change the situation you started with. Be sure that your actions are manageable and look at the resourcing needed. Consider how you’ll know what has been achieved.

Evaluate – Using measures you decided in advance, assess how successful the intervention has been. Has the situation that triggered the process been improved? What else happened that you maybe didn’t expect?

Reflect – Think about what has been learned and discovered. What did we do that worked? Did this process suggest anything that we need to investigate further? What aspects of the intervention can be maintained? What changes will we make to our practices? What support will we need?
28
The evidence-driven decision making cycle
Trigger – Clues found in data, hunches
Explore – Is there really an issue?
Question – What do you want to know?
Assemble – Get all useful evidence together
Analyse – Process data and other evidence
Interpret – What information do you have?
Intervene – Design and carry out action
Evaluate – What was the impact?
Reflect – What will we change?

The same list as in the previous slide, with brief explanations. The full explanation of each stage is as given in the notes for the previous slide.
29
The evidence-driven decision making cycle
Trigger – Data indicate a possible issue that could impact on student achievement
Speculate – A teacher has a hunch about a problem or a possible action
Explore – Check data and evidence to explore the issue
Question – Clarify the issue and ask a question
Assemble – Decide what data and evidence might be useful
Analyse – data and evidence
Interpret – Insights that answer your question
Intervene – Plan an action aimed at improving student achievement
Act – Carry out the intervention
Evaluate – the impact of the intervention
Reflect – on what has been learned, how practice will change

The same stages shown as a cycle. The length of the cycle will vary for different situations. We might wait a year to evaluate the effects of our actions - but sometimes we’ll be able to (and ought to) work to shorter (or maybe longer) cycles. It’s important that we reflect, evaluate and make professional judgements at each stage of this cycle. Now we will invent scenarios that might apply in our school (or department). Do this in groups. Draw up your scenario as a cycle as in this slide. The next slide provides a blank template for this exercise. You might like to photocopy this template for groups to fill in. A sample scenario is given in the following slide.
30
The evidence-driven decision making cycle
EXPLORE QUESTION ASSEMBLE ANALYSE TRIGGER EVALUATE INTERVENE INTERPRET SPECULATE ACT REFLECT This is a blank template for the exercise suggested with the previous slide. You might like to photocopy this template for groups to fill in. A sample scenario is given in the following slide.
31
The evidence-driven decision making cycle
Trigger – Significant numbers not achieving well in writing
Speculate – A teacher has a hunch: poor writers might spend little time on homework
Explore data – Survey of students shows that this is only partially true
Question – What are the characteristics of students who are poor at writing?
Assemble more data and other evidence – asTTle reading, homework, extracurric, attendance, etc.
Analyse – NQF/NCEA results by standard; analyse non-NQF/NCEA data and evidence
Interpret information – Poor writers likely to play sport, speak well, read less, do little homework
Intervene – Create multiple opportunities for writing; include topics that can use sport as context; connect speaking and writing. PD for staff.
Evaluate – Has writing improved?
Reflect – How will we teach writing in the future?

A sample scenario for the activity in the previous slide. You might prefer to print handout copies.
32
Evaluate and reflect
Summative evaluation – assess how successful the intervention was; decide how our practice will change; report to board.
Formative evaluation – at every stage in the cycle we reflect and evaluate: Are we on the right track? Do we need to fine-tune? Do we actually need to complete this?

This is a reminder that could be used at many points in the presentation. We should pause here and think about evaluation and reflection. The final stage in this cycle is summative evaluation - we assess how successful the whole process was and reflect on whether we will change our future practice. But at every stage in this cycle we should be reflecting, evaluating in a formative way and making professional judgements about where to go next. We need to be sure that we are on the right track. Should we fine-tune the process as we go? Many schools are consciously developing a ‘culture of inquiry’ – an open and supportive environment in which staff and the school community regularly reflect on the way the school operates, one in which calculated risk-taking is seen as an essential ingredient of innovation. A cyclical improvement process is iterative – incremental changes are incorporated into the knowledge base and into professional practice and feed into the next cycle. There is a compounding effect - change becomes the trigger for more questions. This resource can be used as a contribution to that approach.
33
Types of analysis We can compare achievement data by subject or across subjects for an individual student groups of students whole cohorts The type of analysis we use depends on the question we want to answer The following slides explain three of the most common types of analysis used in schools: Inter subject analysis Intra subject analysis Longitudinal analysis The aim here is to make a start on thinking about what you can do with data. It’s not a theory lesson. Teachers will be familiar with these three types of data analysis – we introduce them through discussing questions because asking the right questions is a major theme of this resource. These examples use only student achievement data in order to highlight the different approaches. The way we analyse data depends on the question we are trying to answer … let’s look at three examples.
34
Inter-subject analysis
Have my students not achieved a particular history standard because they have poor formal writing skills, rather than poor history knowledge? We can explore this question by cross-referencing the history results of individual students with their formal writing results in English. If the trend is for your students to do less well in formal writing than in other aspects of English and/or other aspects of history, the answer could be Yes.
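If your group wants to see what this cross-referencing could look like with a simple results export, here is a minimal sketch in Python (pandas). The student names, column names and the 0–3 grade coding are illustrative assumptions only – any school’s actual export will be laid out differently.

```python
import pandas as pd

# Hypothetical export: one row per student, with grades coded
# N = 0 (Not achieved), A = 1 (Achieved), M = 2 (Merit), E = 3 (Excellence).
grade_points = {"N": 0, "A": 1, "M": 2, "E": 3}

results = pd.DataFrame({
    "student": ["Aroha", "Ben", "Mei", "Sam"],
    "history_standard": ["A", "N", "M", "N"],         # the history standard in question
    "english_formal_writing": ["M", "N", "E", "A"],   # formal writing result in English
})

for col in ["history_standard", "english_formal_writing"]:
    results[col + "_pts"] = results[col].map(grade_points)

# Cross-reference: how does each student's formal writing compare with the history standard?
results["writing_minus_history"] = (
    results["english_formal_writing_pts"] - results["history_standard_pts"]
)
print(results[["student", "history_standard", "english_formal_writing", "writing_minus_history"]])

# Focus on the students who did not achieve the history standard: if their formal
# writing results are also weak, that supports (but does not prove) the idea that
# writing, rather than history knowledge, is the barrier.
print(results.loc[results["history_standard"] == "N",
                  ["student", "english_formal_writing"]])
```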
35
Intra-subject analysis
What are the areas of strength and weakness in my own teaching of this class? We could compare the externally assessed NCEA results of our students with results from appropriately matched schools using published national data. Where these differences are greater or less than average, areas of strength/weakness may exist. For example, we might find that, on average, your students have gained credits at a rate 5 percentage points better than the comparison schools. But in one standard the difference is 15 points, indicating a possible area of strength. In another standard, there is zero difference, indicating a possible area of weakness. This may be a good time to discuss NCEA data. It has been described as ‘both rich and subtle’. Your school’s NCEA data give you access to a huge amount of fine-grained information. You can also aggregate NCEA data to show trends, etc – but you need to be careful that in aggregating the data you don’t lose the subtlety and even produce misleading information.
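As a rough illustration of the arithmetic behind this comparison, here is a minimal pandas sketch. The standard codes and percentages are invented for illustration; in practice you would use your own results alongside the published rates for appropriately matched schools.

```python
import pandas as pd

# Hypothetical credit rates (% of entries gaining the standard) for this class
# and for an appropriately matched comparison group.
df = pd.DataFrame({
    "standard": ["AS91001", "AS91002", "AS91003", "AS91004"],
    "class_pct": [78, 85, 70, 66],
    "comparison_pct": [73, 70, 70, 62],
})

# Difference per standard, and the class's average advantage across all standards.
df["difference"] = df["class_pct"] - df["comparison_pct"]
average_gap = df["difference"].mean()

# Standards sitting well above the average gap suggest possible strengths;
# standards at or below zero suggest possible weaknesses worth exploring.
df["vs_average_gap"] = df["difference"] - average_gap
print(df.sort_values("vs_average_gap", ascending=False))
```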
36
Longitudinal analysis
Are we producing better results over time in year 11 biology? We can compare NCEA Biology results for successive year 11 cohorts at this school with the national cohorts for successive years. But the results for the national cohorts might be improving too. So we need to work out how the school’s cohort differs from the national cohort in each year. If the school’s rate of change is better than the changes for the national cohort, the answer could be Yes. To be sure that our teaching is producing this improvement, we should look at the overall levels of achievement of each cohort – otherwise, any improvement could be a result of more able students, rather than better teaching. You could discuss longitudinal analysis of student performance. That would require comparable assessments over successive years – NCEA results at successive levels probably would not suffice for this.
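Here is a minimal sketch of this kind of cohort-over-time comparison, again with invented numbers: the point is simply to compare the school’s year-on-year change with the national change, rather than looking at the school’s results in isolation.

```python
import pandas as pd

# Hypothetical percentage of each year 11 cohort achieving the biology standards,
# for this school and nationally, over successive years.
df = pd.DataFrame({
    "year": [2019, 2020, 2021, 2022],
    "school_pct": [61, 64, 69, 73],
    "national_pct": [63, 64, 66, 67],
})

# Year-on-year change for the school and for the national cohort.
df["school_change"] = df["school_pct"].diff()
df["national_change"] = df["national_pct"].diff()

# If the school's change consistently exceeds the national change, the answer
# could be Yes - subject to checking that successive cohorts are of comparable
# overall ability (otherwise the 'improvement' may just reflect abler students).
df["relative_change"] = df["school_change"] - df["national_change"]
print(df)
```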
37
The evidence-driven decision making cycle
> Trigger Clues found in data, hunches Explore Is there really an issue? Question What do you want to know? Assemble Get all useful evidence together Analyse Process data and other evidence Interpret What information do you have? Intervene Design and carry out action Evaluate What was the impact? Reflect What will we change? This slide will reappear throughout (with each step in > bold) to identify what stage of the cycle you are at. Let’s start now at the beginning of the cycle – what triggers the process?
38
Asking questions Evidence-driven decision making starts with asking good questions You can tell whether a man is clever by his answers. You can tell whether he is wise by his questions. Nobel Prize winner, Naguib Mahfouz Questions are the major theme of this resource. There are probably too many slides here for you to use, but it’s worth spending some time exploring how to get your question right before you start on any data analysis. The quality of any analysis depends on the quality of the question we want answered. So it’s worth spending some time thinking about getting our questions right. Before we start any analysis, we need to write down the questions we want to answer. The questions we ask will determine the selection of evidence and how we analyse it. Where will our questions come from?
39
Formative or summative?
Trigger questions How good/poor is …? What aspects of … are good/poor? Is … actually changing? How is … changing? Is … better than last year? How can … be improved? Why is … good/poor? What targets are reasonable for …? What factors influence the situation for …? What would happen if we …? Formative or summative? Questions that trigger the process can relate to student achievement or behaviour, teaching approaches and school processes - like the ones on this slide. Questions can be described as summative or formative. Let’s think about the questions on this slide - which of them are summative and which are formative. You could have a discussion about the purpose of summative and formative questions – when is each type of question useful? Summative questions give us end-of-process results, often suitable for reporting and accountability. Formative questions are intended to provide more immediate feedback to improve teaching and learning, so they are probably more specific.
40
Summative questions A target in the school’s annual plan is for all year 10 boys to improve their writing level by at least one level using asTTle (e.g. from 4B to 4A). Have all year 10 boys improved by at least one asTTle level in writing? This slide is about using data and other evidence for strategic planning, as mentioned in earlier slides. Many summative questions will relate to the goals and targets in our school’s strategic plan for improving student achievement - or to goals set within teaching departments or faculties. Most of these questions will be obvious and decided in advance - especially if our strategic goals have been decided on the basis of analysing data. Of course, achieving targets like this relies on deciding how to improve how we teach writing – that’s what formative questions are all about …
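If it helps to make the target concrete, here is a small illustrative check in Python. The names, the term 1 and term 4 sublevels, and the coding of asTTle sublevels (B, P, A within each curriculum level) onto an ordinal scale are all assumptions; adjust the threshold to match however the school defines ‘one level’ of improvement.

```python
import pandas as pd

def asttle_score(sublevel: str) -> int:
    """Code an asTTle sublevel such as '4B' onto an ordinal scale so that
    each step up a sublevel (4B -> 4P -> 4A -> 5B ...) adds one point."""
    level, band = int(sublevel[:-1]), sublevel[-1]
    return level * 3 + {"B": 0, "P": 1, "A": 2}[band]

# Hypothetical term 1 and term 4 writing sublevels for year 10 boys.
boys = pd.DataFrame({
    "student": ["Tane", "Liam", "Josh"],
    "term1": ["4B", "4P", "3A"],
    "term4": ["4A", "4A", "4B"],
})

boys["improvement"] = boys["term4"].map(asttle_score) - boys["term1"].map(asttle_score)

# The target asks whether ALL year 10 boys improved by at least one level;
# here 'one level' is treated as one sublevel step - change the threshold if
# the school's target means something stricter (e.g. 4B to 4A).
target_met = (boys["improvement"] >= 1).all()
print(boys)
print("Target met for all year 10 boys:", target_met)
```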
41
Questions about policy
We have been running 60-minute periods for 5 years now. What effect has the change had? Many of our questions will be aimed at evaluating a particular policy or procedure, especially if there has been a change.
42
Formative questions from data
The data suggest our students are achieving well in A, but less well in B. What can we do about that? Some questions will come from our routine consideration of available data and evidence or as a consequence of internal or external reviews: national assessment results, asTTle results for a cohort, information from contributing schools, ERO reports, etc.
43
Formative questions from data
A significant proportion of our school leavers enrol in vocational programmes at polytechnic or on-job. How well do our school programmes prepare those students? Some questions will come from our consideration of data and evidence about student destinations. This question should lead us to look back at our school processes data.
44
Questions from hunches
I suspect this poor performance is being caused by … Is this true? We reckon results will improve if we put more effort into ... Is this likely? I think we’d get better results from this module if we added … Is there any evidence to support this idea? The colloquial term hunch is used here to recognise how intuitive teachers can be. The aim is not to belittle hunches. They are extremely useful. In fact most hunches are based on sound professional experience and observation. But until they have been tested against evidence, they remain hunches. In terms of improving particular aspects of teaching and learning, many of the most pertinent questions come from our ‘hunches’. Some of our hunches will be based on a hypothesis or a speculation about a possible change. We’re using the casual term hunch here, but this does not belittle hunches. Teachers, like detectives, base hunches on professional observations – it’s just that you haven’t yet tested them against some evidence.
45
Hunches from raw data This slide is simply to point out that teachers routinely look at raw data and start to get hunches. We return to this table in the Analyse section of the presentation. This table is a set of results for year 12 students in one subject. Most of us would get hunches about trends and issues from a quick read of raw data like this. What trends and issues can we get from just scanning this table? Until careful analysis is done, it’s best to pose these hunches as questions. Some questions are suggested in the next slide.
46
Hunches from raw data Is the class as a whole doing better in internally assessed standards than in externally assessed standards? If so, why? Are the better students (with many Excellence results) not doing as well in external assessments as in internal? If so, why? Is there any relationship between absences and achievement levels? It seems not, but it’s worth analysing the data to be sure. Hunches (expressed as questions) from the previous slide. We return to this table in the Analyse section of the presentation.
47
The evidence-driven decision making cycle
Trigger Clues found in data, hunches > Explore Is there really an issue? Question What do you want to know? Assemble Get all useful evidence together Analyse Process data and other evidence Interpret What information do you have? Intervene Design and carry out action Evaluate What was the impact? Reflect What will we change? Before we rush into analysing evidence based on a hunch, we need to pause and explore – we should ask ourselves: Is there really an issue here? Do we really know what the issue is?
48
Question – Explore – Question
It looks like our students are doing well in A but not in B. What can we do about it? EXPLORE … what else should we be asking? Is this actually the case? Is there anything in the data to suggest what we could do about it? Our initial question may be general and tentative – often based on a quick look at data. Additional evidence and professional judgement may be needed to be sure that we are onto something. This should lead us to new questions. As we start to explore data and other evidence new questions will arise. In this case, we should do some preliminary analysis of the data to be sure that our impression is justified – then we can look again at the data for possible solutions. Or we can base our actions on a hunch (as in the previous slide). Take the questions from hunches in the previous slide and turn them into multi-barrelled questions that require preliminary exploration before a specifically targeted question is finalised. Another possible multi-barrelled question is in the next slide.
49
Question – Explore – Question
We have been running 60-minute periods for a year now. Did the change achieve the desired effects? EXPLORE … what else should we be asking? How has the change impacted on student achievement? Has the change had other effects? Is there more truancy? Is more time being spent in class on assignments, rather than as homework? Here’s another example of how we should explore questions before we start to look for evidence. The assumption here is that the change to 60-minute periods was intended to have specific results.
50
The evidence-driven decision making cycle
Trigger Clues found in data, hunches Explore Is there really an issue? > Question What do you want to know? Assemble Get all useful evidence together Analyse Process data and other evidence Interpret What information do you have? Intervene Design and carry out action Evaluate What was the impact? Reflect What will we change? This is the most crucial stage – to make good evidence-based decisions we need to ask useful questions.
51
A very good question:
- Specific and with a clear purpose
- Able to be investigated through looking at data and other evidence
- Likely to lead to information on which we can act

A very good question in evidence-driven decision making is one that will ultimately help us to make a good decision.
52
Questions with purpose
What do we know about reported bullying incidents for year 10 students? MAY BE BETTER AS Who has been bullying whom? Where? What are students telling us? What does pastoral care data tell us? Were some interventions more effective with some groups of students than others? The initial question can be answered quite easily but what use will the answer be? More purposeful questions are likely to lead to information we can act on.
53
Write more purposeful questions
What are the attendance rates for year 11 students? What has been the effect of the new 6-day x 50-min period structure? How well are boys performing in formal writing in year 9? What has been the effect of shifting the lunch break to after period 4? Consider the questions marked in this slide. Write more purposeful questions. You might use just one or two of these – or divide into groups. Some more purposeful questions are offered on the next slide.
54
More purposeful questions
How do year 11 attendance rates compare with other year levels? Do any identifiable groups of year 11 students attend less regularly than average? Is the new 6-day x 50-min period structure having any positive effect on student engagement levels? Is it influencing attendance patterns? What do students say? Should we be concerned about boys’ writing? If so, what action should we be taking to improve the writing of boys in terms of the literacy requirements for NCEA Level 1? The new timing of the lunch break was intended to improve student engagement levels after lunch. Did it achieve this? If so, did improvements in student engagement improve student achievement? Do the benefits outweigh any disadvantages? Some more purposeful questions from the previous slide.
55
The evidence-driven decision making cycle
Trigger Clues found in data, hunches Explore Is there really an issue? Question What do you want to know? > Assemble Get all useful evidence together Analyse Process data and other evidence Interpret What information do you have? Intervene Design and carry out action Evaluate What was the impact? Reflect What will we change? Once we have a very good question, we need to consider what data and other evidence will help us answer it.
56
Assembling the evidence
We want to know if our senior students are doing better in one area of NCEA biology than another. So … we need NCEA results for our cohort. It could be that all biology students do better in this area than others. So … we also need data about national differences across the two areas. Often the data we need will be obvious. But we need to make sure that we have looked at all angles, to ensure we have all the data we need to draw valid conclusions.
57
Are our data any good? A school found that a set of asTTle scores indicated that almost all students were achieving at lower levels than earlier in the year. Then they discovered that the first test had been conducted in the morning, but the later test was in the afternoon and soon after the students had sat a two-hour exam. We should always think critically about the available data and other evidence before we decide to analyse it. If we find the results of a test a bit surprising, we should look closely at the test itself - was it set at an appropriate level for that group of students? Ask questions about how tests were administered. This might seem an unlikely scenario, but it apparently happened.
58
Think critically about data
Was the assessment that created this data assessing exactly what we are looking for? Was the assessment set at an appropriate level for this group of students? Was the assessment properly administered? Are we comparing data for matched groups? These questions are really about validity and reliability. We can make commonsense judgements about whether data is valid for our purposes and whether it was created under conditions we can rely on.
59
Cautionary tale 1 You want to look at changes in a cohort’s asTTle writing levels over 12 months. Was the assessment conducted at the same time both years? Was it administered under the same conditions? Has there been high turnover in the cohort? If so, will it be valid to compare results? We should always ask if there are data-related factors that might have a bearing on the issue in question.
60
What could have caused this?
Cautionary tale 2 You have data that show two classes have comparable mathematics ability. But end-of-year assessments show one class achieved far better than the other. What could have caused this? Was the original data flawed? How did teaching methods differ? Was the timetable a factor? Did you survey student views? Are the classes comparable in terms of attendance, etc? We need to take care when we think we have matched groups.
61
The evidence-driven decision making cycle
Trigger Clues found in data, hunches Explore Is there really an issue? Question What do you want to know? Assemble Get all useful evidence together > Analyse Process data and other evidence Interpret What information do you have? Intervene Design and carry out action Evaluate What was the impact? Reflect What will we change? We now move on to what we do with data and other evidence – what we normally think of as ‘data analysis’.
62
Analysing data and other evidence
Schools need some staff members who are responsible for leading data analysis. Schools have access to electronic tools to process data into graphs and tables. All teachers do data analysis. Data analysis is not an end in itself – it’s one of the many stages along the way to evidence-driven decision making. The intent here is to stress that data analysis is just one step in a process (not even the most crucial one) and that every teacher carries out some data analysis. All schools have access to tools that will generate, for example, gender and ethnicity comparisons, intra-subject analyses, comparisons with national results.
63
Basic analysis All teachers do some data analysis.
This set of results for year 12 students in one subject can be ‘cut’ in many ways, even without access to statistical tools. Suggest some ways we could analyse these data without using statistical tools. Some ways to cut these data are suggested in the next slide.
64
Basic analysis
- Divide the class into three groups on the basis of overall achievement
- Identify students who are doing so well at level 2 that they could be working at a higher level
- Find trends for males and females, those who are absent often, or have many detentions
- Compare this group’s external assessment success rate with the national cohort

Some ways to cut the data from the previous slide.
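For groups that want a concrete starting point, here is a minimal pandas sketch of the first and third of these cuts. The flat results table (student, gender, credits, absences) is an invented stand-in for whatever export your student management system produces.

```python
import pandas as pd

# Invented stand-in for a flat results export: one row per year 12 student.
df = pd.DataFrame({
    "student":  ["A", "B", "C", "D", "E", "F"],
    "gender":   ["M", "F", "F", "M", "F", "M"],
    "credits":  [42, 68, 55, 30, 74, 51],
    "absences": [12, 3, 5, 20, 1, 8],
})

# Divide the class into three groups on the basis of overall achievement.
df["achievement_group"] = pd.qcut(df["credits"], 3, labels=["lower", "middle", "upper"])

# Simple trends by gender and by achievement group.
print(df.groupby("gender")["credits"].mean())
print(df.groupby("achievement_group", observed=False)["absences"].mean())
```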
65
Reading levels – terms 1 and 4
This table is a simple example of an application of an analysis tool that any teacher could use - it shows how reading levels have changed within the school year. Any teacher could interpret the information in the table. But (as we will see later) we need to be careful about the conclusions we reach. This example uses asTTle results – level 5B is higher than level 4B. It should not be a problem if teachers are not familiar with asTTle.
66
Making sense of the results
Think about significance and confidence How significant are any apparent trends? How much confidence can we have in the information? Even if we don’t fully understand how the analysis was carried out, we should still think critically about the results. Ask the person who did the analysis what has been done, what they think the results show, what limitations they would place on the results. Significance and confidence are key issues.
67
Making sense of the results
This table shows that reading levels overall were higher in term 4 than in term 1. Scores improved for most students. 20% of students moved into level 5. But the median score is still 4A. You might need to flick back to the previous slide to discuss this slide. This is useful summative information for reporting against targets and as general feedback to teachers. But it’s not information we could act on. In fact, is it actually information? In a sense, this is still data - a graphical representation of two sets of related data. Is this information? Can we act on it?
68
Information Knowledge gained from analysing data and making meaning from evidence. Information is knowledge (or understanding) that can inform your decisions. How certain you will be about this knowledge depends on a number of factors: where your data came from, how reliable it was, how rigorous your analysis was. So the information you get from analysing data could be a conclusion, a trend, a possibility. This is a slide from the Terminology section. You need to decide how much this is worth discussing. Let’s think about what counts as information.
69
Information Summative information is useful for reporting against targets and as general feedback to teachers. Formative information is information we can act on – it informs decision-making that can improve learning. This slide looks at the same distinctions made earlier about evaluation and questions.
70
Questions to elicit information
Did the more able students make significant progress, but not the lower quartile? How have the scores of individual students changed? How many remain on the same level? How much have our teaching approaches contributed to this result? How much of this shift in scores is due to students’ predictable progress? Is there any data that will enable us to compare our students with a national cohort? How does this shift compare with previous Year 9 cohorts? This slide refers to the asTTle graph shown earlier – this graph is repeated in the next slide. To get formative information from the asTTle graph to inform decision-making we’d need to go further. We need to ask more questions. You can’t answer these questions from the graph. We’d need to disaggregate the data. Then we’ll have information we can act on. (In fact, this graph has been created by aggregating separate asTTle results - so you’d probably go back to the original data and aggregate it in different ways.)
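One way to make these questions answerable is to go back to the per-student results and disaggregate them, as in this rough pandas sketch. The sublevels, the ordinal coding and the use of term 1 quartiles are illustrative assumptions only.

```python
import pandas as pd

# Ordinal coding of asTTle reading sublevels so that shifts can be compared.
SUBLEVELS = ["3B", "3P", "3A", "4B", "4P", "4A", "5B", "5P", "5A"]
score = {s: i for i, s in enumerate(SUBLEVELS)}

# Hypothetical per-student reading sublevels for terms 1 and 4.
df = pd.DataFrame({
    "student": list("ABCDEFGH"),
    "term1": ["3A", "4B", "4P", "4A", "4B", "3P", "4A", "4P"],
    "term4": ["4B", "4B", "4A", "5B", "4P", "3P", "5P", "4A"],
})

# How much has each individual student shifted, and how many stayed put?
df["shift"] = df["term4"].map(score) - df["term1"].map(score)
print("Students remaining on the same sublevel:", (df["shift"] == 0).sum())

# Did the lower quartile move as much as the more able students?
df["term1_quartile"] = pd.qcut(df["term1"].map(score), 4, labels=["Q1", "Q2", "Q3", "Q4"])
print(df.groupby("term1_quartile", observed=False)["shift"].mean())
```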
71
Reading levels – terms 1 and 4
This slide refers to the questions on the previous slide. To get formative information from the asTTle graph to inform decision-making we’d need to go further. We need to ask more questions. We’d need to disaggregate the data. Then we’ll have information we can act on. (In fact, this graph has been created by aggregating separate asTTle results - so you’d probably go back to the original data and aggregate it in different ways.)
72
Words, words, words … Information can … establish, indicate, confirm, reinforce, back up, stress, highlight, state, imply, suggest, hint at, cast doubt on, refute … Does this confirm that …? What does this suggest? What are the implications of …? How confident are we about this conclusion? It’s worth thinking about precisely how we express information, especially in relation to the question that triggered the analysis. The verbs we choose reflect the confidence we have in the information. You can even see a hierarchy. When we interrogate the information we use even more questions. The answers will often come from our professional judgement.
73
The evidence-driven decision making cycle
Trigger Clues found in data, hunches Explore Is there really an issue? Question What do you want to know? Assemble Get all useful evidence together Analyse Process data and other evidence > Interpret What information do we have? Intervene Design and carry out action Evaluate What was the impact? Reflect What will we change? The analysis has been completed – but what information do we have? In fact, what do we mean by ‘information’?
74
Making sense of information
Data becomes information when it is categorised, analysed, summarised and placed in context. Information therefore is data endowed with relevance and purpose. Information is developed into knowledge when it is used to make comparisons, assess consequences, establish connections and engage in dialogue. Knowledge … can be seen as information that comes laden with experience, judgment, intuition and values. Empson (1999) cited in Mason (2003)
75
Interrogate the information
Is this the sort of result we envisaged? If not, why? How does this information compare with the results of other research or the experiences of other schools? Are there other variables that could account for this result? Should we set this information alongside other data or evidence to give us richer information? What new questions arise from this information? There are two slides showing questions we could ask of the information. Then there is a slide containing an activity. You might prefer to do the activity first. We need to interrogate the information generated by analysing data and evidence. First, we think about how confident we are about the result, by asking questions like this.
76
Interrogate the information
Does this relate to student achievement - or does it actually tell us something about our teaching practices? Does this information suggest that the school’s strategic goals and targets are realistic and achievable? If not, how should they change, or should we change? Does the information suggest we need to modify programmes or design different programmes? Does the information suggest changes need to be made to school systems? There are two slides showing questions we could ask of the information. Then there is a slide containing an activity. You might prefer to do the activity first. We should ask questions like this.
77
Interrogate the information
What effect is the new 6-day x 50-min period structure having on student engagement levels? Imagine we asked this question. When we see the information resulting from data analysis, what questions might we ask? This is an activity to demonstrate the sort of questioning covered in the two previous slides. Some suggestions are offered in the next slide. You could use just the next slide and omit the activity.
78
Interrogate the information
What effect is the new 6-day x 50-min period structure having on student engagement levels? Do student views align with staff views? Do positive effects outweigh negative effects? Is there justification for reviewing the policy? Does the information imply changes need to be made to teaching practices or techniques? Does the information offer any hint about what sort of changes might work? Some suggestions for the activity in the previous slide.
79
The evidence-driven decision making cycle
Trigger Clues found in data, hunches Explore Is there really an issue? Question What do you want to know? Assemble Get all useful evidence together Analyse Process data and other evidence Interpret What information do you have? > Intervene Design and carry out action Evaluate What was the impact? Reflect What will we change? We have asked questions, analysed data and other evidence and have some information we can act on. Now we decide what we will change.
80
Professionals making decisions
How do we decide what action to take as a result of the information we get from the analysis? We use our professional judgment. This is an introductory slide – the aim is simply to point out that the analysis will not always point to an obvious intervention. Teachers still need to make professional decisions. What we decide to do as a result of the information we get from the analysis is guided by our professional experience.
81
Professional decision making
We have evidence-based information that we see as reliable and valid. What do we do about it? If the information indicates a need for action, we use our collective experience to make a professional decision. What we decide to do as a result of the information we get from the analysis is guided by our professional experience.
82
Professionals making decisions
Have my students not achieved a particular history standard because they have poor formal writing skills, rather than poor history knowledge? The answer was ‘Yes’ ... so I need to think about how to improve their writing skills. How will I do that? First, we go back to our trigger point - to the question we asked that led to the analysis. In this case, the question was quite simple and the information convincing – so what we have to achieve has been established. Exactly how we do that is the professional decision.
83
Professionals making decisions
Do any particular groups of year 11 students attend less regularly than average for the whole cohort? The analysis identified two groups – so I need to think about how to deal with irregular attendance for each group. How will I do that? Another example: The question we asked that led to the analysis was simple enough and the resulting information identifies where the action should be applied. Exactly what we do is the professional decision.
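A minimal sketch (invented attendance figures and group names) of the kind of disaggregation behind this example: compare each group’s mean attendance with the cohort mean to flag groups attending less regularly than average.

```python
# Minimal sketch (invented data): which groups of year 11 students attend
# less regularly than the cohort average?
import pandas as pd

roll = pd.DataFrame({
    "student":    ["S1", "S2", "S3", "S4", "S5", "S6", "S7", "S8"],
    "form_class": ["11A", "11A", "11B", "11B", "11C", "11C", "11C", "11A"],
    "gender":     ["F", "M", "F", "M", "F", "M", "F", "M"],
    "attendance": [0.96, 0.81, 0.92, 0.67, 0.94, 0.73, 0.89, 0.95],  # proportion of half-days attended
})

cohort_mean = roll["attendance"].mean()
print(f"Cohort mean attendance: {cohort_mean:.2f}")

# Flag any grouping whose mean attendance falls below the cohort mean.
for grouping in ["form_class", "gender"]:
    by_group = roll.groupby(grouping)["attendance"].mean()
    below = by_group[by_group < cohort_mean]
    if not below.empty:
        print(f"{grouping} groups below the cohort mean:")
        print(below.round(2))
```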
84
Professionals making decisions
You asked what factors are related to poor student performance in formal writing. The analysis suggested that poor homework habits have a significant impact on student writing. You make some professional judgements and decide Students who do little homework don’t write enough You could take action to improve homework habits - but you’ve tried that before and the success rate is low You have more control over other factors – like how much time you give students to write in class So you conclude – the real need is to get students to write more often Teachers decided the cause of the poor performance in writing was not homework habits but the total amount of writing students do. Professional judgments led to the conclusion that the action required is to ensure that students do more writing. NOTE – This resource does not provide advice on writing an action plan – go to the Consider the Evidence web pages for resources on writing action plans.
85
Deciding on an action Information will often suggest a number of options for action. How do we decide which action to choose? We need to consider what control we have over the action the likely impact of the action the resources needed These are the factors we need to consider when we are planning an intervention: Control - What aspects of the situation do we have most control over? Do we run a limited pilot rather than a full-scale intervention? Impact - What can we do that is most likely to have the desired impact? Do we play it safe and intervene only where we know we can make a major difference? Resources - time, money, people - What will we need? What do we have? What other activities could be affected if we divert resources to this project?
86
Planning for action Is this a major change to policy or processes?
What other changes are being proposed? How soon can you make this change? How will you achieve wide buy-in? What time and resources will you need? Who will co-ordinate and monitor implementation? More factors to consider when we are planning an intervention.
87
Planning for action Is this an incremental change? Or are you just tweaking how you do things? How will you fit the change into your regular work? When can you start the intervention? Will you need extra resources? How will this change affect other things you do? How will you monitor implementation? More factors to consider when we are planning an intervention.
88
Timing is all How long should we run the intervention before we evaluate it? When is the best time of the year to start (and finish) in terms of measuring changes in student achievement? How much preparation time will we need to get maximum benefit? Remember that we are carrying out this action to see what impact it has. If our evaluation of the impact is to be meaningful, when we do it could be crucial.
89
Planning for evaluation
We are carrying out this action to see what impact it has on student achievement We need to decide exactly how we’ll know how successful the intervention has been To do this we will need good baseline data We need to decide in advance how we will evaluate the impact. To do this we will need valid and reliable baseline evidence. The baseline evidence we need for evaluation might be different from the data we analysed earlier in this project.
90
Planning for evaluation
What evidence do we need to collect before we start? Do we need to collect evidence along the way, or just at the end? How can we be sure that any assessment at the end of the process will be comparable with assessment at the outset? How will we monitor any unintended effects? Don’t forget evidence such as timetables, student opinions, teacher observations … Some questions to consider as we think about the data we will need to evaluate the impact of our intervention.
91
The evidence-driven decision making cycle
Trigger Clues found in data, hunches Explore Is there really an issue? Question What do you want to know? Assemble Get all useful evidence together Analyse Process data and other evidence Interpret What information do you have? Intervene Design and carry out action > Evaluate What was the impact? Reflect What will we change? We now move to the final stages of the processes – we have carried out the intervention and evaluated its impact against baseline data.
92
Evaluate the impact of our action
Did the intervention improve the situation that triggered the process? If the aim was to improve student achievement, did that happen? Now we have to decide how effective the intervention has been. These are the central questions.
93
Evaluate the impact of our action
Was any change in student achievement significant? What else happened that we didn’t expect? How do our results compare with other similar studies we can find? Does the result give us the confidence to make the change permanent? When we evaluate the impact of our intervention, we need to ask the same sort of questions that we asked earlier in the process. That is, we need to interrogate our evaluation. We must not leap to conclusions. The final question is the crucial one – has the intervention been effective enough to justify incorporating the change into our normal practice?
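Where the baseline and end-point measures are matched per student, a simple paired comparison is one way to ask whether the change is more than noise. This is a minimal sketch with invented scores; the paired t-test (via scipy.stats.ttest_rel, if SciPy happens to be available) is only a conventional statistical check and does not settle whether the change is educationally significant.

```python
# Minimal sketch (invented data): was the change in matched student scores
# between baseline and follow-up big enough to take seriously?
from statistics import mean, stdev

baseline  = [42, 55, 61, 48, 39, 66, 58, 50]   # pre-intervention scores
follow_up = [47, 58, 60, 55, 46, 69, 63, 54]   # post-intervention scores (same students, same order)

gains = [after - before for before, after in zip(baseline, follow_up)]
mean_gain = mean(gains)
print(f"Mean gain: {mean_gain:.1f} points")
print(f"Gain relative to baseline spread: {mean_gain / stdev(baseline):.2f}")

# Optional conventional check, if SciPy is installed.
try:
    from scipy.stats import ttest_rel
    result = ttest_rel(follow_up, baseline)
    print(f"Paired t-test p-value: {result.pvalue:.3f}")
except ImportError:
    pass
```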
94
Evaluate the impact of our action
A school created a new year 13 art programme. In the past students had been offered standard design and painting programmes, internally and externally assessed against the full range of achievement standards. Some students had to produce two folios for assessment and were unsure of where to take their art after leaving school. The new programme blended drawing, design and painting concepts and focused on electronic media. Assessment was against internally assessed standards only. Where we simply change our teaching approach, we’ll compare student achievement data to evaluate the impact of the change. But more radical changes are more difficult to evaluate - a wider range of evaluation approaches will be needed. How could the school evaluate the impact of this change? Suggestions on the next slide.
95
Evaluate the impact of our action
Did students complete more assessments? Did students gain more national assessment credits? How did student perceptions of workload and satisfaction compare with teacher perceptions from the previous year? Did students leave school with clearer intentions about where to go next with their art than the previous cohort? How did teachers and parents feel about the change? Some suggestions from the previous slide. Bullet 2: If the school had decided on the change earlier they could have collected student perceptions evidence from the previous cohort. Bullet 3: For this measure the school was able to collect evidence from the previous cohort.
96
Evaluate the intervention
How well did we design and carry out the intervention? Would we do anything differently if we did it again? Were our results affected by anything that happened during the intervention period - within or beyond our control? Did we ask the right question in the first place? How useful was our question? How adequate were our evaluation data? Whether the answer to the final question on the previous slide (Does the result give us the confidence to make the change permanent?) is Yes or No, we should think about the intervention itself. These questions apply particularly if your intervention seems not to have worked – maybe there was a reason for that. Maybe it was not a complete waste of time.
97
Think about the process
Did we ask the right question in the first place? How useful was our question? Did we select the right data? Could we have used other evidence? Did the intervention work well? Could we have done anything differently? Did we interpret the data-based information correctly? How adequate were our evaluation data? Did the outcome justify the effort we put into it? Before we move on, we should look back over the process to see what we can learn from it.
98
The evidence-driven decision making cycle
Trigger Clues found in data, hunches Explore Is there really an issue? Question What do you want to know? Assemble Get all useful evidence together Analyse Process data and other evidence Interpret What information do you have? Intervene Design and carry out action Evaluate What was the impact? > Reflect What will we change? This is the final step: we have trialed a change to our practice – how much of that can we embed in future practice?
99
Future practice What aspects of the intervention will we embed in future practice? What aspects of the intervention will have the greatest impact? What aspects of the intervention can we maintain over time? What changes can we build into the way we do things in our school? Would there be any side-effects? Even if things didn’t go exactly as we planned them – even if student achievement wasn’t greatly improved - there are probably some things we have learnt that we should incorporate into future practice. We need to be realistic about what we can achieve – we need to be sure we could maintain the intervention. We should also think about any side-effects – if we put time and effort into this change, will anything else suffer?
100
Future directions What professional learning is needed? Who would most benefit from it? Do we have the expertise we need in-house or do we need external help? What other resources do we need? What disadvantages could there be? When will we evaluate this change again? Before we decide to go ahead and embed aspects of the intervention into our practice, we need to look at all the ramifications – and think about starting the cycle all over again … End of presentation. The next step should be for the group to discuss how this model can be applied in this school, department or faculty. This should include a discussion about what evidence already exists in the school, how this is collected and recorded, and how well equipped the school is to analyse and use it in the interests of improving student achievement. Finally, the group should consider what evidence-driven projects the school could undertake. Beware of collecting data and other evidence that you might not need. If the school thinks ahead about how to make evidence-based decisions, you will know what data and other evidence you should collect.
101
Consider the Evidence Terminology
102
Terminology Terminology used in the
evidence-driven decision making cycle Trigger Clues found in data, hunches Explore Is there really an issue? Question What do you want to know? Assemble Get all useful evidence together Analyse Process data and other evidence Interpret What information do you have? Intervene Design and carry out action Evaluate What was the impact? Reflect What will we change? The following slides discuss and define each term.
103
Trigger Data, ideas, hunches, etc that set a process in action.
The trigger is whatever it is that makes you think there could be an opportunity to improve student achievement. You can routinely scan available data looking for inconsistencies, etc. It can be useful to speculate about possible causes or effects - and then explore data and other evidence to see if there are any grounds for the speculation. .
104
Explore Initial data, ideas or hunches usually need some preliminary exploration to pinpoint the issue and suggest good questions to ask. .
105
Question This is the key point: what question/s do you want answered? Questions can raise an issue and/or propose a possible solution.
106
Assemble Get together all the data and evidence you might need – some will already exist and some will have to be generated for the occasion. .
107
Analyse Process sets of data and relate them to other evidence.
You are looking for trends and results that will answer your questions (but watch out for unexpected results that might suggest a new question). .
108
Interpret Think about the results of the analysis and clarify the knowledge and insights you think you have gained. Interrogate the information. It’s important to look at the information critically. Were the data valid and reliable enough to lead you to firm conclusions? Do the results really mean what they seem to mean? How sure are you about the outcome? What aspects of the information lead to possible action?
109
Intervene Design and implement a plan of action designed to change the situation you started with. Be sure that your actions are manageable and look at the resourcing needed. Consider how you’ll know what has been achieved. .
110
Evaluate Using measures you decided in advance, assess how successful the intervention has been. Has the situation that triggered the process been improved? What else happened that you maybe didn’t expect?
111
Reflect Think about what has been learned and discovered – and what practices you will change as a consequence. What did we do that worked? Did this process suggest anything that we need to investigate further? What aspects of the intervention can be maintained? What support will we need? .
112
Terminology Other terms used in Consider the Evidence .
113
Terminology Analysis A detailed examination of data and evidence intended to answer a question or reveal something. This simplistic definition is intended to point out that data analysis is not just about crunching numbers - it’s about looking at data and other evidence in a purposeful way, applying logic, creativity and critical thinking to see if you can find answers to your questions or reveal a need. For example, you can carry out a statistical analysis of national assessment results in the various strands of English across all classes at the same level. You could compare those results with attendance patterns. But you might also think about those results in relation to more subjective evidence - such as how each teacher rates his/her strengths in teaching the various strands. .
114
Terminology Aggregation A number of measures made into one.
This is a common and important concept in dealing with data. A single score for a test that contains more than one question is an aggregation - two or more results have been added to get a single result. Aggregation is useful when you have too few data to create a robust measure or you want to gain an overview of a situation. But aggregation can blur distinctions that could be informative. So you will often want to disaggregate some data – to take data apart to see what you can discover from the component parts. For example, a student may do moderately well across a whole subject, but you need to disaggregate the year’s result to see where her weaknesses lie. .
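A minimal sketch (invented strand results) of the two directions described above: the overall score is the aggregated view, and looking for the weakest strand is the disaggregated view that the single score hides.

```python
# Minimal sketch (invented data): aggregate strand results into one score,
# then disaggregate to see where each student's weakness lies.
results = {
    "Ana": {"reading": 78, "writing": 52, "speaking": 81, "research": 49},
    "Ben": {"reading": 64, "writing": 68, "speaking": 60, "research": 66},
}

for student, strands in results.items():
    overall = sum(strands.values()) / len(strands)   # aggregation: one measure from many
    weakest = min(strands, key=strands.get)          # disaggregation: the detail the overall score hides
    print(f"{student}: overall {overall:.0f}, weakest strand {weakest} ({strands[weakest]})")
```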
115
Terminology Data Known facts or measurements, probably expressed in some systematic or symbolic way (eg as numbers). Data are codified evidence. (The word is used as a plural noun in this kit.) The concepts of validity and reliability apply to data. It helps to know where particular data came from; how data were collected and maybe processed before you received them. Some data (eg attendance figures) will come from a known source that you have control of and feel you understand and can rely on. Other data (eg standardised test results) come from a source you might not really understand; they may be subject to manipulation and predetermined criteria or processes (like standards or scaling). Some data (eg personality profiles) may be presented as if they are sourced in an objective way but their reliability might be variable. .
116
Terminology Demographics
Data relating to characteristics of groups within the school’s population. Data that provides a profile of people at your school. You will have the usual data relating to your students (gender, ethnicity, etc) and your staff (gender, ethnicity, years of experience, etc). Some schools collect other data, such as the residential distribution of students and parental occupations. .
117
Terminology Disaggregation See aggregation
When you disaggregate data, you take aggregated data apart to see what you can discover from the component parts. For example, a student may do moderately well across a whole subject, but you need to disaggregate the year’s result to see where her weaknesses lie. .
118
Terminology Evaluation
Any process of reviewing or making a judgement about a process or situation. In this resource, evaluation is used in two different but related ways. After you have analysed data and taken action to change a situation, you will carry out an evaluation to see how successful you have been - this is summative evaluation. But you are also encouraged to evaluate at every step of the way - when you select data, when you decide on questions, when you consider the results of data analysis, when you decide what actions to take on the basis of the data - this is called formative evaluation. .
119
Terminology Evidence Any facts, circumstances or perceptions that can be used as an input for an analysis or decision. For example, the way classes are compiled, how a timetable is structured, how classes are allocated to teachers, student portfolios of work, student opinions. These are not data, because they are not coded as numbers, but they can be factors in shaping teaching and learning and should be taken into account whenever you analyse data and when you decide on action that could improve student achievement. .
120
Terminology Information
Knowledge gained from analysing data and making meaning from evidence. Information is knowledge (or understanding) that can inform your decisions. How certain you will be about this knowledge depends on a number of factors: where your data came from, how reliable it was, how rigorous your analysis was. So the information you get from analysing data could be a conclusion, a trend, a possibility. .
121
Terminology Inter-subject analysis
A detailed examination of data and evidence gathered from more than one learning area. Inter subject analysis can answer questions or reveal trends about students or teaching practices that are common to more than one learning area. For example, analysing the results of students taking mathematics and physics subjects can indicate the extent to which achievements in physics are aided or impeded by the students’ mathematical skills. .
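A minimal sketch (invented results, hypothetical student codes) of inter-subject analysis: join mathematics and physics results for the same students and look at how strongly they are related.

```python
# Minimal sketch (invented data): inter-subject analysis of mathematics and physics results.
import pandas as pd

maths = pd.DataFrame({"student": ["S1", "S2", "S3", "S4", "S5"],
                      "maths_score": [72, 45, 88, 60, 53]})
physics = pd.DataFrame({"student": ["S1", "S2", "S3", "S4", "S5"],
                        "physics_score": [68, 50, 84, 66, 48]})

combined = maths.merge(physics, on="student")
correlation = combined["maths_score"].corr(combined["physics_score"])
print(f"Correlation between maths and physics scores: {correlation:.2f}")
# A strong positive correlation is a prompt for further questions
# (e.g. is achievement in physics limited by mathematical skills?), not proof.
```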
122
Terminology Intervention
Any action that you take to change a situation, generally following an analysis of data and evidence. This term is useful as it emphasises that to change students’ achievement, you will have to change something about the situation that lies behind achievement or non-achievement. You will take action to interrupt the status quo. .
123
Terminology Intra-subject analysis
A detailed examination of data and other evidence gathered from within a specific learning area. Intra subject analysis can answer questions or reveal trends about student achievement or teaching within a subject or learning area. For example, an analysis of assessment results for all students studying a particular subject in a school can reveal areas of strength and weakness in student achievement and/or in teaching practices, etc. Comparison of a school’s results in a subject with results in that subject in other schools is also intra subject analysis. .
124
Terminology Longitudinal analysis
A detailed examination of data and evidence to reveal trends over time. Longitudinal analysis in education is generally used to reveal patterns in student achievement, behaviour, etc over a number of years. Results can reveal the relative impact of different learning environments, for example. In this resource, it is suggested that longitudinal analysis can be applied to teaching practice and school processes. For example, the impact of modified teaching practices in a subject over a number of years can be evaluated by analysing the achievements of successive cohorts of students. .
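A minimal sketch (invented cohort medians) of longitudinal analysis: track one measure for successive cohorts and look at the year-on-year change, for example around the point where a teaching practice was modified.

```python
# Minimal sketch (invented data): year-on-year change in a cohort measure.
cohort_medians = {        # median subject score for each year's cohort
    2019: 58,
    2020: 61,
    2021: 60,
    2022: 66,             # assumption: modified teaching practice introduced in 2022
    2023: 69,
}

years = sorted(cohort_medians)
for earlier, later in zip(years, years[1:]):
    change = cohort_medians[later] - cohort_medians[earlier]
    print(f"{earlier} -> {later}: change of {change:+d}")
```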