Consider the Evidence: Evidence-driven decision making for secondary schools
A resource to assist schools to review their use of data and other evidence
Evidence-driven decision making
Today we aim to:
think about a process for using data and other evidence to improve teaching, learning and student achievement
improve our understanding, confidence and capability in using data to improve practice
think about our needs and our own evidence-based projects
This session will help us to think about how we can make decisions based on evidence in a structured and informed way. In a moment we'll discuss what is meant by 'data and other evidence'. We can apply today's material to student achievement at all secondary levels - not just senior - and to all curriculum learning areas and school processes. First, though, we need to look at some of the terms being used here so we are all speaking the same language, starting with: what is evidence-driven decision making? This resource does not justify the use of data analysis in schools - the assumption is that teachers and schools already understand the benefits and to some extent already do it. Nor does it provide data processing tools or directions on how to analyse data; these are readily available.
Evidence-driven eating
You need to buy lunch. Before you decide what to buy you consider a number of factors: how much money do you have? what do you feel like eating? what will you be having for dinner? how far do you need to go to buy food? how much time do you have? where are you going to eat it? This is a frivolous, non-educational scenario to start you thinking about how schools use evidence to make decisions. Don't labour it. With many groups (especially experienced teachers) it will be inappropriate to use it at all. This scenario is just to get us started. There's nothing mysterious about evidence-driven decision making. We all make decisions every day based on an analysis of a number of factors. In this scenario you'd analyse the factors and make a decision in seconds (or you'd go hungry). What other factors might you consider before buying lunch? For example: Who are you eating with? How much do you want to spend? What did you have for breakfast? How hungry are you? Are you on a special diet? What else do you need to do this lunchtime? Who do you want to avoid this lunchtime?
Evidence-driven teaching
I had a hunch that Ana wasn't doing as well as she could in her research assignments, a major part of the history course. What made me think this? Ana's general work (especially her writing) was fine. She made perceptive comments in class, contributed well in groups and had good results overall last year, especially in English. How did I decide what to do about it? The aim here is to demonstrate (and acknowledge) that teachers look for and use a variety of evidence as a normal part of effective and reflective teaching. The conclusions the teacher reaches are not as important here as the investigative approach he uses. Teachers continually consider what they know about students. This story told by Ana's history teacher is typical. It's not a story about a formal investigation. It's just the sort of thing good teachers do all the time. It might have taken place over just a week or two and taken very little of the teacher's time. This teacher had a 'hunch' based on his general professional observations. He informally compared a range of evidence to see if his hunch was correct. It was. He wanted to find a way to improve this one aspect of Ana's achievement. He considered other evidence and analysed it. This enabled him to pinpoint the problem and plan a course of action designed to improve Ana's achievement. This teacher was thinking about the data and other evidence he had right there in front of him - and then he acted on his conclusions. The teacher used evidence-driven decision making, using data and other evidence to inform his actions. In this session we want to see how to expand (and systematise) that sort of thinking to drive improvement across the school.
Evidence-driven teaching (cont…)
I looked more closely at her other work. I watched her working in the library one day to see if it was her reading, her use of resources, her note taking, her planning, or what. At morning tea I asked one of Ana's other teachers about Ana's approach to similar tasks. I asked Ana if she knew why her research results weren't as good as her other results, and what her plans were for the next assignment. I thought about all of this and planned a course of action. I gave her help with using indexes, searching, note taking and planning and linking the various stages of her research.
What is meant by ‘data and other evidence’?
Consider the Evidence - a resource to assist schools to review their use of data and other evidence. This is an introductory slide - the next two slides discuss and define 'evidence' and 'data'. This resource uses terms commonly used in schools - the usage and meaning intended is that generally applied in educational circles. A full discussion of terminology used in this resource is in the Appendix. The resource we are using is called 'Consider the Evidence' - it deals with 'evidence-driven decision making'. So we need to think about what constitutes 'evidence' in schools. It's useful to think in terms of 'data and other evidence'.
Evidence: any facts, circumstances or perceptions that can be used as an input for an analysis or decision - how classes are compiled, how classes are allocated to teachers, test results, teachers' observations, attendance data, portfolios of work, student opinions… Data are one form of evidence. 'Evidence' is used here in the same way that it's used in courts of law and in standards-based assessment. Like all schools, we have access to a lot of 'data' about student achievement and student behaviour - test results, attendance patterns, etc. But we have access to a lot more information than what is normally thought of as 'data'. In this session we want to be aware of all the 'evidence' we have access to. Some of this evidence is 'data' - but some (like student opinions, teachers' observations) can't be easily processed in the way we process 'data' - so it's best called 'evidence'. If participants have concerns about the use of jargon, here's a way to discuss the issue: whenever people come to grips with new ideas, they might have to learn new terms or give special meaning to existing words. This happened with curriculum and assessment developments - but most teachers (and parents) are now familiar with terms and concepts like strands, levels and credits. The language of computing is another good example.
Data: known facts or measurements, probably expressed in some systematic or symbolic way (e.g. as numbers) - assessment results, gender, attendance, ethnicity… Data are one form of evidence. This resource treats the word 'data' as a plural noun - hence 'data are …'. There's nothing new here - but for some of you, this will be quite a narrow definition of data. We could have a discussion about what constitutes data - for example, do all data have to be coded in some way (e.g. as a number)? But I suggest we accept this distinction for the purposes of this session. The main point is: if we want to improve student achievement, we can look at a lot more than what we traditionally think of as data.
Evidence-driven decision making
We have more evidence about what students know and can do than ever before - their achievements, behaviours, and the environmental factors that influence learning. We should draw on all our knowledge about the learning environment to improve student achievement, explore what lies behind patterns of achievement, and decide what changes will make a difference. This is a general introduction to evidence-driven decision making. It may not be necessary for groups that are already committed to this sort of approach. But groups intent on change should pause to consider how to minimise effort and risk. We all know that we can make significant improvements to teaching and learning by analysing data and applying what we learn from it. But an issue for many schools is that they have too much data in some areas - too much evidence - and not enough time and resources to use it all effectively. In other areas, we have too little evidence. Any change in teaching practice is a risk - you can never be entirely sure of the consequences. The approach we are looking at today shows us how to decide what we should change to improve student achievement. Evidence-driven decision making can help by reducing or assessing risk, and maybe by pointing out changes that carry less risk.
What evidence does a school have?
Demographics; Student achievement; Perceptions; School processes; Other practice. This is an introductory slide - the next five slides deal with each category in turn. All schools have data about student achievement. To make the most of these data, we need to be aware of many other factors - evidence that describes our students' wider learning environment. We have so much evidence that it's useful to categorise it in some way. The approach we are taking today separates all data and other evidence into these five categories. If you have already done an exercise that uses data, you could use these categories to discuss what sort of evidence you used in the exercise - and what other evidence could be used to extend the example.
Demographics: What data do we have now to provide a profile of our school? What other data could we create? Consider the school; students; staff; parents/caregivers and community. Demographics - also known as profile data - are objective data that describe our school and its students, staff and community - decile, gender, suspensions, etc. You should make the point that data and other evidence should be generated only if it's for a purpose. The Consider the Evidence web pages and the full presentation provide examples for each bulleted point.
Student achievement: What evidence do we have now about student achievement? What other evidence could we collect? National assessment results; standardised assessment results administered internally; other in-school assessments; student work. Student achievement data and other evidence - much of this is readily available - from national assessments, standardised testing we carry out in the school, portfolios of student work, etc. The Consider the Evidence web pages and the full presentation provide examples for each bulleted point.
Perceptions: What evidence do we have now about what students, staff and others think about the school? Are there other potential sources? Self appraisal; formal and informal observations made by teachers; structured interactions; externally generated reports; student voice; other informal sources. In many schools there will be little of this sort of evidence, so you might spend more time on this. Perceptions - evidence of what staff, students and others think about the school - probably the most subjective evidence, but much of it will be factual and collected in formal ways - student self appraisal, formal and informal observations made by teachers, etc. The Consider the Evidence web pages and the full presentation provide examples for each bulleted point.
School processes: What evidence do we have about how our school is organised and operates? Timetable; classes; resources; finance; staffing. Some teachers may not think of school processes as evidence that can be used in decision making - you might need to skip forward to later slides that provide examples. School processes - how our school is organised and operates - the timetable, resources, etc. The Consider the Evidence web pages and the full presentation provide examples for each bulleted point.
Other practice: How can we find out what has worked in other schools? Documented research; experiences of other schools. Other practice - we should look at the experiences of others - documented academic research, the experiences of other schools, etc. We can access a lot of this material from Te Kete Ipurangi, the TKI website. The Consider the Evidence web pages and the full presentation provide examples for each bulleted point.
The evidence-driven decision making cycle
Trigger - Explore - Question - Assemble - Analyse - Interpret - Intervene - Evaluate - Reflect
This slide introduces the stages of the evidence-driven decision making cycle. The next slide lists the headings again and gives simple explanations. The following two slides show these stages in cyclic diagrams. There is a slide for each heading in the Appendix section of this presentation, with the full explanation as in the script below. You might prefer to print handout copies of the text below or the cycle diagram in a later slide. It's useful to think of an evidence-driven decision-making cycle as having sequential stages:
Trigger: Data, ideas, hunches, etc set a process in action. The trigger is whatever it is that makes you think there could be an opportunity to improve student achievement. You can routinely scan available data looking for inconsistencies, etc. It can be useful to speculate about possible causes or effects - and then explore data and other evidence to see if there are any grounds for the speculation.
Explore: Initial data, ideas or hunches usually need some preliminary exploration to pinpoint the issue and suggest good questions to ask.
Question: This is the key point: what question/s do you want answered? Questions can raise an issue and/or propose a possible solution.
Assemble: Get together all the data and evidence you might need - some will already exist and some will have to be generated for the occasion.
Analyse: Process sets of data and relate them to other evidence. You are looking for trends and results that will answer your questions (but watch out for unexpected results that might suggest a new question).
Interpret: Think about the results of the analysis and clarify the knowledge and insights you think you have gained. Interrogate the information; it's important to look at it critically. Were the data valid and reliable enough to lead you to firm conclusions? Do the results really mean what they seem to mean? How sure are you about the outcome? What aspects of the information lead to possible action?
Intervene: Design and implement a plan of action designed to change the situation you started with. Be sure that your actions are manageable and look at the resourcing needed. Consider how you'll know what has been achieved.
Evaluate: Using measures you decided in advance, assess how successful the intervention has been. Has the situation that triggered the process been improved? What else happened that you maybe didn't expect?
Reflect: Think about what has been learned and discovered. What did we do that worked? Did this process suggest anything that we need to investigate further? What aspects of the intervention can be maintained? What changes will we make to our practices? What support will we need?
The evidence-driven decision making cycle
Trigger - Clues found in data, hunches
Explore - Is there really an issue?
Question - What do you want to know?
Assemble - Get all useful evidence together
Analyse - Process data and other evidence
Interpret - What information do you have?
Intervene - Design and carry out action
Evaluate - What was the impact?
Reflect - What will we change?
This slide repeats the stages in the cycle and gives simple explanations. The next two slides show these stages in cyclic diagrams. You might prefer to print handout copies of this text or the cycle diagram in a later slide. It's useful to think of the cycle as having the sequential stages explained with the previous slide.
The evidence-driven decision making cycle
Trigger - Data indicate a possible issue that could impact on student achievement. Speculate - A teacher has a hunch about a problem or a possible action. Explore - Check data and evidence to explore the issue. Question - Clarify the issue and ask a question. Assemble - Decide what data and evidence might be useful. Analyse - the data and evidence. Interpret - Insights that answer your question. Intervene - Plan an action aimed at improving student achievement. Act - Carry out the intervention. Evaluate - the impact of the intervention. Reflect - on what has been learned, how practice will change. The length of the cycle will vary for different situations. We might wait a year to evaluate the effects of our actions - but sometimes we'll be able to (and ought to) work to shorter (or maybe longer) cycles. It's important that we reflect, evaluate and make professional judgements at each stage of this cycle. Now we will invent scenarios that might apply in our school (or department). Do this in groups. Draw up your scenario as a cycle as in this slide. The next slide provides a blank template for this exercise. You might like to photocopy this template for groups to fill in. A sample scenario is given in the following slide.
The evidence-driven decision making cycle
TRIGGER - SPECULATE - EXPLORE - QUESTION - ASSEMBLE - ANALYSE - INTERPRET - INTERVENE - ACT - EVALUATE - REFLECT. This is a blank template for the exercise suggested with the previous slide. You might like to photocopy this template for groups to fill in. A sample scenario is given in the following slide.
The evidence-driven decision making cycle
Trigger - Some of our students are poor at writing. A teacher has a hunch - poor writers might spend little time on homework. Explore data - Survey of students shows that this is only partially true. Question - What are the characteristics of students who are poor at writing? Assemble more data and other evidence - asTTle reading, homework, extracurricular activities, attendance, etc. Analyse - NQF/NCEA results by standard; analyse non-NQF/NCEA data and evidence. Interpret information - Poor writers likely to play sport, speak well, read less, do little homework. Intervene - Create multiple opportunities for writing; include topics that can use sport as context; connect speaking and writing; PD for staff. Evaluate - Has writing improved? Reflect - How will we teach writing in the future? A sample scenario for the activity in the previous slide. You might prefer to print handout copies.
Evaluate and reflect. Summative evaluation - assess how successful the intervention was; decide how our practice will change; report to the board. Formative evaluation - at every stage in the cycle we reflect and evaluate: Are we on the right track? Do we need to fine-tune? Do we actually need to complete this? This is a reminder that could be used at many points in the presentation. It could be useful to flick back to the previous slide when you are discussing reflection at various stages of the process. We should pause here and think about evaluation and reflection. The final stage in this cycle is summative evaluation - we assess how successful the whole process was and reflect on whether we will change our future practice. But at every stage in this cycle we should be reflecting, evaluating in a formative way and making professional judgements about where to go next. We need to be sure that we are on the right track. Should we fine-tune the process as we go? Many schools are consciously developing a 'culture of inquiry' - an open and supportive environment in which staff and the school community regularly reflect on the way the school operates, one in which calculated risk-taking is seen as an essential ingredient of innovation. A cyclical improvement process is iterative - incremental changes are incorporated into the knowledge base and into professional practice and feed into the next cycle. There is a compounding effect - change becomes the trigger for more questions. This resource can be used as a contribution to that approach.
Types of analysis. We can compare achievement data by subject or across subjects for an individual student, groups of students, or whole cohorts. The type of analysis we use depends on the question we want to answer. This slide simply lists the common types of analysis used in schools. The Consider the Evidence web pages and the full presentation provide examples for each bulleted point. Teachers will be familiar with these three types of data analysis - we introduce them through discussing questions because asking the right questions is a major theme of this resource. The way we analyse data depends on the question we are trying to answer… let's look at some examples.
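The resource deliberately leaves the choice of analysis tools open. Purely as an illustration, here is a minimal sketch in Python (using the pandas library) of what the three types of comparison might look like; the student names, subjects and scores are invented for the example.

```python
import pandas as pd

# Invented achievement data: one row per student per subject assessment.
results = pd.DataFrame({
    "student": ["Ana", "Ana", "Ben", "Ben", "Cara", "Cara"],
    "subject": ["History", "English", "History", "English", "History", "English"],
    "score":   [62, 78, 55, 58, 81, 84],
})

# An individual student: compare one student's results across subjects.
print(results[results["student"] == "Ana"])

# A group of students: mean score per subject for a chosen group.
group = results[results["student"].isin(["Ana", "Ben"])]
print(group.groupby("subject")["score"].mean())

# A whole cohort: summary statistics per subject across all students.
print(results.groupby("subject")["score"].describe())
```

The same pattern extends to comparisons across years or against national results once those data are assembled in a comparable form.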
Formative or summative?
Trigger questions How good/poor is …? What aspects of … are good/poor? Is … actually changing? How is … changing? Is … better than last year? How can … be improved? Why is … good/poor? What targets are reasonable for …? What factors influence the situation for …? What would happen if we …? Formative or summative? You could use this slide without exploring the summative / formative issue. Questions that trigger the process can relate to student achievement or behaviour, teaching approaches and school processes - like the ones on this slide. Questions can be described as summative or formative. Let’s think about the questions on this slide - which of them are summative and which are formative. You could have a discussion about the purpose of summative and formative questions – when is each type of question useful? Summative questions give us end-of-process results, often suitable for reporting and accountability. Formative questions are intended to provide more immediate feedback to improve teaching and learning, so they are probably more specific.
Questions about policy
We have been running 60-minute periods for 5 years now. What effect has the change had? Many of our questions will be aimed at evaluating a particular policy or procedure, especially if there has been a change.
Questions from hunches
I suspect this poor performance is being caused by … Is this true? We reckon results will improve if we put more effort into ... Is this likely? I think we’d get better results from this module if we added … Is there any evidence to support this idea? The colloquial term hunch is used here to recognise how intuitive teachers can be. The aim is not to belittle hunches. They are extremely useful. In fact most hunches are based on sound professional experience and observation. But until they have been tested against evidence, they remain hunches. In terms of improving particular aspects of teaching and learning, many of the most pertinent questions come from our ‘hunches’. Some of our hunches will be based on a hypothesis or a speculation about a possible change. We’re using the casual term hunch here, but this does not belittle hunches. Teachers, like detectives, base hunches on professional observations – it’s just that you haven’t yet tested them against some evidence.
Questions with purpose
What do we know about reported bullying incidents for year 10 students? This may be better asked as: Who has been bullying whom? Where? What are students telling us? What do pastoral care data tell us? Were some interventions more effective with some groups of students than others? Once we get into the process, before we start assembling the evidence we plan to analyse, we need to be sure that we are asking good questions. The initial question on this slide can be answered quite easily, but what use will the answer be? More purposeful questions are likely to lead to information we can act on.
Professional decision making
We have evidence-based information that we see as reliable and valid. What do we do about it? If the information indicates a need for action, we use our collective experience to make a professional decision. The aim here is to point out that the analysis will not always point to an obvious intervention. Teachers still need to make professional decisions. Two examples are provided in later slides. Let's assume that we have carried out the analysis and we have some information we might act on. What we decide to do as a result of the information we get from the analysis is guided by our professional experience.
Professionals making decisions
Do any particular groups of year 11 students attend less regularly than average for the whole cohort? The analysis identified two groups – so I need to think about how to deal with irregular attendance for each group. How will I do that? An example: The question we asked that led to the analysis was simple enough - and the resulting information identifies where the action should be applied. Exactly what we do is the professional decision.
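As a hedged sketch of how the attendance question above might be explored (any spreadsheet or student management system export would do; the groups, rates and the 5-percentage-point threshold here are invented):

```python
import pandas as pd

# Invented year 11 attendance data: one attendance rate per student,
# with a grouping column (e.g. form class or programme).
attendance = pd.DataFrame({
    "student": ["S1", "S2", "S3", "S4", "S5", "S6"],
    "group":   ["11A", "11A", "11B", "11B", "11C", "11C"],
    "attendance_rate": [0.95, 0.92, 0.78, 0.74, 0.91, 0.88],
})

cohort_mean = attendance["attendance_rate"].mean()
by_group = attendance.groupby("group")["attendance_rate"].mean()

# Flag groups attending noticeably less than the cohort average.
flagged = by_group[by_group < cohort_mean - 0.05]
print(f"Cohort average: {cohort_mean:.2%}")
print("Groups below average:", flagged.to_dict())
```

The analysis only identifies where attention is needed; exactly what to do about each flagged group remains the professional decision.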
Professionals making decisions
You asked what factors are related to poor student performance in formal writing. The analysis suggested that poor homework habits have a significant impact on student writing. You make some professional judgements and decide: Students who do little homework don't write enough. You could take action to improve homework habits - but you've tried that before and the success rate is low. You have more control over other factors - like how much time you give students to write in class. So you conclude - the real need is to get students to write more often. Another example: teachers decided the problem behind poor performance in writing was not homework habits but the total amount of writing students do. Professional judgements lead to the conclusion that the action required is to ensure that students do more writing.
Deciding on an action. Information will often suggest a number of options for action. How do we decide which action to choose? We need to consider: what control we have over the action; the likely impact of the action; the resources needed. You might like to return to the slide that shows the full improvement cycle to remind the group which stage we are at. We have analysed some evidence and decided what sort of intervention might improve the initial situation. Remember, what we are going to do is to try out a change, then evaluate it to see if it worked. But first we need to plan the intervention well. These are the factors we need to consider when we are planning an intervention: Control - what aspects of the situation do we have most control over? Do we run a limited pilot rather than a full-scale intervention? Impact - what can we do that is most likely to have the desired impact? Do we play it safe and intervene only where we know we can make a major difference? Resources - time, money, people - what will we need? What do we have? What other activities could be affected if we divert resources to this project?
Planning for evaluation
What evidence do we need to collect before we start? Do we need to collect evidence along the way, or just at the end? How can we be sure that any assessment at the end of the process will be comparable with assessment at the outset? How will we monitor any unintended effects? Don’t forget evidence such as timetables, student opinions, teacher observations … At the end of the intervention, we will need to evaluate its effect. Did it work? Some questions to consider as we think about the data we will need to evaluate the impact of our intervention.
Evaluate the impact of our action
Did the intervention improve the situation that triggered the process? If the aim was to improve student achievement, did that happen? The final stages of the process – we have carried out the intervention and evaluated its impact against baseline data. Now we have to decide how effective the intervention has been. These are the central questions.
Evaluate the impact of our action
Was any change in student achievement significant? What else happened that we didn’t expect? How do our results compare with other similar studies we can find? Does the result give us the confidence to make the change permanent? When we evaluate the impact of our intervention, we need to ask the same sort of questions that we asked earlier in the process. That is, we need to interrogate our evaluation. We must not leap to conclusions. The final question in this slide is the crucial one – has the intervention been effective enough to justify embedding the change in normal practice?
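The resource does not prescribe a statistical method for judging whether a change was significant. As one illustrative sketch only (the scores below are invented, and a paired t-test is just one common choice for matched pre/post data):

```python
from scipy import stats

# Invented pre- and post-intervention writing scores for the same students.
pre  = [41, 55, 48, 62, 39, 50, 58, 44]
post = [47, 58, 52, 66, 45, 51, 63, 49]

# A paired t-test asks whether the average change is larger than
# chance variation alone would plausibly explain.
t_stat, p_value = stats.ttest_rel(post, pre)
print(f"t = {t_stat:.2f}, p = {p_value:.3f}")
```

A small p-value suggests the change is unlikely to be chance alone, but statistical significance is not the same as educational significance: the size of the gain and its practical value still need professional judgement.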
Future practice What aspects of the intervention will we build into future practice? What aspects of the intervention will have the greatest impact? What aspects of the intervention can we maintain over time? What changes can we build into the way we do things in our school? Would there be any side-effects? Even if things didn’t go exactly as we planned them – even if student achievement wasn’t greatly improved - there are probably some things we have learnt that we should incorporate into future practice. We need to be realistic about what we can achieve – we need to be sure we could maintain the intervention. We should also think about any side-effects – if we put time and effort into this change, will anything else suffer?
Future directions. What professional learning is needed? Who would most benefit from it? Do we have the expertise we need in-house or do we need external help? What other resources do we need? What disadvantages could there be? When will we evaluate this change again? Before we decide to go ahead and embed aspects of the intervention in our practice, we need to look at all the ramifications - and think about starting the cycle all over again…
Evidence-driven strategic planning
If we use evidence-driven decision making to improve student achievement and enhance teaching practice … … it follows that strategic planning across the school should also be evidence-driven. Finally, a word about strategic planning. If we use evidence-driven decision making to improve student achievement and enhance teaching practice, it follows that the school’s strategic planning should also be evidence-driven. An example follows on the next slide.
Evidence-driven strategic planning
INDICATORS FROM DATA: asTTle scores show a high proportion of Yr 9 achieving below curriculum level; NCEA results show high non-achievement in transactional writing; poor results in other language NCEA standards; etc.
STRATEGIC GOAL: To raise the levels of writing across the school.
STRATEGIC ACTION: Develop a writing development plan which addresses writing across subjects and levels, including targets, professional development and other resourcing needs, etc.
ANNUAL PLAN: Develop and implement a plan to raise levels of writing at year 9. The development plan is to be based on an analysis of all available data and to include a range of shared strategies.
YEAR TARGET: Raise writing asTTle results for Yr 9 boys from 3B to 3A.
ALSO IN THE DIAGRAM: Appraisal; PD; self review; school charter.
EVALUATION DATA: asTTle writing results improve by …; perception data from year 9 staff indicate …; evaluation of effectiveness of the range of shared strategies, barriers and enablers …
If you choose to use this chart, you might like to print paper copies for distribution. The diagram indicates one way to think of evidence-driven strategic planning. In this model, data from a range of sources provide 'indicators' of a problem - one aspect of student achievement (in this case, writing) stands out across the school as one that could be improved. This leads the Board to establish a strategic goal. They then add an appropriate aim and a target (with measurable outcomes) to the school's annual plan. School leaders would then create a development plan. To do this they will need to go back to the indicator data and analyse these data alongside other data and evidence. Then the development plan is implemented. At the end of the year other data and evidence are analysed to evaluate the success of the development plan as a whole and the various strategies that were used. The data used for evaluation will probably be different from those used to identify the problem and develop the action plan. (In this case, for example, the current NCEA results were not relevant for Year 9 students, and other data were collected to evaluate some of the actions taken.)
How can we apply this model in our school?
What now? How can we apply this model in our school? End of presentation. The next step is for the group to discuss how this model can be applied in this school, department or faculty. This should include a discussion about what evidence already exists in the school, how this is collected and recorded, and how well equipped the school is to analyse and use it in the interests of improving student achievement. Finally, the group should consider what evidence-driven projects the school could undertake.