Presentation on theme: "Consider the Evidence What is meant by ‘data and other evidence’?" — Presentation transcript:

1 Consider the Evidence What is meant by ‘data and other evidence’?
A resource to assist schools to review their use of data and other evidence

2 Evidence Any facts, circumstances or perceptions that can be used as an input for an analysis or decision – how classes are compiled, how classes are allocated to teachers, test results, teachers’ observations, attendance data, portfolios of work, student opinions … Data are one form of evidence. ‘Evidence’ is used here in the same way that it’s used in courts of law and in standards-based assessment. Like all schools, we have access to a lot of ‘data’ about student achievement and student behaviour – test results, attendance patterns, etc. But we have access to a lot more information than what is normally thought of as ‘data’. In this session we want to be aware of all the ‘evidence’ we have access to. Some of this evidence is ‘data’ - but some (like student opinions, teachers’ observations) can’t be easily processed in the way we process ‘data’ – so it’s best called ‘evidence’. If participants have concerns about the use of jargon, here’s a way to discuss the issue: Whenever people come to grips with new ideas, they might have to learn new terms or give special meaning to existing words. This happened with curriculum and assessment developments – but most teachers (and parents) are now familiar with terms and concepts like strands, levels and credits. The language of computing is another good example.

3 Data Known facts or measurements, probably expressed in some systematic or symbolic way (e.g. as numbers) – assessment results, gender, attendance, ethnicity … Data are one form of evidence. This resource treats the word ‘data’ as a plural noun – hence ‘data are …’. There’s nothing new here - but for some of you, this will be quite a narrow definition of data. We could have a discussion about what constitutes data – for example, do all data have to be coded in some way (e.g. as a number)? I suggest we accept this distinction for the purposes of this session. The main point is: If we want to improve student achievement, we can look at a lot more than what we traditionally think of as data.

4 What evidence does a school have?
Demographics Student achievement Perceptions School processes Other practice This is an introductory slide – the next five slides deal with each category in turn. All schools have data about student achievement. To make the most of these data, we need to be aware of many other factors - evidence that describes our students’ wider learning environment. We have so much evidence that it’s useful to categorise it in some way. The approach we are taking today separates all data and other evidence into these five categories.

5 Demographics Data that provide a profile of our school
School - decile, roll size, urban/rural, single sex or co-educational, teaching spaces … Students - ethnicity, gender, age, year level, attendance, lateness, suspension and other disciplinary data, previous school, part-time employment … Staff - gender, age, years of experience, qualifications, teaching areas, involvement in national curriculum and assessment, turnover rate … Parents/caregivers and community - socio-economic factors, breadth of school catchment, occupations … Demographics – also known as Profile data – are objective data that describe our school and its students, staff and community – decile, gender, suspensions, etc. All of this influences how we teach and how students learn.

6 Student achievement Evidence about student achievement
National assessment results - NCEA, NZ Scholarship - details like credits above and below year levels, breadth of subjects entered … Standardised assessment results administered internally - PAT, asTTle … Other in-school assessments - most non-standardised but some, especially within departments, will be consistent across classes - includes data from previous schools, primary/intermediate Student work - work completion rates, internal assessment completion patterns, exercise books, notes, drafts of material - these can provide useful supplementary evidence Student Achievement data and other evidence - much of this is readily available – from national assessments, standardised testing we carry out in the school, portfolios of student work, etc.

7 Perceptions Evidence about what students, staff, parents and the community think about the school Self appraisal - student perceptions of their own abilities, potential, achievements, attitudes … Formal and informal observations made by teachers - peer interactions, behaviour, attitudes, engagement, student-teacher relationships, learning styles, classroom dynamics … Structured interactions - records from student interviews, parent interviews, staff conferences on students … Externally generated reports - from ERO and NZQA (these contain data but also perceptions) … Student voice - student surveys, student council submissions … Other informal sources – views about the school environment, staff and student morale, Board perceptions, conversations among teachers … In many schools there will be little of this sort of evidence, so you might spend more time on this. Perceptions - evidence of what staff, students and others think about the school. This can be the most subjective evidence but much of it will be factual and collected in formal ways - student self appraisal, formal and informal observations made by teachers, etc.

8 School processes Evidence about how our school is organised and operates School processes - evidence and data about how your school is organised and operates, including: Timetable – structure, period length, placement of breaks, subjects offered, student choices, tertiary and workforce factors, etc Classes - how they are compiled, their characteristics, effect of timetable choices, etc Resources - access to libraries, text books, ICT, special equipment, etc Finance - how the school budget is allocated, how funds are used within departments, expenditure on professional development Staffing - policies and procedures for employing staff, allocating responsibility, special roles, workload, subjects and classes Some teachers may not think of School Processes as evidence that can be used in decision making – you might need to skip forward to later slides that provide examples. School processes - how our school is organised and operates – the timetable, resources, etc. All of this influences how we teach and how students learn.

9 Other practice How can we find out what has worked in other schools? Documented research – university and other publications, Ministry of Education’s Best Evidence Syntheses, NZCER, NZARE, overseas equivalents … Experiences of other schools – informal contacts, local clusters, advisory services, TKI LeadSpace … Other Practice – we should look at the experiences of others - documented academic research, the experiences of other schools, etc. It’s important that a school’s evidence-driven decision making benefits from the results of research and the experiences of other schools.

10 Evidence-driven strategic planning
INDICATORS FROM DATA – asTTle scores show a high proportion of Yr 9 achieving below curriculum level; NCEA results show high non-achievement in transactional writing; poor results in other language NCEA standards; etc.
STRATEGIC GOAL – To raise the levels of writing across the school.
STRATEGIC ACTION – Develop a writing development plan which addresses writing across subjects and levels, including targets, professional development and other resourcing needs.
ANNUAL PLAN – Develop and implement a plan to raise levels of writing at Year 9. The development plan to be based on an analysis of all available data and to include a range of shared strategies.
YEAR TARGET – Raise writing asTTle results for Yr 9 boys from 3B to 3A.
EVALUATION DATA – asTTle writing results improve by …; perception data from Yr 9 staff indicates …; evaluation of effectiveness of the range of shared strategies, barriers and enablers …
(Other elements shown in the chart: appraisal, PD, self review, school charter.)
If you choose to use this chart, you might like to print paper copies for distribution. Think carefully about how you deal with this slide – this could be overkill if the intent of your session is to see how you can change teaching practice to improve student learning. If we use evidence-driven decision making to improve student achievement and enhance teaching practice, it follows that the school’s strategic planning should also be evidence-driven. The diagram indicates one way to think of evidence-driven strategic planning. In this model, data from a range of sources provide ‘indicators’ of a problem - one aspect of student achievement (in this case, writing) stands out across the school as one that could be improved. This leads the Board to establish a strategic goal. They then add an appropriate aim and a target (with measurable outcomes) to the school’s annual plan. School leaders would then create a development plan. To do this they will need to go back to the indicator data and analyse these data alongside other data and evidence. Then the development plan is implemented. At the end of the year other data and evidence are analysed to evaluate the success of the development plan as a whole and the various strategies that were used. The data used for evaluation will probably be different from those used to identify the problem and develop the action plan. (In this case, for example, the current NCEA results were not relevant for Year 9 students, and other data were collected to evaluate some of the actions taken.)

11 The evidence-driven decision making cycle
Trigger – Clues found in data, hunches. Explore – Is there really an issue? Question – What do you want to know? Assemble – Get all useful evidence together. Analyse – Process data and other evidence. Interpret – What information do you have? Intervene – Design and carry out action. Evaluate – What was the impact? Reflect – What will we change?
This slide lists the stages in the cycle and gives simple explanations. The next two slides show these stages in cyclic diagrams. You might prefer to print handout copies of the text below or the cycle diagram in a later slide. It’s useful to think of the cycle as having these sequential stages:
Trigger – Data, ideas, hunches, etc set a process in action. The trigger is whatever it is that makes you think there could be an opportunity to improve student achievement. You can routinely scan available data looking for inconsistencies, etc. It can be useful to speculate about possible causes or effects - and then explore data and other evidence to see if there are any grounds for the speculation.
Explore – Initial data, ideas or hunches usually need some preliminary exploration to pinpoint the issue and suggest good questions to ask.
Question – This is the key point: what question/s do you want answered? Questions can raise an issue and/or propose a possible solution.
Assemble – Get together all the data and evidence you might need - some will already exist and some will have to be generated for the occasion.
Analyse – Process sets of data and relate them to other evidence. You are looking for trends and results that will answer your questions (but watch out for unexpected results that might suggest a new question).
Interpret – Think about the results of the analysis and clarify the knowledge and insights you think you have gained. Interrogate the information. It’s important to look at the information critically. Were the data valid and reliable enough to lead you to firm conclusions? Do the results really mean what they seem to mean? How sure are you about the outcome? What aspects of the information lead to possible action?
Intervene – Design and implement a plan of action designed to change the situation you started with. Be sure that your actions are manageable and look at the resourcing needed. Consider how you’ll know what has been achieved.
Evaluate – Using measures you decided in advance, assess how successful the intervention has been. Has the situation that triggered the process been improved? What else happened that you maybe didn’t expect?
Reflect – Think about what has been learned and discovered. What did we do that worked? Did this process suggest anything that we need to investigate further? What aspects of the intervention can be maintained? What changes will we make to our practices? What support will we need?

12 The evidence-driven decision making cycle
TRIGGER – Data indicate a possible issue that could impact on student achievement. SPECULATE – A teacher has a hunch about a problem or a possible action. EXPLORE – Check data and evidence to explore the issue. QUESTION – Clarify the issue and ask a question. ASSEMBLE – Decide what data and evidence might be useful. ANALYSE data and evidence. INTERPRET – Insights that answer your question. INTERVENE – Plan action to improve student achievement. ACT – Carry out the intervention. EVALUATE the impact of the intervention. REFLECT on what has been learned, how practice will change. Here are those nine stages in cyclic form. The length of the cycle will vary for different situations. We might wait a year to evaluate the effects of our actions - but sometimes we’ll be able to (and ought to) work to shorter (or maybe longer) cycles.

13 The evidence-driven decision making cycle
Trigger – Some of our students are poor at writing. A teacher has a hunch - poor writers might spend little time on homework. Explore data – Survey of students shows that this is only partially true. Question – What are the characteristics of students who are poor at writing? Assemble more data & other evidence – asTTle reading, homework, extracurricular, attendance, etc. Analyse – NQF/NCEA results by standard; non NQF/NCEA data and evidence. Interpret information – Poor writers likely to play sport, speak well, read less, do little HW. Intervene – Create multiple opportunities for writing; include topics that can use sport as context; connect speaking and writing. PD for staff. Evaluate – Has writing improved? Reflect – How will we teach writing in the future? A sample scenario for the activity in the previous slide. You might prefer to print handout copies. Or you could leave it out altogether.

14 Types of analysis We can compare achievement data by subject or across subjects – for an individual student, for groups of students, or for whole cohorts. The type of analysis we use depends on the question we want to answer. The next three slides explain three of the most common types of analysis used in schools: inter-subject analysis, intra-subject analysis and longitudinal analysis. The aim here is to make a start on thinking about what you can do with data. It’s not a theory lesson. Teachers will be familiar with these three types of data analysis – the next three slides introduce them through discussing questions because asking the right questions is a major theme of this resource. You might choose to omit the next three slides. These examples use only student achievement data in order to highlight the different approaches. The way we analyse data depends on the question we are trying to answer … let’s look at three examples.

15 Inter-subject analysis
Have my students not achieved a particular history standard because they have poor formal writing skills, rather than poor history knowledge? We can explore this question by cross-referencing the history results of individual students with their formal writing results in English. If the trend is for your students to do less well in formal writing than in other aspects of English and/or other aspects of history, the answer could be Yes.
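A minimal sketch of this kind of cross-referencing, for a school whose results can be exported as spreadsheets or CSV files. The file names, column names and result codes below (student_id, history_result, writing_result, 'N' for not achieved) are hypothetical placeholders, not part of this resource.

```python
import pandas as pd

# Hypothetical exports: one row per student for each standard of interest.
history = pd.read_csv("history_results.csv")   # columns: student_id, history_result
writing = pd.read_csv("writing_results.csv")   # columns: student_id, writing_result

# Cross-reference the two results for each individual student.
merged = history.merge(writing, on="student_id", how="inner")

# Of the students who did not achieve the history standard ('N'),
# what proportion also did not achieve the formal writing standard?
poor_history = merged[merged["history_result"] == "N"]
overlap = (poor_history["writing_result"] == "N").mean()
print(f"{overlap:.0%} of students who missed the history standard also missed formal writing")
```

If most of the students who missed the history standard also struggled in formal writing, that supports the ‘poor writing, not poor history knowledge’ explanation.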

16 Intra-subject analysis
What are the areas of strength and weakness in my own teaching of this class? We could compare the externally assessed NCEA results of our students with results from appropriately matched schools using published national data. Where these differences are greater or less than average, areas of strength/weakness may exist. For example, we might find that, on average, your students have gained credits at a rate 5 percentage points better than the comparison schools. But in one standard the difference is 15 points, indicating a possible area of strength. In another standard, there is zero difference, indicating a possible area of weakness. This may be a good time to discuss NCEA data. It has been described as ‘both rich and subtle’. Your school’s NCEA data give you access to a huge amount of fine-grained information. You can also aggregate NCEA data to show trends, etc – but you need to be careful that in aggregating the data you don’t lose the subtlety or even produce misleading information.
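The arithmetic in that example can be sketched as follows. The standard names, rates and the 5-point margin are invented purely for illustration; in practice the comparison rates would come from published national data for appropriately matched schools.

```python
# Hypothetical percentage of entries gaining credits, per standard,
# for this class and for matched comparison schools.
class_rate = {"Standard A": 78.0, "Standard B": 68.0, "Standard C": 63.0}
comparison_rate = {"Standard A": 63.0, "Standard B": 63.0, "Standard C": 63.0}

differences = {s: class_rate[s] - comparison_rate[s] for s in class_rate}
average_diff = sum(differences.values()) / len(differences)

for standard, diff in differences.items():
    gap = diff - average_diff    # how far this standard sits from the class's overall pattern
    if gap > 5:                  # 5-point margin chosen arbitrarily for the sketch
        note = "possible area of strength"
    elif gap < -5:
        note = "possible area of weakness"
    else:
        note = "close to the class's overall pattern"
    print(f"{standard}: {diff:+.0f} points vs comparison schools ({note})")
```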

17 Longitudinal analysis
Are we producing better results over time in year 11 biology? We can compare NCEA Biology results for successive year 11 cohorts at this school with the national cohorts for successive years. But the results for the national cohorts might be improving too. So we need to work out how the school’s cohort differs from the national cohort in each year. If the school’s rate of change is better than the changes for the national cohort, the answer could be Yes. To be sure that our teaching is producing this improvement, we should look at the overall levels of achievement of each cohort – otherwise, any improvement could be a result of more able students, rather than better teaching. You could discuss longitudinal analysis of student performance. That would require comparable assessments over successive years – NCEA results at successive levels probably would not suffice for this.
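A minimal sketch of that year-by-year comparison. The achievement rates and years below are invented for illustration; in practice they would be the percentage of the school's and the national Year 11 cohorts achieving the relevant biology standards in each year.

```python
# Hypothetical percentages of the Year 11 cohort achieving NCEA biology
# standards, for this school and nationally, over successive years.
school   = {2004: 58.0, 2005: 62.0, 2006: 68.0}
national = {2004: 60.0, 2005: 62.0, 2006: 64.0}

years = sorted(school)
for prev, curr in zip(years, years[1:]):
    school_change = school[curr] - school[prev]
    national_change = national[curr] - national[prev]
    # The school's change relative to the national change over the same period.
    relative = school_change - national_change
    print(f"{prev}->{curr}: school {school_change:+.1f}, national {national_change:+.1f}, relative {relative:+.1f}")
```

A consistently positive relative change suggests the school is improving faster than the national cohort – subject to the caveat above about differences in the make-up of each cohort.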

18 The evidence-driven decision making cycle
> Trigger Clues found in data, hunches Explore Is there really an issue? Question What do you want to know? Assemble Get all useful evidence together Analyse Process data and other evidence Interpret What information do you have? Intervene Design and carry out action Evaluate What was the impact? Reflect What will we change? This slide will reappear throughout (with each step in > bold) to identify what stage of the cycle you are at. Let’s start now at the beginning of the cycle – what triggers the process?

19 Asking questions Evidence-driven decision making
starts with asking good questions. ‘You can tell whether a man is clever by his answers. You can tell whether he is wise by his questions.’ – Nobel Prize winner Naguib Mahfouz. Questions are the major theme of this resource. There are probably too many slides here for you to use, but it’s worth spending some time exploring how to get your question right before you start on any data analysis. Before we start any analysis, we need to write down the questions we want to answer. The quality of any analysis depends on the quality of the question we want answered. So it’s worth spending some time thinking about getting our questions right. The questions we ask will determine the selection of evidence and how we analyse it. Where will our questions come from?

20 Trigger questions How good/poor is …? What aspects of … are good/poor?
Is … actually changing? How is … changing? Is … better than last year? How can … be improved? Why is … good/poor? What targets are reasonable for …? What factors influence the situation for …? What would happen if we …? Trigger questions can relate to student achievement or behaviour, teaching approaches and school processes - like the ones on this slide.

21 Questions from hunches
I suspect this poor performance is being caused by … Is this true? We reckon results will improve if we put more effort into ... Is this likely? I think we’d get better results from this module if we added … Is there any evidence to support this idea? The colloquial term hunch is used here to recognise how intuitive teachers can be. The aim is not to belittle hunches. They are extremely useful. In fact most hunches are based on sound professional experience and observation. But until they have been tested against evidence, they remain hunches. In terms of improving particular aspects of teaching and learning, many of the most pertinent questions come from our ‘hunches’. Some of our hunches will be based on a hypothesis or a speculation about a possible change. We’re using the casual term hunch here, but this does not belittle hunches. Teachers, like detectives, base hunches on professional observations – it’s just that you haven’t yet tested them against some evidence.

22 The evidence-driven decision making cycle
Trigger Clues found in data, hunches > Explore Is there really an issue? Question What do you want to know? Assemble Get all useful evidence together Analyse Process data and other evidence Interpret What information do you have? Intervene Design and carry out action Evaluate What was the impact? Reflect What will we change? Before we rush into analysing evidence based on a hunch, we need to pause and explore – we should ask ourselves: Is there really an issue here? Do we really know what the issue is?

23 Question – Explore – Question
It looks like our students are doing well in A but not in B. What can we do about it? EXPLORE … what else should we be asking? Is this actually the case? Is there anything in the data to suggest what we could do about it? Our initial question may be general and tentative – often based on a quick look at data. Additional evidence and professional judgement may be needed to be sure that we are onto something. This should lead us to new questions. As we start to explore data and other evidence new questions will arise. In this case, we should do some preliminary analysis of the data to be sure that our impression is justified – then we can look again at the data for possible solutions. Or we can base our actions on a hunch (as in the previous slide).

24 Question – Explore – Question
We have been running 60-minute periods for a year now. Did the change achieve the desired effects? EXPLORE … what else should we be asking? How has the change impacted on student achievement? Has the change had other effects? Is there more truancy? Is more time being spent in class on assignments, rather than as homework? Here’s another example of how we should explore questions before we start to look for evidence. The assumption here is that the change to 60-minute periods was intended to have specific results.

25 The evidence-driven decision making cycle
Trigger Clues found in data, hunches Explore Is there really an issue? > Question What do you want to know? Assemble Get all useful evidence together Analyse Process data and other evidence Interpret What information do you have? Intervene Design and carry out action Evaluate What was the impact? Reflect What will we change? This slide reappears (with each step in > bold) to identify what stage of the cycle you are at. Next – we need to pin down the question we want an answer to.

26 A very good question Specific and with a clear purpose
Able to be investigated through looking at data and other evidence Likely to lead to information on which we can act A very good question in evidence-driven decision making is one that will ultimately help us to make a good decision.

27 Questions with purpose
What do we know about reported bullying incidents for year 10 students? MAY BE BETTER AS Who has been bullying whom? Where? What are students telling us? What does pastoral care data tell us? Were some interventions more effective with some groups of students than others? The initial question can be answered quite easily but what use will the answer be? More purposeful questions are likely to lead to information we can act on.

28 The evidence-driven decision making cycle
Trigger Clues found in data, hunches Explore Is there really an issue? Question What do you want to know? > Assemble Get all useful evidence together Analyse Process data and other evidence Interpret What information do you have? Intervene Design and carry out action Evaluate What was the impact? Reflect What will we change? This slide reappears (with each step in > bold) to identify what stage of the cycle you are at. Next – we need to assemble data and other evidence

29 Assembling the evidence
We want to know if our senior students are doing better in one area of NCEA biology than another. So … we need NCEA results for our cohort. It could be that all biology students do better in this area than others. So … we also need data about national differences across the two areas. Once we have a very good question, we need to consider what data and other evidence will help us answer it. Often the data we need will be obvious. But we need to make sure that we have looked at all angles, to ensure we have all the data we need to draw valid conclusions.

30 Think critically about data
Was the assessment that created this data assessing exactly what we are looking for? Was the assessment set at an appropriate level for this group of students? Was the assessment properly administered? Are we comparing data for matched groups? These questions are really about validity and reliability. We can make commonsense judgements about whether data are valid for our purposes and whether they were created under conditions we can rely on.

31 The evidence-driven decision making cycle
Trigger Clues found in data, hunches Explore Is there really an issue? Question What do you want to know? Assemble Get all useful evidence together > Analyse Process data and other evidence Interpret What information do you have? Intervene Design and carry out action Evaluate What was the impact? Reflect What will we change? This slide reappears (with each step in > bold) to identify what stage of the cycle you are at. Next – we analyse the evidence.

32 Analysing data and other evidence
Schools need some staff members who are responsible for leading data analysis Schools have access to electronic tools to process data into graphs and tables All teachers do data analysis Data analysis is not an end in itself - it’s one of the many stages along the way to evidence-driven decision making The intent here is to stress that data analysis is just one step in a process (not even the most crucial one) and that every teacher carries out some data analysis. We all have access to tools that will generate, for example, gender and ethnicity comparisons, intra-subject analyses, and comparisons with national results.
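As an illustration of the kind of table such tools produce, here is a minimal sketch using a hypothetical CSV export of student results; the file name and columns (gender, ethnicity, credits) are assumptions for the example only.

```python
import pandas as pd

# Hypothetical export: one row per student, with demographics and credits gained.
results = pd.read_csv("year11_results.csv")   # columns: student_id, gender, ethnicity, credits

# Number of students and mean credits gained, broken down by gender and ethnicity.
summary = results.groupby(["gender", "ethnicity"])["credits"].agg(["count", "mean"])
print(summary.round(1))
```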

33 Making sense of the results
Think about significance and confidence How significant are any apparent trends? How much confidence can we have in the information? Even if we don’t fully understand how the analysis was carried out, we should still think critically about the results. Ask the person who did the analysis what has been done, what they think the results show, what limitations they would place on the results. Significance and confidence are key issues.
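Whoever leads the analysis may use a formal test to help judge significance. As a rough, hedged illustration only – the scores are invented, and a real analysis would need advice on which test suits the data – an independent-samples t-test compares two groups of asTTle-style scale scores:

```python
from scipy import stats

# Hypothetical asTTle-style scale scores for two comparable groups of students.
this_year = [512, 498, 530, 545, 501, 522, 490, 515, 538, 507]
last_year = [495, 488, 510, 502, 499, 493, 505, 486, 511, 497]

result = stats.ttest_ind(this_year, last_year)
print(f"t = {result.statistic:.2f}, p = {result.pvalue:.3f}")
# A small p-value (conventionally below 0.05) suggests the difference is unlikely
# to be chance alone; a larger p-value means we should hold the conclusion lightly.
```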

34 The evidence-driven decision making cycle
Trigger Clues found in data, hunches Explore Is there really an issue? Question What do you want to know? Assemble Get all useful evidence together Analyse Process data and other evidence > Interpret What information do we have? Intervene Design and carry out action Evaluate What was the impact? Reflect What will we change? This slide reappears (with each step in > bold) to identify what stage of the cycle you are at. Next – we need to think about the information we get from analysis.

35 Information Knowledge gained from analysing data and making meaning from evidence. Information is knowledge (or understanding) that can inform your decisions. How certain you will be about this knowledge depends on a number of factors: where your data came from, how reliable they were, how rigorous your analysis was. So the information you get from analysing data could be a conclusion, a trend, a possibility. This is a slide from the Terminology section of the full resource. You need to decide whether this is worth discussing. Let’s think about what counts as information.

36 Words, words, words … Information can … establish, indicate, confirm, reinforce, back up, stress, highlight, state, imply, suggest, hint at, cast doubt on, refute … Does this confirm that …? What does this suggest? What are the implications of …? How confident are we about this conclusion? It’s also worth thinking about precisely how we express information, especially in relation to the question that triggered the analysis. The verbs we choose reflect the confidence we have in the information. You can even see a hierarchy. When we interrogate the information we use even more questions. The answers will often come from our professional judgement.

37 Interrogate the information
Is this the sort of result we envisaged? If not, why? How does this information compare with the results of other research or the experiences of other schools? Are there other variables that could account for this result? Should we set this information alongside other data or evidence to give us richer information? What new questions arise from this information? We need to interrogate the information generated by analysing data and evidence. We decide how confident we are about the result by asking questions like this.

38 The evidence-driven decision making cycle
Trigger Clues found in data, hunches Explore Is there really an issue? Question What do you want to know? Assemble Get all useful evidence together Analyse Process data and other evidence Interpret What information do you have? > Intervene Design and carry out action Evaluate What was the impact? Reflect What will we change? This slide reappears (with each step in > bold) to identify what stage of the cycle you are at. Next – we use the information we have from the analysis to change our practice.

39 Professional decision making
We have evidence-based information that we see as reliable and valid. What do we do about it? If the information indicates a need for action, we use our collective experience to make a professional decision. What we decide to do as a result of the information we get from the analysis is guided by our professional experience.

40 Deciding on an action Information will often suggest a number of options for action. How do we decide which action to choose? We need to consider: what control we have over the action; the likely impact of the action; the resources needed. These are the factors we need to consider when we are planning an intervention: Control - What aspects of the situation do we have most control over? Do we run a limited pilot rather than a full-scale intervention? Impact - What can we do that is most likely to have the desired impact? Do we play it safe and intervene only where we know we can make a major difference? Resources - time, money, people - What will we need? What do we have? What other activities could be affected if we divert resources to this project? NOTE – This resource does not provide advice on writing an action plan – see the website pages or the Full Presentation for resources on writing action plans.

41 Planning for evaluation
What evidence do we need to collect before we start? Do we need to collect evidence along the way, or just at the end? How can we be sure that any assessment at the end of the process will be comparable with assessment at the outset? How will we monitor any unintended effects? Don’t forget evidence such as timetables, student opinions, teacher observations … We need to decide in advance how we will evaluate the impact. To do this we will need valid and reliable baseline evidence. The baseline evidence we need for evaluation might be different from the data we analysed earlier in this project. Here are some questions to consider as we think about the data we will need to evaluate the impact of our intervention.

42 The evidence-driven decision making cycle
Trigger Clues found in data, hunches Explore Is there really an issue? Question What do you want to know? Assemble Get all useful evidence together Analyse Process data and other evidence Interpret What information do you have? Intervene Design and carry out action > Evaluate What was the impact? Reflect What will we change? This slide reappears (with each step in > bold) to identify what stage of the cycle you are at. Next – after we have changed our practice, we evaluate the impact of the change.

43 Evaluate the impact of our action
Did the intervention improve the situation that triggered the process? If the aim was to improve student achievement, did that happen? We now leap to the final stages of the process – we have carried out the intervention and evaluated its impact against baseline data. Now we have to decide how effective the intervention has been. These are the central questions.

44 Evaluate the impact of our action
Was any change in student achievement significant? What else happened that we didn’t expect? How do our results compare with other similar studies we can find? Does the result give us the confidence to make the change permanent? When we evaluate the impact of our intervention, we need to ask the same sort of questions that we asked earlier in the process. That is, we need to interrogate our evaluation. We must not leap to conclusions. The final question is the crucial one – has the intervention been effective enough to justify incorporating the change into our normal practice?

45 The evidence-driven decision making cycle
Trigger Clues found in data, hunches Explore Is there really an issue? Question What do you want to know? Assemble Get all useful evidence together Analyse Process data and other evidence Interpret What information do you have? Intervene Design and carry out action Evaluate What was the impact? > Reflect What will we change? This slide reappears (with each step in > bold) to identify what stage of the cycle you are at. Finally – we think about how we will change our future practice.

46 Future practice What aspects of the intervention will we embed in future practice? What aspects of the intervention will have the greatest impact? What aspects of the intervention can we maintain over time? What changes can we build into the way we do things in our school? Would there be any side-effects? Even if things didn’t go exactly as we planned them – even if student achievement wasn’t greatly improved - there are probably some things we have learnt that we should incorporate into future practice. We need to be realistic about what we can achieve – we need to be sure we could maintain the intervention. We should also think about any side-effects – if we put time and effort into this change, will anything else suffer?

47 Future directions What professional learning is needed? Who would most benefit from it? Do we have the expertise we need in-house or do we need external help? What other resources do we need? What disadvantages could there be? When will we evaluate this change again? Before we decide to go ahead and embed aspects of the intervention into our practice, we need to look at all the ramifications – and think about starting the cycle all over again …

48 What now? How can we apply this model in our school?
End of presentation. The next step is for the group to discuss how this model can be applied in this school, department or faculty. This should include a discussion about what evidence already exists in the school, how this is collected and recorded, and how well equipped the school is to analyse and use it in the interests of improving student achievement. Finally, the group should consider what evidence-driven projects the school could undertake.

