Published by Jade Hoover. Modified over 6 years ago.
1
Consider the Evidence: Evidence-driven decision making for secondary schools
A resource to assist schools to review their use of data and other evidence
Getting Started
2
Evidence-driven decision making
Today we will think about:
- the data and other evidence we can use to improve teaching, learning and student achievement
- how to get started on a decision-making process by asking purposeful questions

This session will get us started on analysing ‘data and other evidence’ to make decisions – we will see how to do this in a structured and informed way. In a moment we’ll discuss what is meant by ‘data and other evidence’. We can apply today’s material to student achievement at all secondary levels – not just senior – and to all curriculum learning areas and school processes.

First, we need to look at some of the terms being used here – so we are all speaking the same language. First: what is evidence-driven decision making?

This resource does not justify the use of data analysis in schools – the assumption is that teachers and schools already understand the benefits and to some extent already do it. Nor does it provide data processing tools or directions on how to analyse data; these are readily available.

After these sessions, the group should discuss how this model can be applied in this school, department or faculty. This should include a discussion about what evidence already exists in the school, how this is collected and recorded, and how well equipped the school is to analyse and use it in the interests of improving student achievement. Finally, the group should consider what evidence-driven projects the school could undertake.

Beware of collecting data and other evidence that you might not need. If the school thinks ahead about how to make evidence-based decisions, you will know what data and other evidence you should collect.
3
The evidence-driven decision making cycle
Trigger: Analysis of NQF/NCEA results by standard shows significant numbers not achieving well in writing. A teacher has a hunch – poor writers might spend little time on homework.
Explore data: A survey of students shows that this is only partially true.
Question: What are the characteristics of students who are poor at writing?
Assemble more data and other evidence: asTTle reading, homework, extracurricular activities, attendance, etc.
Analyse: Analyse non-NQF/NCEA data and evidence.
Interpret information: Poor writers are likely to play sport, speak well, read less and do little homework.
Intervene: Create multiple opportunities for writing; include topics that can use sport as context; connect speaking and writing. PD for staff.
Evaluate: Has writing improved?
Reflect: How will we teach writing in the future?

This is the evidence-driven decision making model outlined in the resource. It looks like a standard improvement cycle, but the approaches recommended within the separate stages make it especially useful for use in schools.
4
Evidence-driven decision making Getting started
Trigger: Clues found in data, hunches
Explore: Is there really an issue?
Question: What do you want to know?
Assemble: Get all useful evidence together
Analyse: Process data and other evidence
Interpret: What information do you have?
Intervene: Design and carry out action
Evaluate: What was the impact?
Reflect: What will we change?

This slide lists the stages in the cycle and gives simple explanations. You might prefer to print handout copies of the text below or the cycle diagram in the previous slide. The cycle has these sequential stages – today we are going to look at the first three:

Trigger: Data, ideas, hunches, etc. set a process in action. The trigger is whatever it is that makes you think there could be an opportunity to improve student achievement. You can routinely scan available data looking for inconsistencies, etc. It can be useful to speculate about possible causes or effects – and then explore data and other evidence to see if there are any grounds for the speculation.

Explore: Initial data, ideas or hunches usually need some preliminary exploration to pinpoint the issue and suggest good questions to ask.

Question: This is the key point: what question(s) do you want answered? Questions can raise an issue and/or propose a possible solution.
5
Evidence-driven eating
You need to buy lunch. Before you decide what to buy you consider a number of factors:
- how much money do you have?
- what do you feel like eating?
- what will you be having for dinner?
- how far do you need to go to buy food?
- how much time do you have?
- where are you going to eat it?

This is a frivolous, non-educational scenario to start you thinking about how schools use evidence to make decisions. Don’t labour it. With many groups (especially experienced teachers) it will be inappropriate to use it at all.

This scenario is just to get us started. There’s nothing mysterious about evidence-driven decision making. We all make decisions every day based on an analysis of a number of factors. In this scenario you’d analyse the factors and make a decision in seconds (or you’d go hungry).

What other factors might you consider before buying lunch? For example: Who are you eating with? How much do you want to spend? What did you have for breakfast? How hungry are you? Are you on a special diet? What else do you need to do this lunchtime? Who do you want to avoid this lunchtime?
6
Evidence-driven teaching
I had a hunch that Ana wasn’t doing as well as she could in her research assignments, a major part of the history course. What made me think this? Ana’s general work (especially her writing) was fine. She made perceptive comments in class, contributed well in groups and had good results overall last year, especially in English. How did I decide what to do about it?

The aim here is to demonstrate (and acknowledge) that teachers look for and use a variety of evidence as a normal part of effective and reflective teaching. The conclusions the teacher reaches are not as important here as the investigative approach he uses.

Teachers continually consider what they know about students. This story told by Ana’s history teacher is typical. It’s not a story about a formal investigation. It’s just the sort of thing good teachers do all the time. It might have taken place over just a week or two and taken very little of the teacher’s time.

This teacher had a ‘hunch’ based on his general professional observations. He informally compared a range of evidence to see if his hunch was correct. It was. He wanted to find a way to improve this one aspect of Ana’s achievement. He considered other evidence and analysed it. This enabled him to pinpoint the problem and plan a course of action designed to improve Ana’s achievement.

This teacher was thinking about the data and other evidence he had right there in front of him – and then he acted on his conclusions. The teacher used evidence-driven decision making, using data and other evidence to inform his actions. In this session we want to see how to expand (and systematise) that sort of thinking to drive improvement across the school.
7
Evidence-driven teaching (cont…)
I looked more closely at her other work. I watched her working in the library one day to see if it was her reading, her use of resources, her note taking, her planning, or what. At morning tea I asked one of Ana’s other teachers about Ana’s approach to similar tasks. I asked Ana if she knew why her research results weren’t as good as her other results, and what her plans were for the next assignment. I thought about all of this and planned a course of action. I gave her help with using indexes, searching, note taking and planning and linking the various stages of her research.
8
What is meant by ‘data and other evidence’?
Consider the Evidence – a resource to assist schools to review their use of data and other evidence

What is meant by ‘data and other evidence’? This is an introductory slide – the next two slides discuss and define ‘evidence’ and ‘data’. This resource uses terms commonly used in schools – the usage and meaning intended is that generally applied in educational circles. A full discussion of terminology used in this resource is in the Appendix. The resource we are using is called ‘Consider the Evidence’ – it deals with ‘evidence-driven decision making’. So we need to think about what constitutes ‘evidence’ in schools. It’s useful to think in terms of ‘data and other evidence’.
9
Evidence: any facts, circumstances or perceptions that can be used as an input for an analysis or decision – how classes are compiled, how classes are allocated to teachers, test results, teachers’ observations, attendance data, portfolios of work, student opinions … Data are one form of evidence.

‘Evidence’ is used here in the same way that it’s used in courts of law and in standards-based assessment. Like all schools, we have access to a lot of ‘data’ about student achievement and student behaviour – test results, attendance patterns, etc. But we have access to a lot more information than what is normally thought of as ‘data’. In this session we want to be aware of all the ‘evidence’ we have access to. Some of this evidence is ‘data’ – but some (like student opinions and teachers’ observations) can’t be easily processed in the way we process ‘data’ – so it’s best called ‘evidence’.

If participants have concerns about the use of jargon, here’s a way to discuss the issue: whenever people come to grips with new ideas, they might have to learn new terms or give special meaning to existing words. This happened with curriculum and assessment developments – but most teachers (and parents) are now familiar with terms and concepts like strands, levels and credits. The language of computing is another good example.
10
Data: known facts or measurements, probably expressed in some systematic or symbolic way (e.g. as numbers) – assessment results, gender, attendance, ethnicity … Data are one form of evidence.

This resource treats the word ‘data’ as a plural noun – hence ‘data are …’. There’s nothing new here – but for some of you, this will be quite a narrow definition of data. We could have a discussion about what constitutes data – for example, do all data have to be coded in some way (e.g. as a number)? But I suggest you simply accept this distinction for the purposes of this session. We can discuss other terminology later. The main point is: if we want to improve student achievement, we can look at a lot more than what we traditionally think of as data.
11
Which factors are data? Evidence to consider before buying lunch
- how much money you have
- what you feel like eating
- what you’ll be having for dinner
- how far you need to go to buy food
- how much time you have
- where you’re going to eat
- what your diet allows

If you used the lunch decision scenario (slide 5) it’s worth thinking about what constitutes data. You should also consider any other factors you added earlier. Looking back at the lunch decision scenario – in terms of how we’ve defined ‘data’ and ‘other evidence’, which of these factors would we categorise as ‘data’?
12
Evidence-driven decision making
We have more evidence about what students know and can do than ever before – their achievements, behaviours, and the environmental factors that influence learning. We should draw on all our knowledge about the learning environment to:
- improve student achievement
- explore what lies behind patterns of achievement
- decide what changes will make a difference

This is a general introduction to evidence-driven decision making. It may not be necessary for groups that are already committed to this sort of approach. But groups intent on change should pause to consider how to minimise effort and risk.

We all know that we can make significant improvements to teaching and learning by analysing data and applying what we learn from it. But an issue for many schools is that they have too much data in some areas – too much evidence – and not enough time and resources to use it all effectively. In other areas, we have too little evidence.

Any change in teaching practice is a risk – you can never be entirely sure of the consequences. The approach we are looking at today shows us how to decide what we should change to improve student achievement. Evidence-driven decision making can help by reducing or assessing risk, and maybe by pointing out changes that carry less risk.
13
What evidence does a school have?
- Demographics
- Student achievement
- Perceptions
- School processes
- Other practice

This is an introductory slide – the next five slides deal with each category in turn. All schools have data about student achievement. To make the most of these data, we need to be aware of many other factors – evidence that describes our students’ wider learning environment. We have so much evidence that it’s useful to categorise it in some way. The approach we are taking today separates all data and other evidence into these five categories.

If you have already done an exercise that uses data, you could use these categories to discuss what sort of evidence you used in the exercise – and what other evidence could be used to extend the example.
14
Demographics
What data do we have now to provide a profile of our school? What other data could we create?
- School
- Students
- Staff
- Parents/caregivers and community

Demographics – also known as profile data – are objective data that describe our school and its students, staff and community: decile, gender, suspensions, etc.

The next slide provides examples for each bullet point – but first you could do this exercise: divide into four groups. Take one category each and list our school’s existing data. Then list some other potential sources of data and evidence. That is, how could we collect or generate evidence that our school doesn’t have?

You should make the point that data and other evidence should be generated only if it’s for a purpose.
15
Demographics – data that provide a profile of our school
- School – decile, roll size, urban/rural, single sex or co-educational, teaching spaces …
- Students – ethnicity, gender, age, year level, attendance, lateness, suspension and other disciplinary data, previous school, part-time employment …
- Staff – gender, age, years of experience, qualifications, teaching areas, involvement in national curriculum and assessment, turnover rate …
- Parents/caregivers and community – socio-economic factors, breadth of school catchment, occupations …

A suggested list for each bullet point in the previous slide.
16
Student achievement
What evidence do we have now about student achievement? What other evidence could we collect?
- National assessment results
- Standardised assessment results administered internally
- Other in-school assessments
- Student work

Student achievement data and other evidence – much of this is readily available from national assessments, standardised testing we carry out in the school, portfolios of student work, etc.

The next slide provides examples for each bullet point – but first you could do this exercise: divide into four groups. Take one category each and list our school’s existing data. Then list some other potential sources of data and evidence. That is, how could we collect or generate evidence that our school doesn’t have? The next slide provides a suggested list.
17
Student achievement Evidence about student achievement
- National assessment results – NCEA, NZ Scholarship – details like credits above and below year levels, breadth of subjects entered …
- Standardised assessment results administered internally – PAT, asTTle …
- Other in-school assessments – most non-standardised, but some (especially within departments) will be consistent across classes; includes data from previous schools, primary/intermediate
- Student work – work completion rates, internal assessment completion patterns, exercise books, notes, drafts of material – these can provide useful supplementary evidence

A suggested list for each bullet point in the previous slide.
18
Perceptions
What evidence do we have now about what students, staff and others think about the school? Are there other potential sources?
- Self appraisal
- Formal and informal observations made by teachers
- Structured interactions
- Externally generated reports
- Student voice
- Other informal sources

In many schools there will be little of this sort of evidence, so you might spend more time on this. Perceptions – evidence of what staff, students and others think about the school – are probably the most subjective evidence, but much of it will be factual and collected in formal ways: student self appraisal, formal and informal observations made by teachers, etc.

Divide into four groups. Take one category each and list our school’s existing data. Then list some other potential sources of data and evidence. That is, how could we collect or generate evidence that our school doesn’t have? The next slide provides a suggested list.
19
Perceptions – evidence about what students, staff, parents and the community think about the school
- Self appraisal – student perceptions of their own abilities, potential, achievements, attitudes …
- Formal and informal observations made by teachers – peer interactions, behaviour, attitudes, engagement, student-teacher relationships, learning styles, classroom dynamics …
- Structured interactions – records from student interviews, parent interviews, staff conferences on students …
- Externally generated reports – from ERO and NZQA (these contain data but also perceptions) …
- Student voice – student surveys, student council submissions …
- Other informal sources – views about the school environment, staff and student morale, Board perceptions, conversations among teachers …

A suggested list for each bullet point in the previous slide.
20
School processes
What evidence do we have about how our school is organised and operates?
- Timetable
- Classes
- Resources
- Finance
- Staffing

Some teachers may not think of school processes as evidence that can be used in decision making – you might need to skip forward to later slides that provide examples. School processes – how our school is organised and operates: the timetable, resources, etc.

Divide into four groups. Take one category each and list our school’s existing data. Then list some other potential sources of data and evidence. That is, how could we collect or generate evidence that our school doesn’t have? The next slide provides a suggested list.
21
School processes – evidence and data about how our school is organised and operates, including:
- Timetable – structure, period length, placement of breaks, subjects offered, student choices, tertiary and workforce factors, etc.
- Classes – how they are compiled, their characteristics, effect of timetable choices, etc.
- Resources – access to libraries, textbooks, ICT, special equipment, etc.
- Finance – how the school budget is allocated, how funds are used within departments, expenditure on professional development
- Staffing – policies and procedures for employing staff, allocating responsibility, special roles, workload, subjects and classes

A suggested list for each bullet point in the previous slide.
22
Other practice
How can we find out about what has worked (or not) in other schools?

It’s important that a school’s evidence-driven decision making benefits from the results of research and the experiences of other schools. Other practice – we should look at the experiences of others: documented academic research, the experiences of other schools, etc. How can we find out what other schools have done? Discuss sources of research evidence, how you can find out what other schools have done, etc. The next slide provides a suggested list.
23
Other practice
How can we find out about what has worked in other schools?
- Documented research – university and other publications, Ministry of Education’s Best Evidence Syntheses, NZCER, NZARE, overseas equivalents …
- Experiences of other schools – informal contacts, local clusters, advisory services, TKI LeadSpace …

A suggested list from the previous slide.
24
What can we do with evidence?
Shane’s story
A history HOD wants to see whether history students are performing to their potential. She prints the latest internally assessed NCEA records for history students across all of their subjects. As a group, history students seem to be doing as well in history as they are in other subjects. Then she notices that Shane is doing very well in English and only reasonably well in history. She wonders why, especially as both are language-rich subjects with many similarities.

The HOD speaks with the history teacher, who says Shane is attentive, catches on quickly and usually does all work required. He mentions that Shane is regularly late for class, especially on Monday and Thursday, so he often misses important information or takes time to settle in. He has heard there are ‘problems at home’ so has overlooked it, especially as the student is doing reasonably well in history.

If you choose to use this scenario, you might like to print paper copies for distribution. It is shown here on two slides. You could replace Shane’s story with a scenario from your own school. Teachers are engaging every day in decision making that considers data and other evidence. This is a typical scenario – a teacher notices something interesting in a student’s achievement data and wonders if there is an explanation. Continued on next page of notes.
25
Shane’s story (cont...)
The HOD looks at the timetable and discovers that history is period 1 on Monday and Thursday. She speaks to Shane’s form teacher, who says that she suspects Shane is actually late to school virtually every day. They look at centralised records. Shane has excellent attendance but frequent lateness to period 1 classes.

The HOD speaks to the dean, who explains that Shane has to take his younger sister to school each morning. He had raised the issue with Shane, but Shane said this was helping the household get over a difficult period and claimed he could handle it.

The staff involved agree that Shane’s regular lateness is having a demonstrable impact on his achievement, probably beyond history but not so obviously. The dean undertakes to speak to the student, the history teacher, and possibly the parents to find a remedy for the situation.

Continued from previous page of notes. The next slide suggests that you consider the key factors in this scenario.
26
Thinking about Shane’s story
What were the key factors in the scenario about Shane? What types of data and other evidence were used? What questions did the HOD ask? What happened in this case that wouldn’t necessarily happen in some schools?

The main aim here is to get the group thinking about the categories of evidence discussed earlier (slide 13):
- Demographics
- Student achievement
- Perceptions
- School processes
- Other practice

Key factors are suggested in the next slide.
27
Shane’s story - keys to success
- The history HOD looked at achievement data in English and history. She looked for something significant across the two data sets, not just low achievement.
- Then she asked a simple question: why is there such a disparity between this student’s results in these two subjects?
- She sought information and comments (perceptions evidence and data) from all relevant staff.
- The school had centralised attendance and punctuality records (demographic data) that the form teacher could access easily.
- The action was based on all available evidence and designed to achieve a clear aim.

A suggested list from the previous slide.
28
The evidence-driven decision making cycle
Trigger: Data indicate a possible issue that could impact on student achievement.
Speculate: A teacher has a hunch about a problem or a possible action.
Explore: Check data and evidence to explore the issue.
Question: Clarify the issue and ask a question.
Assemble: Decide what data and evidence might be useful.
Analyse: Analyse the data and evidence.
Interpret: Draw out insights that answer your question.
Intervene: Plan an action aimed at improving student achievement.
Act: Carry out the intervention.
Evaluate: Evaluate the impact of the intervention.
Reflect: Reflect on what has been learned and how practice will change.

The length of the cycle will vary for different situations. We might wait a year to evaluate the effects of our actions – but sometimes we’ll be able to (and ought to) work to shorter (or maybe longer) cycles. It’s important that we reflect, evaluate and make professional judgements at each stage of this cycle.

Now we will invent scenarios that might apply in our school (or department). Do this in groups. Draw up your scenario as a cycle as in this slide. The next slide provides a blank template for this exercise. You might like to photocopy this template for groups to fill in. A sample scenario is given in the following slide.
29
The evidence-driven decision making cycle
EXPLORE QUESTION ASSEMBLE ANALYSE TRIGGER EVALUATE INTERVENE INTERPRET SPECULATE ACT REFLECT This is a blank template for the exercise suggested with the previous slide. You might like to photocopy this template for groups to fill in. A sample scenario is given in the following slide.
30
Evaluate and reflect
Summative evaluation – assess how successful the intervention was; decide how our practice will change; report to the board.
Formative evaluation – at every stage in the cycle we reflect and evaluate: Are we on the right track? Do we need to fine-tune? Do we actually need to complete this?

This is a reminder that could be used at many points in the presentation. We should pause here and think about evaluation and reflection. The final stage in this cycle is summative evaluation – we assess how successful the whole process was and reflect on whether we will change our future practice. But at every stage in this cycle we should be reflecting, evaluating in a formative way and making professional judgements about where to go next. We need to be sure that we are on the right track. Should we fine-tune the process as we go?

Many schools are consciously developing a ‘culture of inquiry’ – an open and supportive environment in which staff and the school community regularly reflect on the way the school operates, one in which calculated risk-taking is seen as an essential ingredient of innovation. A cyclical improvement process is iterative – incremental changes are incorporated into the knowledge base and into professional practice and feed into the next cycle. There is a compounding effect – change becomes the trigger for more questions. This resource can be used as a contribution to that approach.
31
The evidence-driven decision making cycle
> Trigger: Clues found in data, hunches
Explore: Is there really an issue?
Question: What do you want to know?
Assemble: Get all useful evidence together
Analyse: Process data and other evidence
Interpret: What information do you have?
Intervene: Design and carry out action
Evaluate: What was the impact?
Reflect: What will we change?

This slide will reappear throughout the presentation, with the current stage marked ‘>’ and shown in bold, to identify what stage of the cycle you are at. Let’s start now at the beginning of the cycle – what triggers the process?
32
Asking questions
Evidence-driven decision making starts with asking good questions.

“You can tell whether a man is clever by his answers. You can tell whether he is wise by his questions.” – Nobel Prize winner Naguib Mahfouz

Questions are the major theme of this resource. There are probably too many slides here for you to use, but it’s worth spending some time exploring how to get your question right before you start on any data analysis. The quality of any analysis depends on the quality of the question we want answered. So it’s worth spending some time thinking about getting our questions right. Before we start any analysis, we need to write down the questions we want to answer. The questions we ask will determine the selection of evidence and how we analyse it. Where will our questions come from?
33
Formative or summative?
Trigger questions:
- How good/poor is …?
- What aspects of … are good/poor?
- Is … actually changing? How is … changing?
- Is … better than last year?
- How can … be improved?
- Why is … good/poor?
- What targets are reasonable for …?
- What factors influence the situation for …?
- What would happen if we …?

Formative or summative? Questions that trigger the process can relate to student achievement or behaviour, teaching approaches and school processes – like the ones on this slide. Questions can be described as summative or formative. Let’s think about the questions on this slide: which of them are summative and which are formative? You could have a discussion about the purpose of summative and formative questions – when is each type of question useful?

Summative questions give us end-of-process results, often suitable for reporting and accountability. Formative questions are intended to provide more immediate feedback to improve teaching and learning, so they are probably more specific.
34
Summative questions A target in the school’s annual plan is for all year 10 boys to improve their writing level by at least one level using asTTle (e.g. from 4B to 4A). Have all year 10 boys improved by at least one asTTle level in writing? This slide is about using data and other evidence for strategic planning, as mentioned in earlier slides. Many summative questions will relate to the goals and targets in our school’s strategic plan for improving student achievement - or to goals set within teaching departments or faculties. Most of these questions will be obvious and decided in advance - especially if our strategic goals have been decided on the basis of analysing data. Of course, achieving targets like this relies on deciding how to improve how we teach writing – that’s what formative questions are all about …
35
Questions about policy
We have been running 60-minute periods for 5 years now. What effect has the change had? Many of our questions will be aimed at evaluating a particular policy or procedure, especially if there has been a change.
36
Formative questions from data
The data suggest our students are achieving well in A, but less well in B. What can we do about that? Some questions will come from our routine consideration of available data and evidence or as a consequence of internal or external reviews: national assessment results, asTTle results for a cohort, information from contributing schools, ERO reports, etc.
37
Formative questions from data
A significant proportion of our school leavers enrol in vocational programmes at polytechnic or on-job. How well do our school programmes prepare those students? Some questions will come from our consideration of data and evidence about student destinations. This question should lead us to look back at our school processes data.
38
Questions from hunches
I suspect this poor performance is being caused by … Is this true? We reckon results will improve if we put more effort into ... Is this likely? I think we’d get better results from this module if we added … Is there any evidence to support this idea? The colloquial term hunch is used here to recognise how intuitive teachers can be – not to belittle hunches. They are extremely useful: most are based on sound professional experience and observation, but until they have been tested against evidence, they remain hunches. In terms of improving particular aspects of teaching and learning, many of the most pertinent questions come from our hunches. Some will be based on a hypothesis or a speculation about a possible change. Teachers, like detectives, base hunches on professional observation – it’s just that they haven’t yet been tested against the evidence.
39
Hunches from raw data This slide is simply to point out that teachers routinely look at raw data and start to get hunches. We return to this table in the Analyse section of the presentation. This table is a set of results for year 12 students in one subject. Most of us would get hunches about trends and issues from a quick read of raw data like this. What trends and issues can we get from just scanning this table? Until careful analysis is done, it’s best to pose these hunches as questions. Some questions are suggested in the next slide.
40
Hunches from raw data Is the class as a whole doing better in internally assessed standards than in externally assessed standards? If so, why? Are the better students (with many Excellence results) not doing as well in external assessments as in internal? If so, why? Is there any relationship between absences and achievement levels? It seems not, but it’s worth analysing the data to be sure. Hunches (expressed as questions) from the previous slide. We return to this table in the Analyse section of the presentation.
41
The evidence-driven decision making cycle
Trigger Clues found in data, hunches > Explore Is there really an issue? Question What do you want to know? Assemble Get all useful evidence together Analyse Process data and other evidence Interpret What information do you have? Intervene Design and carry out action Evaluate What was the impact? Reflect What will we change? Before we rush into analysing evidence based on a hunch, we need to pause and explore – we should ask ourselves: Is there really an issue here? Do we really know what the issue is?
42
Question – Explore – Question
It looks like our students are doing well in A but not in B. What can we do about it? EXPLORE … what else should we be asking? Is this actually the case? Is there anything in the data to suggest what we could do about it? Our initial question may be general and tentative – often based on a quick look at data. Additional evidence and professional judgement may be needed to be sure that we are onto something, and this should lead us to new questions. As we start to explore data and other evidence, new questions will arise. In this case, we should do some preliminary analysis of the data to be sure that our impression is justified – then we can look again at the data for possible solutions. Or we can base our actions on a hunch (as in the previous slide). Take the questions from hunches in the previous slide and turn them into multi-barrelled questions that require preliminary exploration before a specifically targeted question is finalised. Another possible multi-barrelled question is in the next slide.
43
Question – Explore – Question
We have been running 60-minute periods for a year now. Did the change achieve the desired effects? EXPLORE … what else should we be asking? How has the change impacted on student achievement? Has the change had other effects? Is there more truancy? Is more time being spent in class on assignments, rather than as homework? Here’s another example of how we should explore questions before we start to look for evidence. The assumption here is that the change to 60-minute periods was intended to have specific results.
44
The evidence-driven decision making cycle
Trigger Clues found in data, hunches Explore Is there really an issue? > Question What do you want to know? Assemble Get all useful evidence together Analyse Process data and other evidence Interpret What information do you have? Intervene Design and carry out action Evaluate What was the impact? Reflect What will we change? This is the most crucial stage – to make good evidence-based decisions we need to ask useful questions.
45
A very good question Specific and with a clear purpose
Able to be investigated through looking at data and other evidence Likely to lead to information on which we can act A very good question in evidence-driven decision making is one that will ultimately help us to make a good decision.
46
Questions with purpose
What do we know about reported bullying incidents for year 10 students? MAY BE BETTER AS Who has been bullying whom? Where? What are students telling us? What does pastoral care data tell us? Were some interventions more effective with some groups of students than others? The initial question can be answered quite easily, but what use will the answer be? More purposeful questions are likely to lead to information we can act on.
47
Write more purposeful questions
What are the attendance rates for year 11 students? What has been the effect of the new 6-day x 50-min period structure? How well are boys performing in formal writing in year 9? What has been the effect of shifting the lunch break to after period 4? Consider the questions listed on this slide, and write more purposeful versions of them. You might use just one or two of these – or divide into groups. Some more purposeful questions are offered on the next slide.
48
More purposeful questions
How do year 11 attendance rates compare with other year levels? Do any identifiable groups of year 11 students attend less regularly than average? Is the new 6-day x 50-min period structure having any positive effect on student engagement levels? Is it influencing attendance patterns? What do students say? Should we be concerned about boys’ writing? If so, what action should we be taking to improve the writing of boys in terms of the literacy requirements for NCEA Level 1? The new timing of the lunch break was intended to improve student engagement levels after lunch. Did it achieve this? If so, did improvements in student engagement improve student achievement? Do the benefits outweigh any disadvantages? Some more purposeful questions from the previous slide.
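The first pair of purposeful questions above – comparing year 11 attendance with other year levels, and looking for groups attending less regularly than average – maps directly onto a group-and-compare analysis. A minimal sketch, with invented class groups and attendance rates standing in for real school data:

```python
import statistics
from collections import defaultdict

# Hypothetical attendance records: (year level, class group,
# attendance rate). Names and rates invented for illustration.
records = [
    (11, "11A", 0.92), (11, "11B", 0.84), (11, "11C", 0.88),
    (12, "12A", 0.93), (12, "12B", 0.91),
    (13, "13A", 0.95), (13, "13B", 0.94),
]

# "How do year 11 attendance rates compare with other year levels?"
by_year = defaultdict(list)
for year, _, rate in records:
    by_year[year].append(rate)
for year in sorted(by_year):
    print(f"year {year}: mean attendance {statistics.mean(by_year[year]):.1%}")

# "Do any identifiable year 11 groups attend less regularly than
# the school-wide average?"
overall = statistics.mean(rate for _, _, rate in records)
low = [group for year, group, rate in records
       if year == 11 and rate < overall]
print("year 11 groups below the school mean:", low)
```

In practice the grouping variable could equally be gender, subject, or any other identifiable cohort – the point is that the purposeful question names the comparison before the analysis starts.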
49
The evidence-driven decision making cycle
Trigger Clues found in data, hunches Explore Is there really an issue? Question What do you want to know? Assemble Get all useful evidence together Analyse Process data and other evidence Interpret What information do you have? Intervene Design and carry out action Evaluate What was the impact? Reflect What will we change? The group could now discuss evidence available in the school and discuss triggers and questions they could get started on. A template is provided in the next slide.
50
The evidence-driven decision making cycle
EXPLORE QUESTION ASSEMBLE ANALYSE TRIGGER EVALUATE INTERVENE INTERPRET SPECULATE ACT REFLECT This is a blank template that you could use when groups discuss triggers and questions they could get started on. You might like to photocopy this template for groups to fill in. End of presentation. Your next session could look at the next three stages – use Presentation 6 – Getting to Information.