Consider the Evidence: Evidence-driven decision making

Presentation transcript:

Consider the Evidence: Evidence-driven decision making for secondary schools. A resource to assist schools to review their use of data and other evidence. Module 7: Changing and Evaluating

Evidence-driven decision making. This module is part of a resource about how we use data and other evidence to improve teaching, learning and student achievement. Today we are looking at the final stage of this process – changing our practice and evaluating the impact of that change. This session will help us think about how we apply what we have discovered from our analysis of evidence – remember, the aim is to uncover ways to change future practice to improve student achievement. We can apply today's material to student achievement at all secondary levels – not just senior – and to all curriculum learning areas and school processes. This material is part of a resource explaining evidence-driven decision making. What is evidence-driven decision making? This resource does not justify the use of data analysis in schools – the assumption is that teachers and schools already understand the benefits and to some extent already do it. This resource does not provide data processing tools or directions on how to analyse data; these are readily available.

The evidence-driven decision making cycle:
Trigger – Data indicate a possible issue that could impact on student achievement
Speculate – A teacher has a hunch about a problem or a possible action
Explore – Check data and evidence to explore the issue
Question – Clarify the issue and ask a question
Assemble – Decide what data and evidence might be useful
Analyse – Analyse data and evidence
Interpret – Insights that answer your question
Intervene – Plan an action aimed at improving student achievement
Act – Carry out the intervention
Evaluate – Evaluate the impact of the intervention
Reflect – Reflect on what has been learned, how practice will change
This is the full evidence-driven decision making cycle. Today we are looking at the boxes labelled Intervene, Evaluate and Reflect.

Changing and Evaluating. Trigger (clues found in data, hunches) → Explore (is there really an issue?) → Question (what do you want to know?) → Assemble (get all useful evidence together) → Analyse (process data and other evidence) → Interpret (what information do you have?) → Intervene (design and carry out action) → Evaluate (what was the impact?) → Reflect (what will we change?). This slide lists the stages in the cycle and gives simple explanations. You might prefer to print handout copies of the text below or the cycle diagram in the previous slide. The cycle has these sequential stages – today we are going to look at the final three stages:
Intervene – Design and implement a plan of action designed to change the situation you started with. Be sure that your actions are manageable and look at the resourcing needed. Consider how you'll know what has been achieved.
Evaluate – Using measures you decided in advance, assess how successful the intervention has been. Has the situation that triggered the process been improved? What else happened that you maybe didn't expect?
Reflect – Think about what has been learned and discovered. What did we do that worked? Did this process suggest anything that we need to investigate further? What aspects of the intervention can be maintained? What changes will we make to our practices? What support will we need?

The evidence-driven decision making cycle (current stage: Intervene): Trigger (clues found in data, hunches) → Explore (is there really an issue?) → Question (what do you want to know?) → Assemble (get all useful evidence together) → Analyse (process data and other evidence) → Interpret (what information do you have?) → Intervene (design and carry out action) → Evaluate (what was the impact?) → Reflect (what will we change?). We have asked questions, analysed data and other evidence, and have some information we can act on. Now we decide what we will change.

Professionals making decisions. How do we decide what action to take as a result of the information we get from the analysis? We use our professional judgment. This is an introductory slide – the aim is simply to point out that the analysis will not always point to an obvious intervention. Teachers still need to make professional decisions. What we decide to do as a result of the information we get from the analysis is guided by our professional experience.

Professional decision making. We have evidence-based information that we see as reliable and valid. What do we do about it? If the information indicates a need for action, we use our collective experience to make a professional decision. What we decide to do as a result of the information we get from the analysis is guided by our professional experience.

Professionals making decisions. Have my students not achieved a particular history standard because they have poor formal writing skills, rather than poor history knowledge? The answer was Yes ... so I need to think about how to improve their writing skills. How will I do that? First, we go back to our trigger point – to the question we asked that led to the analysis. In this case, the question was quite simple and the information convincing – so what we have to achieve has been established. Exactly how we do that is the professional decision.

Professionals making decisions Do any particular groups of year 11 students attend less regularly than average for the whole cohort? The analysis identified two groups – so I need to think about how to deal with irregular attendance for each group. How will I do that? Another example: The question we asked that led to the analysis was simple enough and the resulting information identifies where the action should be applied. Exactly what we do is the professional decision.
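
For schools that keep attendance records electronically, a quick comparison like the one sketched below can surface groups whose attendance sits below the cohort average. This sketch is illustrative only and is not part of the original resource: the file name, the column names (group, days_present, days_possible) and the five-percentage-point threshold are all assumptions, and a real student management system export would have its own field names.

import pandas as pd

# Hypothetical export: one row per year 11 student, with a "group" column
# (e.g. class, gender, ethnicity) and counts of days present and possible.
attendance = pd.read_csv("year11_attendance.csv")
attendance["rate"] = attendance["days_present"] / attendance["days_possible"]

cohort_rate = attendance["rate"].mean()
by_group = attendance.groupby("group")["rate"].mean().sort_values()

# Flag groups noticeably below the cohort average; the 5-point margin is a
# judgement call, not a statistical rule.
flagged = by_group[by_group < cohort_rate - 0.05]

print(f"Cohort average attendance: {cohort_rate:.1%}")
print(flagged)

The detail of the code matters less than the comparison it makes: each group is measured against the whole cohort, which is exactly the question the slide asks.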

Professionals making decisions. You asked what factors are related to poor student performance in formal writing. The analysis suggested that poor homework habits have a significant impact on student writing. You make some professional judgements and decide: students who do little homework don't write enough. You could take action to improve homework habits – but you've tried that before and the success rate is low. You have more control over other factors – like how much time you give students to write in class. So you conclude – the real need is to get students to write more often. Teachers decided the poor performance in writing was due not to homework habits but to the total amount of writing students do. Professional judgements lead to the conclusion that the action required is to ensure that students do more writing. NOTE – This resource does not provide advice on writing an action plan – see the Consider the Evidence web pages for resources on writing action plans.

Deciding on an action. Information will often suggest a number of options for action. How do we decide which action to choose? We need to consider: what control we have over the action; the likely impact of the action; the resources needed. These are the factors we need to consider when we are planning an intervention: Control – What aspects of the situation do we have most control over? Do we run a limited pilot rather than a full-scale intervention? Impact – What can we do that is most likely to have the desired impact? Do we play it safe and intervene only where we know we can make a major difference? Resources (time, money, people) – What will we need? What do we have? What other activities could be affected if we divert resources to this project?

Planning for action. Is this a major change to policy or processes? What other changes are being proposed? How soon can you make this change? How will you achieve wide buy-in? What time and resources will you need? Who will co-ordinate and monitor implementation? More factors to consider when we are planning an intervention.

Planning for action Is this an incremental change? Or are you just tweaking how you do things? How will you fit the change into your regular work? When can you start the intervention? Will you need extra resources? How will this change affect other things you do? How will you monitor implementation? More factors to consider when we are planning an intervention.

Timing is all How long should we run the intervention before we evaluate it? When is the best time of the year to start (and finish) in terms of measuring changes in student achievement? How much preparation time will we need to get maximum benefit? Remember that we are carrying out this action to see what impact it has. If our evaluation of the impact is to be meaningful, when we do it could be crucial.

Planning for evaluation. We are carrying out this action to see what impact it has on student achievement. We need to decide exactly how we'll know how successful the intervention has been. To do this we will need good baseline data. We need to decide in advance how we will evaluate the impact. To do this we will need valid and reliable baseline evidence. The baseline evidence we need for evaluation might be different from the data we analysed earlier in this project.

Planning for evaluation What evidence do we need to collect before we start? Do we need to collect evidence along the way, or just at the end? How can we be sure that any assessment at the end of the process will be comparable with assessment at the outset? How will we monitor any unintended effects? Don’t forget evidence such as timetables, student opinions, teacher observations … Some questions to consider as we think about the data we will need to evaluate the impact of our intervention.
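
One way to keep the baseline and end-point evidence comparable is to record both for the same students, on the same measure, collected in the same way. The sketch below is an illustration added here, not part of the original resource: the file names, column names and marking scale are assumptions. It simply pairs each student's baseline and end-of-intervention marks so that the later comparison is like-for-like.

import pandas as pd

# Hypothetical: the same writing task, marked on the same scale, before and
# after the intervention, keyed by student ID so rows can be matched.
baseline = pd.read_csv("writing_baseline.csv")      # columns: student_id, mark
follow_up = pd.read_csv("writing_follow_up.csv")    # columns: student_id, mark

paired = baseline.merge(follow_up, on="student_id", suffixes=("_baseline", "_end"))
paired["gain"] = paired["mark_end"] - paired["mark_baseline"]

# Students assessed at the start but missing at the end (or vice versa) are
# dropped by the merge - worth noting, because attrition itself is evidence.
print(paired[["mark_baseline", "mark_end", "gain"]].describe())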

The evidence-driven decision making cycle (current stage: Evaluate): Trigger (clues found in data, hunches) → Explore (is there really an issue?) → Question (what do you want to know?) → Assemble (get all useful evidence together) → Analyse (process data and other evidence) → Interpret (what information do you have?) → Intervene (design and carry out action) → Evaluate (what was the impact?) → Reflect (what will we change?). We now leap to the final stages of the process – we have carried out the intervention and evaluated its impact against baseline data.

Evaluate the impact of our action Did the intervention improve the situation that triggered the process? If the aim was to improve student achievement, did that happen? Now we have to decide how effective the intervention has been. These are the central questions.

Evaluate the impact of our action Was any change in student achievement significant? What else happened that we didn’t expect? How do our results compare with other similar studies we can find? Does the result give us the confidence to make the change permanent? When we evaluate the impact of our intervention, we need to ask the same sort of questions that we asked earlier in the process. That is, we need to interrogate our evaluation. We must not leap to conclusions. The final question is the crucial one: Has the intervention been effective enough to justify incorporating the change into our normal practice?
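
Where the achievement measure is numeric and the same students were assessed before and after the intervention, a paired comparison is one simple way to ask whether the change is more than noise. The sketch below is illustrative only (the file and column names are assumptions, and many schools will rely on standards-based or qualitative evidence instead); statistical significance still has to be weighed alongside the practical size of the gain.

import pandas as pd
from scipy import stats

# Hypothetical file: one row per student with matched baseline and
# post-intervention marks on the same scale.
scores = pd.read_csv("writing_scores.csv")

t_stat, p_value = stats.ttest_rel(scores["post"], scores["baseline"])
mean_gain = (scores["post"] - scores["baseline"]).mean()

print(f"Mean gain: {mean_gain:.2f} marks (t = {t_stat:.2f}, p = {p_value:.3f})")
# A small p-value suggests the gain is unlikely to be chance alone, but the
# professional judgement about whether it justifies a permanent change remains.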

Evaluate the impact of our action A school created a new year 13 art programme. In the past students had been offered standard Design and Painting programmes, internally and externally assessed against the full range of achievement standards. Some students had to produce two folios for assessment and were unsure of where to take their art after leaving school. The new programme blended drawing, design and painting concepts and focused on electronic media. Assessment was against internally assessed standards only. Where we simply change our teaching approach, we’ll compare student achievement data to evaluate the impact of the change. But more radical changes are more difficult to evaluate - a wider range of evaluation approaches will be needed. How could the school evaluate the impact of this change? Suggestions on the next slide.

Evaluate the impact of our action. Did students complete more assessments? Did students gain more national assessment credits? How did student perceptions of workload and satisfaction compare with teacher perceptions from the previous year? Did students leave school with clearer intentions about where to go next with their art than the previous cohort? How did teachers and parents feel about the change? Some suggestions from the previous slide. Bullet 2: If the school had decided on the change earlier they could have collected student perceptions evidence from the previous cohort. Bullet 3: For this measure the school was able to collect evidence from the previous cohort.

Evaluate the intervention How well did we design and carry out the intervention? Would we do anything differently if we did it again? Were our results affected by anything that happened during the intervention period - within or beyond our control? Did we ask the right question in the first place? How useful was our question? How adequate were our evaluation data? Whether the answer to the last question on the previous slide (Does the result give us the confidence to make the change permanent?) is Yes or No, we should think about the intervention itself. These questions apply particularly if our intervention seems not to have worked – maybe there was a reason for that. Maybe it was not a complete waste of time.

Think about the process Did we ask the right question in the first place? How useful was our question? Did we select the right data? Could we have used other evidence? Did the intervention work well? Could we have done anything differently? Did we interpret the data-based information correctly? How adequate were our evaluation data? Did the outcome justify the effort we put into it? Before we move on, we should look back over the process to see what we can learn from it.

The evidence-driven decision making cycle (current stage: Reflect): Trigger (clues found in data, hunches) → Explore (is there really an issue?) → Question (what do you want to know?) → Assemble (get all useful evidence together) → Analyse (process data and other evidence) → Interpret (what information do you have?) → Intervene (design and carry out action) → Evaluate (what was the impact?) → Reflect (what will we change?). This is the final step: we have trialled a change to our practice – how much of that can we embed in future practice?

Future practice What aspects of the intervention will we embed in future practice? What aspects of the intervention will have the greatest impact? What aspects of the intervention can we maintain over time? What changes can we build into the way we do things in our school? Would there be any side-effects? Even if things didn’t go exactly as we planned them – even if student achievement wasn’t greatly improved - there are probably some things we have learnt that we should incorporate into future practice. We need to be realistic about what we can achieve – we need to be sure we could maintain the intervention. We should also think about any side-effects – if we put time and effort into this change, will anything else suffer?

Future directions. What professional learning is needed? Who would most benefit from it? Do we have the expertise we need in-house or do we need external help? What other resources do we need? What disadvantages could there be? When will we evaluate this change again? Before we decide to go ahead and embed aspects of the intervention into our practice, we need to look at all the ramifications – and think about starting the cycle all over again. End of presentation. The next step should be for the group to discuss how this model can be applied in this school, department or faculty. This should include a discussion about what evidence already exists in the school, how this is collected and recorded, and how well equipped the school is to analyse and use it in the interests of improving student achievement. Finally, the group should consider what evidence-driven projects the school could undertake. Beware of collecting data and other evidence that you might not need. If the school thinks ahead about how to make evidence-based decisions, you will know what data and other evidence you should collect.