Evidence-based Practice (EBP) – an introduction
Three objectives:
1. What is EBP?
2. Where has EBP come from (and why)?
3. Risks and rewards of EBP?
What is EBP? Where has EBP come from (and why)?
EBP – a definition The term evidence-based practice (EBP) refers to preferential use of mental and behavioural health interventions for which systematic empirical research has provided evidence of statistically significant effectiveness as treatments for specific problems. Wikipedia
Box of shocks: this hand-cranked electrotherapeutic machine was designed in the early 1900s so that patients could give themselves shock therapy in the comfort of their own home. Some used it for toothache, while others used it to ease nerve pain and tics.
EBP – the intent Evidence based practice (EBP) is an approach which tries to specify the way in which professionals or other decision-makers should make decisions by identifying such evidence that there may be for a practice, and rating it according to how scientifically sound it may be. Its goal is to eliminate unsound or excessively risky practices in favour of those that have better outcomes. Wikipedia
“Facts are meaningless. You could use facts to prove anything that's even remotely true!” Homer Simpson
EBP in other contexts:
Defence
Human resources (HR)
EBP in NZ schools
Legislative context: NAG1, NAG2, Education Standards Act
Ministry advice: Planning for better student outcomes; Consider the evidence
Consider the Evidence: evidence-driven decision making for secondary schools. A resource to assist schools to review their use of data and other evidence.
The evidence-driven decision making cycle (an example):
Trigger: Some of our students are poor at writing. A teacher has a hunch that poor writers might spend little time on homework.
Question: What are the characteristics of students who are poor at writing?
Explore data: Analyse NQF/NCEA results by standard; a survey of students shows the hunch is only partially true.
Assemble more data and other evidence: asTTle reading, homework, extracurricular activities, attendance, etc.; analyse non-NQF/NCEA data and evidence.
Interpret information: Poor writers are likely to play sport, speak well, read less, and do little homework.
Intervene: Create multiple opportunities for writing; include topics that can use sport as context; connect speaking and writing; PD for staff.
Evaluate: Has writing improved?
Reflect: How will we teach writing in the future?
Evidence-driven strategic planning (an example):
Indicators from data: asTTle scores show a high proportion of Year 9 achieving below curriculum level; NCEA results show high non-achievement in transactional writing; poor results in other language NCEA standards; etc.
Strategic goal: To raise the levels of writing across the school.
Strategic action: Develop a writing development plan which addresses writing across subjects and levels, including targets, professional development and other resourcing needs, etc.
Annual plan: Develop and implement a plan to raise levels of writing at Year 9. The development plan is to be based on an analysis of all available data and to include a range of shared strategies, etc.
Year target: Raise writing asTTle results for Year 9 boys from 3B to 3A, etc.
Supporting processes: appraisal, PD, self review, school charter.
Evaluation data: asTTle writing results improve by …; perception data from Year 9 staff indicates …; evaluation of effectiveness of the range of shared strategies, barriers and enablers …; etc.
EBP – a definition The term evidence-based practice (EBP) refers to preferential use of mental and behavioural health interventions for which systematic empirical research has provided evidence of statistically significant effectiveness as treatments for specific problems. Wikipedia
What is EBP? Where has EBP come from (and why)?
Risks and rewards of EBP:
For students
For teachers
For schools
Not everything that counts can be counted, and not everything that can be counted counts! Albert Einstein Do we measure what we value or are we valuing what we can measure? Julia Atkin, 2004
ASSESSMENT TENSIONS and DILEMMAS:
quantity vs quality
objective vs subjective
standardise vs raise standards
individual vs collaborative
material vs spiritual
technical vs ‘soulful’
formal vs informal
simple vs complex
Julia Atkin, 2004
A rationalist, positivist ‘world view’ focuses on the left-hand side of the previous slide: on aspects which are tangible and that can be quantified and measured on linear scales. Our challenge is that deep learning, learning with spirit, dynamic learning, transformative learning embraces both sides … it is all about the integration of the two columns. It is not ‘either-or’ - it’s ‘both-and’. I encourage you to re-think assessment by clarifying what we value and believe about learning, teaching and assessing - by starting with WHY. Why assess? What do we value and believe about learning and teaching, and what are the implications for assessment? Julia Atkin, 2004
‘The right-hand side of the tension-point list demands qualitative forms of assessment. In addition to performance (as in authentic assessment), qualitative forms of assessment involve holistic media such as narrative and image.’ Julia Atkin, 2004
What did I learn today? My mother will want to know.
Can assessment raise standards? Recent research has shown that the answer to this question is an unequivocal ‘yes’. Assessment is one of the most powerful educational tools for promoting effective learning. Kay Hawk, 2006
But it must be used in the right way. There is no evidence that increasing the amount of testing will enhance learning. Instead the focus needs to be on helping teachers use assessment, as part of teaching and learning, in ways that will raise pupils’ achievement. Kay Hawk, 2006
The research tells us that successful learning occurs when learners:
have ownership of their learning
understand the goals they are aiming for
are motivated and have the skills to achieve success.
Kay Hawk, 2006
[Figure from Earl, L. (2006)]
Three types of assessment: diagnostic, formative, summative.
Earl, L. (2006):
Done for students
Done with students
Done to students
[Figure from Earl, L. (2006)]
What do schools need to do to maximise the benefits of good assessment practice?
1. Understand theory and principles
2. Use current research
3. Cut out unnecessary or unused assessment
4. Embed good formative practice
5. Visit other schools
6. Explode the myths
7. Empower students
Kay Hawk, 2006
Risks and rewards of EBP?
For students?
For teachers?
For schools?
Good news:
a range of effective tools
good research on what makes a difference
MOE assessment contracts
cluster initiatives to help bridge the educational islands
an NCEA system which allows flexibility
Kay Hawk, 2006
A final thought: Earl, L. (2006)
Evidence-based Practice (EBP) – an introduction What is EBP? Where has EBP come from (and why)? Risks and Rewards of EBP?
References:
Atkin, J. (2004). Reassessing assessment: Beyond benchmarking the benchmarks. NavCon 2k4 Conference.
Earl, L. (2006). Rethinking classroom assessment with purpose in mind. Aporia Consulting Ltd, OISE/UT.
Hammersley, M. (2001). Some questions about evidence-based practice in education. Annual Conference of the British Educational Research Association.
Hawk, K. (2006). Assessment in 2006: What? and how? ULearn06 Conference.
Ministry of Education. (n.d.). Consider the evidence. Retrieved from http://www.tki.org.nz/r/governance/consider/index_e.php
Ministry of Education. (2003). Planning for better student outcomes. Retrieved from http://www.minedu.govt.nz/~/media/MinEdu/Files/EducationSectors/PrimarySecondary/SchoolOpsPlanningReporting/PlanningBetterStudentOutcomesSept2003.pdf