Getting Practical Science transition project

Presentation transcript:

Getting Practical Science transition project Session 3

Session 3 Outcomes Review the evidence for the effectiveness of a sample of practical activities. Consider the best model for disseminating training in schools and across clusters. Customise the training package so that it meets the needs of individual institutions.

Reflecting on the gap task In Key Stage groups, discuss the feedback from pupils. Be prepared to feed back key points to the rest of the group.

Reflecting on the effectiveness of practical work
The graphic on this slide is taken from Robin Millar’s work, including Analysing Practical Science Activities to assess and improve their effectiveness, Association for Science Education (2010). It sets out four boxes and two levels of effectiveness:
A. Teacher's objectives: what the pupils are intended to learn
B. Task specification: what the pupils are intended to do
C. Classroom events: what the pupils actually do
D. Learning outcomes: what the pupils actually learn
Effectiveness at Level 1: Did pupils do what they were intended to do (and see the things they were meant to see)?
Effectiveness at Level 2: Did pupils learn (and can they later show understanding of) what they were intended to learn?

The ideas behind the graphic underpin the aims of the project, which are to consider the effectiveness of any practical activity, by which we mean: (a) are the pupils doing the things they were intended to do (and hence seeing the things they were meant to see); and (b) are the pupils learning (and can they later recall, or demonstrate understanding of) the things they were meant to learn?

Box A refers to the learning outcomes we are aiming to cover in a particular learning episode. Box B refers to the task which we choose, and the instructions we give, in order to achieve the learning outcomes. Box C refers to what our pupils actually do during the task; this may not be what we intended when specifying the task (Box B). Box D refers to what pupils actually learn from the task; again, this may not always be what we intended them to learn (Box A). The two levels of effectiveness are opportunities for reflection on the effectiveness of practical work. Problems at Level 1 might involve changes in the staging of the task, or intervention during the task. Problems at Level 2 might involve changing the task altogether in order to better achieve the intended learning outcomes.

These ideas will be explored further, so there is no need to spend more than a couple of minutes discussing them now. However, it might be worth asking delegates what they think of this flow chart. Some questions you might ask include: Do these four boxes accurately represent the process we go through when choosing and/or designing a practical? Who decides on the intended learning outcomes (i.e. Box A) for any given practical? Why would pupils not necessarily do what we expected them to do when we designed the task? Does this matter? If so, why (or why not)? Why would pupils not necessarily learn what we expected them to learn from the practical? Again, does this matter? If so, why?

Presenting your vision for science Reflect on your vision for science in your school. Decide what training implications this has. Prepare a pitch to deliver to your leadership team and governors about the future of science in your school; use the prompt cards to help you.

Triad madness! Get into triads and present your pitch to each other. Discuss the pitches, give feedback, identify any problems with implementing the training, and suggest possible solutions.

Overcoming barriers to change Categories of adopters of innovations
Many of you are familiar with Rogers’ diffusion of innovation theory, even if you didn’t know its formal name. Dr. Everett Rogers is probably most famous for popularizing the adopter-category bell curve shown on this slide [from http://susanlucas.com/it/images/categories.gif]. We often think of this bell curve when we initiate new technology initiatives: Who are the innovators that will jump at this first? When do we start involving folks other than the early adopters? How do we get the rest of the folks (i.e., the late majority and the laggards) on board? And so on.

But Rogers also talked about how the adoption of any innovation (i.e., change) tends to occur in five stages: awareness, interest, evaluation, trial, and adoption. And, importantly, he also discussed what he called perceived characteristics of innovations. These are things considered by potential adopters that affect how likely those potential adopters are to move from awareness to adoption. They are: relative advantage (the ‘degree to which an innovation is perceived as being better than the idea it supersedes’); compatibility (‘the degree to which an innovation is perceived to be consistent with the existing values, past experiences and needs of potential adopters’); complexity (‘the degree to which an innovation is perceived as difficult to use’); trialability (‘the opportunity to experiment with the innovation on a limited basis’); and observability (‘the degree to which the results of an innovation are visible to others’). Innovations that have greater relative advantage, compatibility, trialability, and observability, along with less complexity, generally will be adopted over innovations that do not.

Numerous school technology initiatives fail to result in widespread changes in educator practice. One prevalent reason is that they did not adequately address the very rational concerns that educators have about one or more of these perceived characteristics of innovations. Anyone who is trying to make wide-scale change happen in their school system must address these sufficiently to alleviate the concerns of the late majority and laggards. Otherwise only the innovators and the early adopters will jump on board, along with some, but not all, of the early majority. I’m sure all of you can think of instances of this and hope you will share some.

What are the qualities that make innovations spread?
Relative advantage
Compatibility with existing values and practices
Simplicity
Trialability
Observable results

Reasons innovations fail
High cost / low benefit: people don’t want to invest money or time in them, usually because they perceive the costs to outweigh the benefits.
Communication: poor marketing or explanation (e.g. fails to address WIIFM, ‘What’s in it for me?’).
Demand: there may be no market or audience.

Designing a CPD package to use in your school Use the proforma and website to design your CPD package. http://gettingpractical.wikispaces.com/

Discussion and Feedback In small groups discuss your CPD plan. Are there any points from other plans that you would like to include in yours?