8th eSTEeM Annual Conference May 2019

Presentation transcript:

8th eSTEeM Annual Conference, 8-9 May 2019
Using technology-enabled learning networks to drive module improvements in STEM
Lesley Boyd, PhD Researcher, IET
Rob Janes, Module Chair, S215
Tom Olney, Senior Manager, Teaching & Learning, STEM

Using technology-enabled learning networks to drive module improvements in STEM
An action research story of collaborative participation in problem solving and improvement, and putting ALs ‘close to the solution’.
What’s a learning network?
Methodology
A brief history of the project
Phase 1 (17J) & Phase 2 (18J)
Results so far
What are the next steps?

What’s a learning network?
Task driven
Technology-enabled
Collaborative
Structured
Connecting together disparate practitioners across our different contexts and boundaries, e.g. ALs, module teams, staff tutors, Learning Design
Aiming for a particular practical improvement outcome
No known right answer

Methodology
Technology-enabled participatory action research
Learn together in an unfolding and emergent process
Equitable
Collaborative
Joint ownership: discussion, action planning, implementation & evaluation
Underpinned by Grounded Theory Method (GTM)
Exploring a new conceptual framework regarding the unfolding process, and driving round the progressive action research cycles in a structured and rigorous manner.

Action research table top model Source: Coghlan, D. and Brannick, T. (2014) Doing Action Research in Your Own Organisation, London, Sage.

A collaborative conversation Image: Getty Images/iStockphoto

Methodology – why GTM?
A theory-building, not theory-testing or theory-verification, methodology, with systematic data collection and analysis.
‘Far better to allow the data to tell its own story in the first instance, build a theory, then, subsequently, engage your theory with the theory that you thought you might impose initially. You can see if your emergent theory confirms or challenges existing theories. So, potentially GTM has a huge role to play in theory building, in all disciplines.’
Urquhart, C. (2013) Grounded Theory for Qualitative Research: A Practical Guide, London, Sage.
Sannino and Engeström (2017) describe ‘looking in vain’ for recent discussions of ‘theoretically and methodologically ambitious approaches’ to intervention research in major journals.
Sannino, A. and Engeström, Y. (2017) ‘Co-generation of societally impactful knowledge in Change Laboratories’, Management Learning, vol. 48, no. 1, pp. 80-96.

Brief history of the project
Phase 1: learning networks hosted in dedicated VLE sites for each of three pilot modules, focused on Tricky Topics.
Discussion forums and online workshops were used to seek feedback from tutors, in order to collaboratively identify Tricky Topics and suggest improvements or produce learning interventions.
S215 ALs and the module team very successfully identified a list of conceptual Tricky Topics, plus a list of additional issues including pace and volume of material.
ALs designed and implemented four innovative Tricky Topics intervention videos, which have been in use on the 17J and 18J module websites, shared with other modules and emulated elsewhere.
Phase 2: a second cycle of collaborative work, building on the analysis from the first cycle in S215. Since tutors had identified concerns about pace and volume of material, an online workshop and discussion forum shared and interpreted specialised learning design analytics visualisations with tutors, with the aim of identifying areas for further new interventions.

Learning Network site

Using learning design analytics
The three visualisations discussed with S215 ALs were:
expected student workload by activity type
comparison of expected student workload and MT advised workload per week
comparison of expected student workload by activity type and average VLE engagement
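The third comparison above can be pictured as a simple planned-versus-observed gap per activity type. The following is an illustrative sketch only: the actual learning design analytics tooling and data are internal to the OU, and all activity types and hour figures below are invented for demonstration.

```python
# Illustrative sketch: compare planned workload per activity type with
# average observed VLE engagement. All figures are hypothetical.

planned_hours = {            # hypothetical module learning design figures
    "Assimilative": 120,
    "Productive": 40,
    "Assessment": 30,
    "Interactive": 10,
}
observed_hours = {           # hypothetical average VLE engagement
    "Assimilative": 85,
    "Productive": 25,
    "Assessment": 28,
    "Interactive": 6,
}

# Report the gap between design intention and observed engagement,
# which is the kind of discrepancy the visualisations surface for discussion.
for activity in planned_hours:
    planned = planned_hours[activity]
    observed = observed_hours[activity]
    gap = planned - observed
    print(f"{activity:12s} planned {planned:4d}h  observed {observed:4d}h  gap {gap:+d}h")
```

A large positive gap (e.g. for assimilative material) would feed into exactly the kind of pace-and-volume discussion described in the learning network forums.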

Results so far – Phase 2
The learning network discussions have highlighted a number of issues, represented in an interactive spreadsheet which organises the supporting qualitative evidence. These issues include pace and volume of material, prerequisite knowledge, and online/offline study behaviour. All these issues have contributed towards the planning of four actions:
production and trialling of ‘signposting’ material for Blocks 9 and 10
finding out more about student preparedness and study choices before S215
finding out more about online/offline study behaviour, and which resources students download
clarifying issues regarding the use of OU Analyse
The trial signposting materials have been produced by an AL, reviewed by the module team, and implemented in the current presentation.

Analysis – interactive spreadsheet
Two years of qualitative feedback from the 17J and 18J learning network discussion forums is now represented, for both years, in a new navigable interactive spreadsheet.

Results so far – Phase 2
Two Study Pathway Analysis reports, for 17J and 18J, have been produced for the project by a STEM Data Wrangler. These reports illustrate the module combinations and presentations taken by 17J and 18J students before S215. They show an extremely scattered picture of previous study pathways, taken over many years. In 18J, out of 160 students in total, there were 71 different pathways; all but the first 7 were unique to one student. 32% (51/160) of students followed the recommended route of S111+S112. The Module Chair has collected some informal feedback from a selection of students at the recent Residential Schools, with initial positive feedback on the trial signposting documents, and underscoring the concerns over pace and volume of material.
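The pathway figures above (71 distinct pathways among 160 students; 51/160 on the recommended route) amount to tallying module sequences per student. A minimal sketch of that tally, with a small invented dataset standing in for the real OU study records used by the Data Wrangler:

```python
# Illustrative sketch with invented data: tally distinct study pathways
# taken before S215 and the share following the recommended S111+S112 route.
from collections import Counter

# Each entry is one (hypothetical) student's sequence of prior modules.
student_pathways = [
    ("S111", "S112"),
    ("S111", "S112"),
    ("S111",),
    ("S104", "S141"),
    ("SM123",),
]

pathway_counts = Counter(student_pathways)
total = len(student_pathways)
recommended = pathway_counts[("S111", "S112")]

print(f"{len(pathway_counts)} distinct pathways among {total} students")
print(f"{recommended}/{total} followed the recommended S111+S112 route "
      f"({100 * recommended / total:.0f}%)")
```

Applied to the real 18J cohort, the same tally yields the 71-pathway, 32% figures reported above.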

Next steps
The next stage is currently being undertaken: to consider all the data and possible actions, and to put some questions directly to students, using the Real Time Student Feedback Tool (RTSF), before the students sit their exam in early June. The project now has organised qualitative longitudinal evidence from the ALs, staff tutors and module team in 17J and 18J, and is looking to corroborate this with direct feedback from students. During the RTSF exercise, students will be asked whether they would like to provide further individual or discussion group feedback after the exam.
Evaluations will cover:
effectiveness of each action research cycle
collaboration and joint ownership
integration of evaluations at intervention level, module level, qualification level, and organisational learning or systemisation level
Finally, we will evaluate whether this approach can be extended to other modules (one is already under consideration).

Next steps
The acquired data and qualitative feedback may help to inform future module-wide improvement actions and adjustments to the learning design, to support AL teaching practice, and to answer the questions and issues raised by ALs’ participation in the learning network thus far. All the data, learning analytics visualisations, discussion forums and analysis of the issues, with supporting evidence, are held in a dedicated VLE site for S215, which is accessible to all participants and stakeholders. The grounded theory analysis will be extended to consolidate and strengthen the emerging conceptual framework of technology-enabled organisational learning, and compared back to other existing and emerging conceptual frameworks in the literature. We will also evaluate the use of GTM to explore a new conceptual framework and to drive round progressive AR cycles in a structured and rigorous manner: does it yield actionable knowledge that is usable by practitioners whilst being sufficiently theoretically robust? (Coghlan and Brannick, 2014).

Thank you Any questions? lesley.boyd@open.ac.uk rob.janes@open.ac.uk tom.olney@open.ac.uk