Christine Gardner, Allan Jones, David Chapman, Helen Jefferis

Presentation transcript:

Analytics for tracking student engagement (TM355, Communications Technology)
Christine Gardner, Allan Jones, David Chapman, Helen Jefferis
Give a short overview of who we are, and some TM355 detail, e.g. Level 3, Computing & Communications, mainly print-based module. This study will explore the use of specific technology enhanced learning and teaching (TELT) resources on module TM355, using A4A, a data analysis tool developed by the Open University. State clearly and concisely what we hope participants will achieve: there was a problem, then a reflection as to why it happened, and we are on the path to a solution; stating this clearly and concisely helps attract people to the session and gives them a sense of why they should attend. This is a 'work-in-progress report' more than a 'short presentation'.

Two aspects to the project
Analytics: use of teaching tools on the TM355 VLE (2016/17 and 2017/18 cohorts):
- When do the students engage with the technology enhanced learning and teaching (TELT) tools, and is this at predicted times during the module?
- Do students revisit the TELT tools?
- Has the level of student engagement with the tools changed over the two presentations (16/17 and 17/18)?
Interviews: student feedback on motivations and reasons for using TELT tools (2017/18 cohort):
- What motivates OU students to engage with TELT tools?
- Do students understand a topic more deeply as a result of using TELT tools?
- Are students deterred if the packages are too complicated or time consuming?
The research questions cover two key areas: the effectiveness of the analytics tools and students' perception of the TELT resources. By evaluating a specific Level 3 module in the School of Computing and Communications, it is hoped that analytics can be used to best effect, informing module teams and thus helping students achieve their maximum potential. Interviews will also be needed for the finer detail.

Analytics for Action, A4A
A4A can provide detail of how students are engaging with specific online materials. A4A has six phases. Give an outline of A4A and what it can and cannot do. At the moment we are reviewing data; we need more detail on individual performance before we can investigate issues. The university is currently proposing an increased use of analytics to support students. For example, SFT proposes that students "learn in state-of-the-art digital environments, using data analytics to understand and drive their learning. Through using digital tools, learners will have the opportunity to develop digital competencies, aligned with our commitment to digital inclusion…" However, A4A does not automatically identify activity at an individual student level.
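At aggregate level, the "when do students engage, and do they revisit?" questions come down to counting page views per week and per student. A minimal stdlib-Python sketch of that kind of aggregation (the log format, dates and student IDs here are invented for illustration; the real A4A pipeline is not described in the slides):

```python
from collections import Counter
from datetime import date

# Hypothetical click log: (student id, date the TELT page was opened).
# Invented data; real A4A data is aggregated by the OU in its own format.
log = [
    ("s1", date(2017, 3, 6)), ("s2", date(2017, 3, 7)),
    ("s1", date(2017, 5, 29)), ("s3", date(2017, 5, 30)),
    ("s2", date(2017, 5, 31)),
]

# When do students engage? Count views per ISO week and find the peak
# (e.g. a TMA deadline, or the May/June revision period).
views_per_week = Counter(d.isocalendar()[1] for _, d in log)
peak_week, peak_views = views_per_week.most_common(1)[0]

# Do students revisit? Flag anyone with more than one logged visit.
returners = sorted(s for s, n in Counter(s for s, _ in log).items() if n > 1)
```

Grouping by ISO week is one simple choice; grouping by module week or block would map more directly onto the study calendar.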

Analytics for Action, A4A – detail
DCT – Discrete Cosine Transform; Hamming codes. Demonstrate the different kinds of patterns of TELT tool use that can be seen via A4A: the DCT tool peaks at TMA time (blue bar), at about 80; the Hamming tool is used across Block 2, peaking at about 25. Neither has a big peak at revision time. Could mention student exam performance. We need to know more detail about student use of the TELT tools, e.g. do students use the tools more than once?
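For context, the Hamming-code TELT tool covers single-error-correcting codes of the kind sketched below. This is a generic Hamming(7,4) illustration, not the module's actual tool:

```python
def hamming74_encode(d):
    """Encode 4 data bits as a Hamming(7,4) codeword [p1, p2, d1, p3, d2, d3, d4]."""
    d1, d2, d3, d4 = d
    p1 = d1 ^ d2 ^ d4  # covers codeword positions 1, 3, 5, 7
    p2 = d1 ^ d3 ^ d4  # covers codeword positions 2, 3, 6, 7
    p3 = d2 ^ d3 ^ d4  # covers codeword positions 4, 5, 6, 7
    return [p1, p2, d1, p3, d2, d3, d4]

def hamming74_correct(c):
    """Correct a single bit error; the syndrome gives the error position (1-based)."""
    c = c[:]
    s1 = c[0] ^ c[2] ^ c[4] ^ c[6]
    s2 = c[1] ^ c[2] ^ c[5] ^ c[6]
    s3 = c[3] ^ c[4] ^ c[5] ^ c[6]
    syndrome = s1 + 2 * s2 + 4 * s3
    if syndrome:
        c[syndrome - 1] ^= 1
    return c

codeword = hamming74_encode([1, 0, 1, 1])   # [0, 1, 1, 0, 0, 1, 1]
corrupted = codeword[:]
corrupted[4] ^= 1                           # flip one bit in transmission
recovered = hamming74_correct(corrupted)    # same as codeword again
```

An interactive tool built on this idea lets students flip a bit themselves and watch the syndrome locate it, which is the kind of self-testing interviewees said they valued.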

Key hypothesis
A key hypothesis is that those who engage with the TELT tools tend to perform better at exam time. Much time and resource has been invested in producing the TELT tools to supplement the printed materials, so there is a question as to why they are not being used more extensively.

In particular…
The prompt for this query was students' relatively poor performance on a particular exam question in the 2016/17 presentation of TM355: Question 10, based on error control codes and linked to a specific TELT tool. Using A4A it could be seen that the associated TELT tool had not been used extensively, either during the module or for revision. Question 10 was not a popular choice with students.

More detail is needed: error control codes. Steady use across Block 2, with a small peak at revision.

Phase 1: Analytics data, July 2017 to July 2018
- Collection of analytics data (A4A) on TELT tool use during the 2016/17 module presentation
- Identification of a sub-set of students for further research from exam data
- Consultation with the TEL team to interrogate the analytics data more deeply
- Design of interview questions for the 2017/18 cohort and submission of SRPP application
- Analysis of analytics data (July 2017 to May 2018)

Initial data from 2016/17 exam
TM355 exam scores for 2016/17, Q10 – small sample of 48 students (all who attempted Q10):
- Average exam score overall for all students – 45%
- Had not used the error control codes TELT tool at all – 30%
- Used the error control codes TELT tool during the module, at least once – 53%
- Used the error control codes TELT tool specifically at revision (May/June) – 52%
- Used the error control codes TELT tool on more than one date, i.e. returned to the package – 58%
This data tells us who visited this page (Online activity 1.5 Error-control codes), when, and how many times during the day.
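Reading the figures above as average exam scores per usage category, the same breakdown can be reproduced from per-student visit records with a grouping like the following. The data, the record format, and the May/June revision test are all invented for illustration; the real A4A export is not shown in the slides:

```python
from datetime import date
from statistics import mean

# Hypothetical records: exam score, plus the distinct dates on which the
# student opened the error-control-codes TELT activity (empty = never used).
students = {
    "s1": {"score": 62, "visits": [date(2017, 2, 1), date(2017, 5, 30)]},
    "s2": {"score": 55, "visits": [date(2017, 3, 14)]},
    "s3": {"score": 31, "visits": []},
    "s4": {"score": 48, "visits": [date(2017, 6, 2)]},
}

def avg_score(keep):
    """Average exam score over the students selected by the predicate."""
    scores = [s["score"] for s in students.values() if keep(s)]
    return round(mean(scores)) if scores else None

never_used = avg_score(lambda s: not s["visits"])
used_at_least_once = avg_score(lambda s: s["visits"])
returned_to_package = avg_score(lambda s: len(s["visits"]) > 1)
used_at_revision = avg_score(
    lambda s: any(d.month in (5, 6) for d in s["visits"]))
```

Even on made-up numbers, the shape of the comparison matches the slide: each group is selected by its usage pattern, then averaged.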

Predicted results
48 students were in the sample answering Q10; predicted probability of passing the module: 0.87. 271 students did not answer Q10; predicted probability of passing the module: 0.85. Our small sample is typical of the cohort, with no reason to perform poorly based on track record.

Phase 2: Interview data, TM355 2017/18 presentation
Six students from the 2017/18 cohort volunteered for interview. Interviews took place in July 2018, and the interview data was analysed (June–Oct 2018). Of the six volunteers, unfortunately only four were actually available. We had intended to use the SfB recording facility; however, we found that this only works when phoning internal numbers, i.e. it worked between Chris and me but not as soon as I actually tried to use it. In the end we had three live interviews and one student who emailed his answers to us; he had moved back to Greece and SfB wouldn't allow me to dial him. Overall a useful 'pilot'.

Interview comments:
Seeing the coding in practice and having an interaction helped. Good for self-testing, noted by several students. Why wouldn't students use the resources? One was just a video clip and didn't add anything to existing knowledge. Even though there were not many interviewees, those who were interviewed gave some full and useful answers and insights into how they used the activities. The main point was 'why wouldn't students use these resources?', although one did admit that he hadn't used them all! Some were less well received, e.g. if they were just a video and not actually interactive, and there were some good suggestions (next slide).

Ideas to progress:
- Give some indication of the time needed for the activities (although obviously this will vary for each student)
- Add descriptions of what kind of activity each one is
- Promote them in a new module introductory/revision video or podcast
- Use the module forums to promote them
- Have 'talking heads' of students saying how useful they were
So: more information on what the activities involve, and more promotion to raise awareness.

Phase 3: Action/Dissemination
- Produce descriptions/timings for activities
- Add advice in a revision podcast
- Interview the 18J cohort of students
- Review use of TELT tools for the 18J presentation
- Consider additional guidance for the mid-life review of the module
TELT: technology enhanced learning and teaching

Questions? Contact:
Christine Gardner – c.f.gardner@open.ac.uk
Allan Jones – allan.jones@open.ac.uk
David Chapman – david.chapman@open.ac.uk
Helen Jefferis – h.jefferis@open.ac.uk