Analytics for tracking student engagement (TM355, Communications Technology)
Christine Gardner, Allan Jones, David Chapman, Helen Jefferis

Notes: Give a short overview of who we are and some TM355 detail, e.g. Level 3, Computing & Communications, mainly print-based module. This study will explore the use of specific technology enhanced learning and teaching (TELT) resources on module TM355, using A4A, a data analysis tool developed by the Open University. State clearly and concisely what we hope participants will achieve: there was a problem, then a reflection on why it happened, and we are on the path to a solution. Stating this clearly and concisely helps attract people to the session and gives them a sense of why they should attend. This is a 'work-in-progress report' rather than a 'short presentation'.
Two aspects to the project
Analytics: use of teaching tools on the TM355 VLE (2016/17 and 2017/18 cohorts):
- When do students engage with the technology enhanced learning and teaching (TELT) tools, and is this at predicted times during the module?
- Do students revisit the TELT tools?
- Has the level of student engagement with the tools changed over the two presentations (16/17 and 17/18)?

Interviews: student feedback on motivations and reasons for using TELT tools (2017/18 cohort):
- What motivates OU students to engage with TELT tools?
- Do students understand a topic more deeply as a result of using TELT tools?
- Are students deterred if the packages are too complicated or time-consuming?

Notes: The research questions cover two key areas: the effectiveness of the analytics tools and students' perception of the TELT resources. By evaluating a specific Level 3 module in the School of Computing and Communications, it is hoped that analytics can be used to best effect, informing module teams and thus helping students achieve their maximum potential. Interviews will also be needed for the finer detail.
Analytics for Action, A4A
A4A can provide detail of how students are engaging with specific online materials. A4A has six phases.

Notes: Give an outline of A4A and what it can and can't do. At the moment we are reviewing data; we need more detail on individual performance before we can investigate issues. The university is currently proposing an increased use of analytics to support students. For example, SFT proposes that students "learn in state-of-the-art digital environments, using data analytics to understand and drive their learning. Through using digital tools, learners will have the opportunity to develop digital competencies, aligned with our commitment to digital inclusion…" However, A4A does not automatically identify activity at an individual student level.
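The page-level aggregation that a tool like A4A surfaces can be sketched in a few lines. This is purely an illustration of the idea, not the real A4A implementation; the page names and dates below are invented for the example.

```python
from collections import Counter
from datetime import date

# Hypothetical (page, day) view events, as a VLE analytics log might record them
events = [
    ("error_control_codes", date(2017, 5, 20)),
    ("error_control_codes", date(2017, 5, 20)),
    ("error_control_codes", date(2017, 3, 1)),
    ("dct_tool", date(2017, 3, 1)),
]

# Count visits to each page per day, without identifying individual students
daily_visits = Counter(events)
print(daily_visits[("error_control_codes", date(2017, 5, 20))])  # -> 2
```

A dashboard view of a module then amounts to plotting these per-day counts over the presentation, which is where patterns such as TMA-time or revision-time peaks become visible.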
Analytics for Action, A4A – detail
DCT (Discrete Cosine Transform) and Hamming codes.

Notes: Demonstrate the different kinds of patterns of TELT tool use that can be seen via A4A:
- DCT – a peak at TMA time (blue bar), at about 80.
- Hamming – use spread across Block 2, peaking at about 25.
Neither has a big peak at revision time. Could mention student exam performance. We need more detail about student use of the TELT tools, e.g. do students use the tools more than once?
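For flavour, here is a minimal sketch of the Hamming(7,4) code that this kind of error-control TELT tool lets students explore. It is the standard textbook construction, not the module's actual interactive package.

```python
def hamming74_encode(d1, d2, d3, d4):
    """Encode 4 data bits into a 7-bit Hamming(7,4) codeword."""
    p1 = d1 ^ d2 ^ d4
    p2 = d1 ^ d3 ^ d4
    p3 = d2 ^ d3 ^ d4
    # Parity bits sit at positions 1, 2 and 4 (1-based)
    return [p1, p2, d1, p3, d2, d3, d4]

def hamming74_error_position(c):
    """Return the 1-based position of a single bit error (0 if none)."""
    s1 = c[0] ^ c[2] ^ c[4] ^ c[6]
    s2 = c[1] ^ c[2] ^ c[5] ^ c[6]
    s4 = c[3] ^ c[4] ^ c[5] ^ c[6]
    return s1 + 2 * s2 + 4 * s4

code = hamming74_encode(1, 0, 1, 1)    # -> [0, 1, 1, 0, 0, 1, 1]
code[2] ^= 1                           # flip bit 3 to simulate a channel error
print(hamming74_error_position(code))  # -> 3, so the error can be corrected
```

Interacting with exactly this kind of encode/flip/locate cycle is what makes the tool a supplement to the printed material.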
Key hypothesis

A key hypothesis is that students who engage with the TELT tools tend to perform better at exam time. Much time and resource has been invested in producing the TELT tools to supplement the printed materials, so there is a question as to why they are not being used more extensively.
In particular…

The prompt for this query was students' relatively poor performance on a particular exam question, on error control codes, in the 2016/17 presentation of TM355. Question 10 of the 2016/17 exam was linked to a specific TELT tool on error control codes. Using A4A, it could be seen that the associated TELT tool had not been used extensively, either during the module or for revision. Question 10 was not a popular choice with students.
More detail is needed: error control codes
- Steady use across Block 2
- Small peak at revision
Phase 1: Analytics data, July 2017 to July 2018
- Collection of analytics data (A4A) on TELT tool use during the 2016/17 module presentation
- Identification of a sub-set of students for further research from exam data
- Consultation with the TEL team to interrogate the analytics data more deeply
- Design of interview questions for the 2017/18 cohort and submission of SRPP application
- Analysis of analytics data (July 2017 to May 2018)
Initial data from 2016/17 exam
TM355 exam scores for 2016/17, Q10 – small sample of 48 students (all who attempted Q10):
- Average exam score overall, all students – 45%
- Did not use the error control codes TELT tool at all – 30%
- Used the error control codes TELT tool during the module, at least once – 53%
- Used the error control codes TELT tool specifically at revision (May/June) – 52%
- Used the error control codes TELT tool on more than one date, i.e. returned to the package – 58%

Notes: This data tells us who visited this page (Online activity 1.5 Error-control codes), when, and how many times during the day.
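Group averages of this kind can be reproduced from per-student records with a simple grouping. The records below are invented placeholders (the real student data is not public); only the shape of the calculation is meant.

```python
# Hypothetical per-student records: dates the error-control-codes tool
# was visited, and the student's Q10 score (values invented for illustration)
records = [
    {"visits": [], "q10": 28},
    {"visits": ["2017-03-02"], "q10": 55},
    {"visits": ["2017-03-02", "2017-05-28"], "q10": 60},
    {"visits": ["2017-05-25"], "q10": 50},
]

def mean_q10(rows):
    """Average Q10 score for a group of students."""
    return sum(r["q10"] for r in rows) / len(rows)

non_users = [r for r in records if not r["visits"]]
users = [r for r in records if r["visits"]]
returners = [r for r in records if len(r["visits"]) > 1]

print(mean_q10(non_users), mean_q10(users), mean_q10(returners))
# -> 28.0 55.0 60.0
```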
Predicted results

- 48 students in the sample answered Q10; prediction for passing the module: 0.87
- 271 students did not answer Q10; prediction for passing the module: 0.85

Our small sample is typical of the cohort, with no reason to perform poorly based on track record.
Phase 2: Interview data, TM355 2017/18 presentation
- Six students (2017/18) volunteered for interview
- Interviews took place in July 2018
- Interview data analysed (June–Oct 2018)

Notes: Six students volunteered, but unfortunately only four were actually available. We had intended to use the SfB recording facility; however, we found that this only works when phoning internal numbers – i.e. it worked between Chris and me, but not once I tried to use it for a real interview. In the end we had three live interviews and one student who emailed his answers to us – he had moved back to Greece and SfB wouldn't allow me to dial him. Overall a useful 'pilot'.
Interview comments:
- Seeing the coding in practice and having an interaction helped.
- Good for self-testing (noted by several students).
- Why wouldn't students use the resources? One was just a video clip and didn't add anything to existing knowledge.

Notes: Even though there were not many interviewees, those who were interviewed gave full and useful answers and insights into how they used the activities. The main point was 'why wouldn't students use these resources?' – although one did admit that he hadn't used them all! Some were less well received, e.g. if they were just a video and not actually interactive – and there were some good suggestions (next slide).
Ideas to progress:
- Give some indication of the time needed for the activities (although obviously this will vary for each student).
- Add descriptions of what kind of activity each one is.
- Promote them in a new module introductory/revision video or podcast.
- Use the module forums to promote them.
- Have 'talking heads' of students saying how useful they were.

Notes: So, more information on what the activities involve and more promotion to raise awareness.
Phase 3: Action/Dissemination
- Produce descriptions/timings for activities
- Add advice in a revision podcast
- Interview the 18J cohort of students
- Review use of TELT tools for the 18J presentation
- Consider additional guidance for the mid-life review of the module

(TELT: technology enhanced learning and teaching)
Questions?

Contact:
Christine Gardner – c.f.gardner@open.ac.uk
Allan Jones –
David Chapman –
Helen Jefferis –