Christine Gardner, Allan Jones, David Chapman, Helen Jefferis


1 Christine Gardner, Allan Jones, David Chapman, Helen Jefferis
Analytics for tracking student engagement (TM355, Communications Technology). Christine Gardner, Allan Jones, David Chapman, Helen Jefferis. Give a short overview of who we are and some TM355 detail, e.g. it is a Level 3 module in Computing and Communications (C&C), mainly print-based. This study will explore the use of specific computer aided learning (CAL) resources on module TM355, using a data analysis tool, Analytics for Action (A4A), developed by the Open University.

2 Two aspects to the project
Analytics: use of teaching tools on the TM355 VLE (16J and 17J):
- When do students engage with the Computer Aided Learning (CAL) tools, and is this at predicted times during the module?
- Do students revisit the CAL tools?
- Has the level of student engagement with the tools changed over the two presentations (16J/17J)?
Student feedback via interview: interviews on motivations and reasons for using VLE tools (17J presentation):
- What motivates OU students to engage with CAL tools?
- Do students understand a topic more deeply as a result of using CAL tools?
- Are students deterred if the packages are too complicated or time-consuming?
The research questions cover two key areas: the effectiveness of the analytics tools and students’ perception of the CAL resources. By evaluating a specific Level 3 module in the School of Computing and Communications, it is hoped that analytics can be used to best effect, informing module teams and thus helping students achieve their maximum potential. Interviews will also be needed for the finer detail.

3 Analytics for Action, A4A – overview
A4A can provide detail of how students are engaging with specific online materials; however, it does not automatically identify activity at an individual student level. A4A has six phases. Give an outline of A4A and what it can and can’t do. At the moment we are reviewing data; we need more detail on individual performance, then we can investigate issues etc. The university is currently proposing an increased use of analytics to support students. For example, SFT proposes that students “learn in state-of-the-art digital environments, using data analytics to understand and drive their learning. Through using digital tools, learners will have the opportunity to develop digital competencies, aligned with our commitment to digital inclusion…”

4 DCT – Discrete Cosine Transform
Hamming codes. Demonstrate the different kinds of patterns of CAL tool use that can be seen via A4A: the DCT tool peaks at TMA time (blue bar), at about 80; the Hamming tool is used across Block 2, peaking at about 25. Neither has a big peak at revision time. Could mention student exam performance. We need more detail about student use of the CAL tools, e.g. do students use the tools more than once?
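The usage patterns above come from aggregated access counts over time. A minimal sketch of that kind of aggregation, using an invented log layout (the real A4A schema is not shown in these slides), might look like:

```python
# Hypothetical sketch of the aggregation behind these usage plots:
# counting daily accesses to each CAL tool to reveal peaks (e.g. around
# TMA deadlines) and spotting students who return to a tool.
# Records and field layout are illustrative, not the real A4A schema.
from collections import Counter
from datetime import date

# (student_id, tool, access_date) records, as a VLE access log might provide
log = [
    (101, "dct", date(2017, 1, 10)),
    (102, "dct", date(2017, 1, 10)),
    (103, "dct", date(2017, 1, 10)),  # peak on a TMA deadline day
    (101, "hamming", date(2017, 2, 3)),
    (101, "dct", date(2017, 5, 20)),  # revision-time return visit
]

# Daily access counts per tool: peaks show when students engage
daily = Counter((tool, d) for _, tool, d in log)

# Students who used a tool on more than one date, i.e. returned to it
visits = {}
for sid, tool, d in log:
    visits.setdefault((sid, tool), set()).add(d)
returners = sum(1 for dates in visits.values() if len(dates) > 1)
```

Counting accesses per (tool, date) pair reveals the peaks; tracking distinct dates per (student, tool) pair answers the "do students use the tools more than once?" question.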

5 A key hypothesis is that those who engage with the CAL tools tend to perform better at exam time.
Much time and resource has been invested in producing the CAL tools to supplement the printed materials, so there is a question of why they are not being used more extensively.

In particular… The prompt for this query was students’ relatively poor performance on a particular exam question, based on error control codes, in the 2016J presentation of TM355. TM355 exam scores for 16J, Question 10, are linked to a specific CAL tool on error control codes. Using A4A it could be seen that the associated CAL tool had not been extensively used, either during the module or for revision. Question 10 was not a popular choice with students.
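For background on the subject matter of Question 10: error control codes such as the Hamming codes mentioned earlier add parity bits so that a single corrupted bit can be located and corrected. A standard textbook Hamming(7,4) construction, sketched here purely for illustration (not taken from the TM355 materials):

```python
def hamming74_encode(data):
    """Encode 4 data bits as the 7-bit codeword [p1, p2, d1, p3, d2, d3, d4]."""
    d1, d2, d3, d4 = data
    p1 = d1 ^ d2 ^ d4  # parity over codeword positions 1, 3, 5, 7
    p2 = d1 ^ d3 ^ d4  # parity over codeword positions 2, 3, 6, 7
    p3 = d2 ^ d3 ^ d4  # parity over codeword positions 4, 5, 6, 7
    return [p1, p2, d1, p3, d2, d3, d4]

def hamming74_correct(code):
    """Correct at most one flipped bit; the syndrome is the 1-based error position."""
    c = list(code)
    s1 = c[0] ^ c[2] ^ c[4] ^ c[6]
    s2 = c[1] ^ c[2] ^ c[5] ^ c[6]
    s3 = c[3] ^ c[4] ^ c[5] ^ c[6]
    pos = s1 + 2 * s2 + 4 * s3
    if pos:
        c[pos - 1] ^= 1  # flip the erroneous bit back
    return [c[2], c[4], c[5], c[6]]  # recover data bits d1..d4
```

Flipping any single bit of an encoded word and running the correction step recovers the original four data bits, which is the behaviour the error-control-codes CAL tool lets students explore interactively.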

More detail is needed… Error control codes: steady use across Block 2; small peak at revision.

8 Initial data from 16J exam
TM355 exam scores for 16J, Q10 – small sample of 48 students (all who attempted Q10):
- Average exam score overall for all students – 45.49%
- Did not use the error control codes CAL tool at all – 30.39%
- Used the error control codes CAL tool during the module, at least once – 52.67%
- Used the error control codes CAL tool specifically at revision (May/June) – 51.90%
- Used the error control codes CAL tool on more than one specific date, i.e. returned to the package – %
These data tell us who visited this page (Online activity 1.5 Error-control codes), when, and how many times during the day.
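The group comparison above is, in essence, a mean of exam scores within each usage category. A minimal sketch with invented scores (the real percentages come from the 48-student 16J sample, not from this data):

```python
# Illustrative version of the slide's comparison: mean exam score for
# students grouped by whether they used the error-control-codes CAL tool.
# Scores here are made up purely to show the computation.
def mean(scores):
    return sum(scores) / len(scores)

# (exam_score_percent, used_tool) pairs — hypothetical data
results = [(62.0, True), (48.0, True), (55.0, True), (35.0, False), (28.0, False)]

used = [s for s, u in results if u]
not_used = [s for s, u in results if not u]
overall = [s for s, _ in results]
```

Even this toy grouping shows the shape of the finding on the slide: the tool-using group averages higher than the non-using group, though with a sample this small the difference is suggestive rather than conclusive.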

9 Phase 1: Analytics data, July 2017 to July 2018
- Collect analytics data (A4A) on VLE tool use during the 16J module presentation
- Identify a sub-set of 16J students for further research, from 16J exam data
- Consult with the TEL team to interrogate the analytics data more deeply
- Design interview questions for the 17J cohort and submit an SRPP application
- Analyse the analytics data (July 2017 to May 2018)

10 Phase 2: Interview data, TM355 17J presentation
- Recruit 17J students (145 students in sample, invitations sent)
- Interview 17J students (June/July 2018)
- Analyse 17J interview data (June–Oct 2018)

11 Phase 3: Dissemination and write-up milestones
- eSTEeM conference (April 2018, interim report on findings, mainly 16J), reviewing both the module performance aspects and the analytics tool used
- Final eSTEeM report (December 2018, 16J/17J)
- Write an associated journal paper (Feb–March)
- eSTEeM conference (April 2019, full report)

12 Questions? Contact: Christine Gardner - c.f.gardner@open.ac.uk
Allan Jones –
David Chapman –
Helen Jefferis –

