
1 It’s a bit more complicated than that…
STEM OU Analyse evaluation: some initial findings

2 Rationale / Background
“Why ask taxi drivers about the driverless car?”
“ALs are the problem”

3 Rationale / Background
Q. Do learning analytics help students to complete modules?
A. No. At least, not until Associate Lecturers, Student Support Teams, Module Teams etc. do something with them.

4 Rationale / Background
The OU Analyse Dashboard has been made available by KMi for ALs to pilot since 15J. Uptake has been low.
Anecdotal feedback from ALs was not positive.
There is limited evidence of effectiveness (Scholarly Insight, Autumn 2017: a data wrangler perspective).
Why?

5 A reminder… we do have a retention problem
STEM pass rates have remained fairly static and are now 1.8% below the average OU pass rate for 2016/17.
[Chart: pass rates by faculty (FASS, FBL, STEM, WELS) and OU average, 2014/15–2016/17; STEM gap figures shown: −2.9%, −1.8%, −3.9%. Note change in axis.]
Source: SIO Webstats website. Definition: students registered at the first fee liability point of a Level 1, 2 or 3 UG module who passed, including retakes.

6 Evaluation strategy: realist evaluation
What works? For whom? In what context? Why?
Social interventions release (or inhibit) underlying mechanisms, leading to patterns of (un)intended outcomes.
Context is critical.
Mechanisms ≠ activities.
Methodologically neutral.
Drawing on the realist evaluation tradition of Pawson & Tilley (1997), in turn drawing on the work of Roy Bhaskar and Margaret Archer.
(This study also draws on the tradition of ‘social informatics’: the study of ICT in organisational and social settings.)

7 First, some terminology
What is/are learning analytics? “the measurement, collection, analysis and reporting of data about learners and their contexts, for purposes of understanding and optimizing learning and the environments in which it occurs” (Siemens, 2011)
Predictive learning analytics: “a variety of statistical and analytical techniques to develop models that predict future events or behaviours” (Nyce & CPCU, 2007)
OU Analyse: “The OU Analyse project is piloting new machine learning based methods for early identification of students who are at risk of failing…” (OU, 2017)
The OU has plenty of data sources and tools – SIO, SAS-VA, Student Support Tool – which are analytics, but not predictive.
The OU Analyse Dashboard combines some non-predictive data (VLE engagement) and some predictive data.
We need an unambiguous vocabulary if we are to understand the value of the different tools and sources.
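To make the distinction between non-predictive and predictive analytics concrete, here is a minimal sketch of what a “predictive learning analytics” model involves. This is a generic illustration only, not the OU Analyse method: the features (weekly VLE clicks, TMAs submitted), the synthetic data, and the simple logistic-regression model are all invented for this example.

```python
# Illustrative only: a toy "predictive learning analytics" model.
# NOT the OU Analyse method; features and data are invented.
import math
import random

random.seed(0)

def make_students(n=200):
    """Synthetic records: [VLE clicks, TMAs submitted] -> passed (1/0)."""
    data = []
    for _ in range(n):
        clicks = random.uniform(0, 100)
        tmas = random.randint(0, 4)
        # Hidden rule: engagement drives the outcome, plus noise.
        score = 0.04 * clicks + 0.8 * tmas - 3.0 + random.gauss(0, 1)
        data.append(([clicks / 100.0, tmas / 4.0], 1 if score > 0 else 0))
    return data

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

def train(data, epochs=500, lr=0.5):
    """Fit logistic regression by batch gradient descent."""
    w, b, n = [0.0, 0.0], 0.0, len(data)
    for _ in range(epochs):
        gw, gb = [0.0, 0.0], 0.0
        for x, y in data:
            err = sigmoid(w[0] * x[0] + w[1] * x[1] + b) - y
            gw[0] += err * x[0]
            gw[1] += err * x[1]
            gb += err
        w = [w[i] - lr * gw[i] / n for i in range(2)]
        b -= lr * gb / n
    return w, b

def risk_of_failing(w, b, clicks, tmas):
    """Predicted probability that a student fails the module."""
    return 1.0 - sigmoid(w[0] * (clicks / 100.0) + w[1] * (tmas / 4.0) + b)

data = make_students()
w, b = train(data)
print(round(risk_of_failing(w, b, clicks=5, tmas=0), 2))   # low engagement: higher risk
print(round(risk_of_failing(w, b, clicks=90, tmas=4), 2))  # high engagement: lower risk
```

The point of the sketch is the vocabulary distinction made on the slide: the raw click counts are non-predictive engagement data, while the model's output (a risk probability) is predictive data, and the two should not be conflated when discussing the Dashboard.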

8 Context: module teams
So what are module teams doing?
Logic models represent the ‘theory’ behind module teams’ plans and expectations for OUA.
Diverse approaches – more or less explicit about expectations of ALs.
Difficulties in interpreting predictive data.
Module teams were not directly engaged by the OUA team.
Expectation that VLE engagement data will be more useful than predictive data.
Concerns about the (lack of) usefulness of predictive data on first presentation, confounding the opportunity to embed OUA in AL practice.
(In realism, ideas are real, because they have an effect in the world.)

9 Preliminary study: what do our ALs do now to support students at risk?
A model with three stages:
Identify – through e.g. student list data, personal communication, participation in online activities, attendance at tutorials, SST contact etc.
Diagnose – relies on ALs (or student self-diagnosis): health, module content, time management, work commitments etc.
Intervene – relies on ALs (1-2-1 contact, special sessions, extensions etc.) and SSTs (direct contact, MILLs etc.).
Currently, OUA aims to address identification.
Preliminary study: interviews with 7 ALs (not necessarily OUA users).

10 Some headline issues – what do ALs do?
Get data: ALs get data from multiple sources in multiple formats, including the student list, student contact history, personal communications with students, TMA extensions, and student engagement with module activities (VLE forums, wikis, OpenStudio postings etc.).
Collate information: typically as a spreadsheet or paper record.
Communicate: email, phone, text; group email; eSRF.
“It seems to me splendidly absurd, to put it mildly, that we have a system that says if a student isn’t contacting you in response to emails or texts, what the system then does is to send them more of the same.”

11 Some implications for the main study
How do ALs use the OUA Dashboard to support ‘at risk’ students? In what context? Why?
Some implications/findings:
ALs spend a lot of time getting, collating and using information with tools that don’t fit together (and which possibly can’t be made to).
New tools and information sources seem as likely to add to this complexity as to reduce it.
Some additional flags for the main study:
How are the diverse approaches to OUA taken by pilot module teams received by ALs?
When ALs report (positively or negatively) on the value of the Dashboard, are they referring to current VLE engagement data or to predictive data?
Trust: how do ALs interpret the meaning of the Dashboard?
How has the OUA Dashboard been introduced to ALs?

12 ?

13 Theories of technology adoption and use
(Holden & Karsh, 2010, p. 161)

