Utilising Module Evaluation data to explore outcomes from the Teaching Excellence and Student Outcomes Framework (TEF) Subject Level Pilot. Natalie Holland.

Presentation transcript:

Utilising Module Evaluation data to explore outcomes from the Teaching Excellence and Student Outcomes Framework (TEF) Subject Level Pilot. Natalie Holland, Teaching and Learning Academy, Liverpool John Moores University.

The problem
The TEF subject level pilot provides a lot of information at subject level and across different groups of students. It can be used to pinpoint where further exploration is needed to understand why particular groups of students studying a particular subject area report a different university experience from their peers. Five of the TEF metrics focus on NSS satisfaction, but the current NSS datasets do not allow the kind of manipulation needed to explore this further. My question: can Module Evaluation at LJMU offer an alternative source of data?

What is the TEF subject level pilot?
35 subject groups (LJMU has programmes within 26 subject groups).
Each subject group is given an initial rating based on its performance across a set of nine metrics.
Institutions write a five-page narrative providing context, highlighting successful features of the course as well as any mitigating factors.
Holistic assessment of the subject group by an independent panel.

What is the TEF subject level pilot?
Aspect of quality | Metric | Metric type | Flag value | Source
Teaching quality | The teaching on my course | NSS-based | 0.5 | NSS Q1-4
Teaching quality | Assessment and feedback | NSS-based | 0.5 | NSS Q8-11
Teaching quality | Student voice | NSS-based | 0.5 | NSS Q23-25
Learning environment | Academic support | NSS-based | 0.5 | NSS Q12-14
Learning environment | Learning resources | NSS-based | 0.5 | NSS Q18-20
Learning environment | Continuation | | 2 | HESA and ILR data
Student outcomes and learning gain | Highly skilled employment or higher study* | Employment | 1 | DLHE declared activity 6 months after graduation
Student outcomes and learning gain | Sustained employment or further study | | | LEO 3 years after qualification
Student outcomes and learning gain | Above median earnings threshold or higher study | | | LEO 3 years after qualification

The NSS-based metrics and Module Evaluation
The ideal way to look into differences in experience across groups at subject level would be to explore the NSS survey data and comments, but it is not currently possible to use NSS data in that way: the form in which NSS data is provided to institutions does not allow this level of interrogation (some changes do seem to be under way for this academic year, so hopefully this will be possible in future). Our own Module Evaluation (ME) survey provides an alternative source of information on how our students are experiencing their studies:
Q1_Staff are good at explaining things
Q2_The module has challenged me to achieve my best work
Q3_I have received helpful comments on my work
Q4_Feedback on my work has been timely
Q5_I have actively/fully engaged with this module
Q6_Overall, I am satisfied with the quality of this module
Q7_Please comment on the most interesting aspect of this module
Q8_Please comment on how this module could be improved
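To make the comparison concrete, here is a minimal sketch in Python of how the closed ME questions might be lined up against the NSS-based TEF aspects. The pairing is my own illustrative assumption for the sketch, not an official LJMU or TEF mapping.

```python
# Illustrative only: the question wording is from the ME survey above, but the
# pairing with TEF aspects is an assumption made for the sake of the sketch.
ME_QUESTIONS = {
    "Q1": "Staff are good at explaining things",
    "Q2": "The module has challenged me to achieve my best work",
    "Q3": "I have received helpful comments on my work",
    "Q4": "Feedback on my work has been timely",
    "Q5": "I have actively/fully engaged with this module",
    "Q6": "Overall, I am satisfied with the quality of this module",
}

ME_TO_TEF_ASPECT = {
    "Q1": "The teaching on my course",
    "Q2": "The teaching on my course",
    "Q3": "Assessment and feedback",
    "Q4": "Assessment and feedback",
    "Q5": "Student voice",   # loose fit: engagement rather than voice
    "Q6": None,              # overall satisfaction has no single TEF aspect
}

for q, aspect in ME_TO_TEF_ASPECT.items():
    print(f"{q} ({ME_QUESTIONS[q]}) -> {aspect or 'no direct TEF aspect'}")
```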

Methodology
TEF looks at an aggregate of three years of NSS data: 2015/16, 2016/17 and 2017/18.
Combined ME datasets from 2015/16, 2016/17 and 2017/18.
Identified areas where split metrics indicated differentiated experiences across all subject groups.
Explored the ME dataset for the closed questions.
Thematic analysis of open ME comments to look at the stories behind identified differences between groups.
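As a rough illustration of the data preparation, the sketch below (Python/pandas) combines three hypothetical yearly ME exports and flags subject groups where an age split diverges from the subject-group average. The file names, column names and the five-percentage-point threshold are all assumptions made for the sketch, not LJMU's actual pipeline or a TEF rule.

```python
import pandas as pd

# Assumed layout: one CSV per year, one row per response, with hypothetical columns
# subject_group, age_group, ethnicity_group and Q1..Q6 coded 1 (agree) / 0 (otherwise).
files = ["me_2015_16.csv", "me_2016_17.csv", "me_2017_18.csv"]
me = pd.concat((pd.read_csv(f) for f in files), ignore_index=True)

# Aggregate the three years (mirroring TEF's three-year NSS aggregate), then look for
# subject groups where a split differs noticeably from the subject-group average.
overall = me.groupby("subject_group")["Q6"].mean()
by_age = me.groupby(["subject_group", "age_group"])["Q6"].mean()
gaps = by_age.sub(overall, level="subject_group").abs()

# Illustrative threshold: gaps of more than 5 percentage points get explored further.
print(gaps[gaps > 0.05].sort_values(ascending=False))
```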

Limitations
We only have data on age, ethnic group, origin and gender.
Ethnicity is only split by BAME, White, Not known and Refused.
The questions in ME do not directly align with, or cover, all of the NSS-based aspects in the metrics; the open comments could provide richer information.

Example – A humanities subject group
This subject area performs well in the NSS-based metrics overall, with double positive flags, but there is variation when split across different groups of students: mature and BAME students have no flag compared with double positive flags.
[Table: split metric flags by group (White, BAME, Young, Mature) for The teaching on my course, Assessment and feedback, Student voice, Academic support and Learning resources; values range from no flag to ++.]

Module Evaluation closed questions
BAME students have lower satisfaction scores than any other group in this subject group across all six questions, particularly Q2 and Q5. Mature students actually have the highest satisfaction levels across all questions.
% agreement (All subject group, BAME, White, Mature, Young):
Q1_Staff are good at explaining things: 84%, 79%, 85%
Q2_The module has challenged me to achieve my best work: 73%, 62%, 75%, 72%
Q3_I have received helpful comments on my work: 80%, 71%, 82%
Q4_Feedback on my work has been timely: 88%, 86%
Q5_I have actively/fully engaged with this module: 76%, 69%
Q6_Overall, I am satisfied with the quality of this module: 81%
Respondents: BAME = 69, White = 1451, Mature = 267, Young = 1253
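A sketch of how the figures in a table like this could be produced from the combined ME data, carrying over the same hypothetical file and column names from the methodology sketch; the subject-group label is a placeholder.

```python
import pandas as pd

# Assumed combined export with the same illustrative columns as the methodology sketch.
me = pd.read_csv("me_2015_16_to_2017_18_combined.csv")
questions = ["Q1", "Q2", "Q3", "Q4", "Q5", "Q6"]
subject = me[me["subject_group"] == "Humanities (example)"]  # placeholder subject group

# Percentage agreement for the whole subject group and for each split.
table = pd.concat(
    {
        "All": subject[questions].mean(),
        "BAME": subject.loc[subject["ethnicity_group"] == "BAME", questions].mean(),
        "White": subject.loc[subject["ethnicity_group"] == "White", questions].mean(),
        "Mature": subject.loc[subject["age_group"] == "Mature", questions].mean(),
        "Young": subject.loc[subject["age_group"] == "Young", questions].mean(),
    },
    axis=1,
)
print((table * 100).round(0))
```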

Comments analysis
Thematic analysis of comments. I am looking at instances of comments, meaning every comment a student makes is considered separately; the number of comments is not the number of individual students.

Most interesting aspect of this module
Comments left in response to this question: BAME 32 comments (46% of responses), White 824 comments (57% of responses), Mature 65 comments (62% of responses), Young 691 comments (55% of responses).
Similar themes were identified for all groups of students: interest in topics, and the level of challenge. However, BAME students mention challenge in a negative way: "The module was certainly challenging as I had no idea what I was doing for the XXX essay".
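As a sketch of how the comment counts and response shares above could be reproduced, assuming a single hypothetical export with one row per response, a group label and one column per free-text question (names invented for the example):

```python
import pandas as pd

# Assumed columns: group (BAME/White/Mature/Young) and most_interesting_comment (free text).
responses = pd.read_csv("me_humanities_responses.csv")

# Count "most interesting aspect" comments per group, and the share of each group's
# responses that left one.
text = responses["most_interesting_comment"].fillna("").str.strip()
has_comment = text.ne("")
counts = responses[has_comment].groupby("group").size()
share = (counts / responses.groupby("group").size() * 100).round(0)
print(pd.DataFrame({"comments": counts, "% of responses": share}))
```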

Most interesting aspect of this module
White students saw the level of challenge as motivating:
"I like the challenge that this module has given in terms of pushing me to achieve"
"The challenge of finding your own sources for assignments, both interesting and challenging, very worthwhile!"
Some BAME students viewed challenge negatively:
"The module was certainly challenging as I had no idea what I was doing for the XXX essay"
"The challenging aspect would be understanding the assignments and completing them on time"
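The thematic analysis itself was done by hand; as a small supporting sketch, a crude keyword pass like the one below could pull every comment mentioning challenge so the instances can be read side by side per group (same invented file and column names as in the sketch above).

```python
import pandas as pd

# Flag comments mentioning "challenge"/"challenging" for closer qualitative reading.
responses = pd.read_csv("me_humanities_responses.csv")
text = responses["most_interesting_comment"].fillna("")
mentions_challenge = text.str.contains("challeng", case=False)

for group, subset in responses[mentions_challenge].groupby("group"):
    print(f"\n{group}: {len(subset)} comments mentioning challenge")
    for comment in subset["most_interesting_comment"]:
        print(" -", comment)
```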

How could this module be improved?
Some consistent messages across all groups:
More teaching time, with some differences in opinion as to how this should look: BAME students would like more seminars rather than lectures, while other groups were more mixed in opinion.
Support and greater guidance with assessments and exam preparation.

BAME – How could this module be improved?
More resources and support tools, such as videos and online resources.
Teaching approach/format (seminars preferred to lectures): "I struggled to learn through the methods [NAME REMOVED] wanted us to learn through. He is a good lecturer but his methods are not for all".
More support: "Reading was a bit overwhelming"; "Hard to follow at times".

Mature students – How could this module be improved?
The main points were the same, but some were more specific to mature students:
Some found lecturers could be patronising and did not take into account the life experience that mature students have.
Better co-ordination of deadlines: "More co-ordination amongst the lecturers on deadlines across the subject would have been helpful, as for much of the year there have been no deadlines and all of the coursework assignments have been scheduled for the same days in a two-three week period which has caused stress and anxiety".

Next steps…
Focus groups with BAME and mature students to understand how the student experience can be improved.
Feedback to programme teams to develop action plans that ensure students from all backgrounds are supported.
Follow up the findings in the NSS results in July.