Extracting useful information from the UK’s National Student (Satisfaction) Survey Mark Langan, Alan Fielding and Peter Dunleavy Manchester Metropolitan University Faculty of Science & Engineering Higher Education Academy Annual Conference 2010
What makes a student satisfied?
Structure
- Research evidence based on a range of quantitative approaches using a national dataset for science subjects
- Discussion about the implications of using the NSS for decision-making in HE
The NSS
- Compulsory process in the UK (since 2005/06), conducted by Ipsos MORI on behalf of HEFCE.
- Uses a standard basic survey to monitor perceptions of final-year students.
- Approach based upon an Australian survey, the Course Experience Questionnaire (CEQ; Ramsden 1991).
- Considered robust in terms of three statistical measures: internal consistency, construct validity and concurrent validity.
- Measures six dimensions: teaching; assessment and feedback (sometimes considered separately); academic support; organisation and management; resources; and personal development.
- The NSS uses a 5-point scale and is completed (usually online) in the final undergraduate year.
- In addition to the 21 'items' there is a separate overall satisfaction rating (Q22).
- Thorough overviews can be found in Surridge (2007) and Marsh and Cheng.
- Take-home message: the outputs are hierarchical in nature and not designed for simplistic league tables.
- Note: satisfaction is a complex concept to measure and there are many approaches.
With particular reference to:
- consistency of patterns between years
- differences between subjects
- factors associated with overall student satisfaction.
Data
- Level 3 (closest to Programme/Dept) NSS data from (a) 2007, (b) 2008 and (c) 2009.
- Pruned to remove subjects not taught at MMU (e.g. medicine).
- Still very large datasets (>40,000 cases per survey).
NSS Questions
Teaching (Teach)
Q1 Staff are good at explaining things.
Q2 Staff have made the subject interesting.
Q3 Staff are enthusiastic about what they are teaching.
Q4 The course is intellectually stimulating.
Assessment fairness (Fairness)
Q5 The criteria used in marking have been clear in advance.
Q6 Assessment arrangements and marking have been fair.
Assessment feedback (Feedback)
Q7 Feedback on my work has been prompt.
Q8 I have received detailed comments on my work.
Q9 Feedback has helped me clarify things I did not understand.
NSS Questions
Support
Q10 I have received sufficient advice and support with my studies.
Q11 I have been able to contact staff when I needed to.
Q12 Good advice was available when I needed to make study choices.
Organisation (Org)
Q13 The timetable works efficiently as far as my activities are concerned.
Q14 Any changes in the course or teaching have been communicated effectively.
Q15 The course is well organised and is running smoothly.
NSS Questions
Learning Resources (Resources)
Q16 The library resources and services are good enough for my needs.
Q17 I have been able to access general IT resources when I needed to.
Q18 I have been able to access specialised equipment, facilities or rooms when I needed to.
Personal Development (PD)
Q19 The course has helped me present myself with confidence.
Q20 My communication skills have improved.
Q21 As a result of the course, I feel confident in tackling unfamiliar problems.
Overall satisfaction (Overall)
Q22 Overall, I am satisfied with the quality of the course.
Which of the areas surveyed do you think correlate with the Q22 overall satisfaction score? What do you think students' perceptions of the questions are (i.e. what is going through their minds when they complete the questionnaire)?
Satisfaction
Satisfaction is the % of students answering 4 or 5 to a question, e.g. for Q1, Biology at MMU: 95% of students were satisfied.
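This metric is straightforward to compute. A minimal sketch in Python (the responses below are invented for illustration, not NSS data):

```python
def satisfaction(responses):
    """Percentage of respondents answering 4 or 5 on the 5-point scale."""
    agree = sum(1 for r in responses if r >= 4)
    return 100.0 * agree / len(responses)

# Invented example: 20 responses to one NSS item
q1_responses = [5, 4, 4, 5, 3, 4, 5, 5, 4, 4, 2, 5, 4, 4, 5, 3, 4, 5, 4, 4]
print(round(satisfaction(q1_responses)))  # prints 85
```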
There are subject differences
Subject differences confound simple comparisons; the examples shown are medians for Qs 7, 8 & 9 plus 13, 14 & 15.
Biology results (2008)
What answers are correlated with Q22?
Approach:
- Use % in agreement with a question (answers 4 & 5 on the 5-point scale).
- Simple correlation (ignoring subject).
- Correlation allowing for subject differences (ANCOVA).
- Repeat for each year.
- Calculate nationally and within MMU.
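The distinction between the two correlations can be sketched as follows. One common ANCOVA-style approach is to remove each subject's mean before correlating, so that only within-subject variation remains; the data here are synthetic, constructed so that the two subjects have different baselines:

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic illustration: % agreement with one item (x) and Q22 (y)
# for programmes in two subjects with different baseline satisfaction.
subject = np.repeat([0, 1], 50)               # two subject codes
x = rng.normal(70, 5, 100) + 15 * subject     # item agreement (%)
y = 0.5 * x + 10 * subject + rng.normal(0, 3, 100)

# Simple correlation, ignoring subject
r_simple = np.corrcoef(x, y)[0, 1]

# ANCOVA-style: centre within each subject, then correlate residuals
def centre_within(values, groups):
    out = values.astype(float).copy()
    for g in np.unique(groups):
        out[groups == g] -= values[groups == g].mean()
    return out

r_within = np.corrcoef(centre_within(x, subject),
                       centre_within(y, subject))[0, 1]
print(f"ignoring subject: r = {r_simple:.2f}; within subject: r = {r_within:.2f}")
```

Because the subjects differ in baseline, the pooled correlation is inflated relative to the within-subject one, which is exactly why subject must be allowed for before comparing items.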
Annual national trends
Overall satisfaction is consistently related to Teaching Quality, Support and Organisation. It is only weakly related to Resources and Assessment, particularly feedback.
Subject differences (Feedback Qs)
Correlations (r) between Q22 and the feedback items; * indicates significance. Cells shown as '–' (and the column placement of two single values) were not recoverable from the slide.

Subject | prompt | detailed | explained
Biological Sciences | – | – | –
Physical Sciences | *0.440 | *0.385 | *0.589
Physical Geography | *0.675 | *0.377 | *0.566
Mathematical Sciences | 0.328 | *0.460 | *0.533
Computer Sciences | *0.353 (column uncertain)
Mechanically based Engineering | – | – | –
Electrical and Electronic Engineering | – | – | –
Technology | – | – | –
Human Geography | *0.433 (column uncertain)
Predictive model (Random Forest analysis)
Cluster analyses are unsupervised methods that take no account of pre-assigned class labels or values. Decision and regression trees use a supervised learning algorithm which must be provided with a training set containing cases with class labels or values. We used a variant of regression trees called 'Random Forests': a robust method with fewer constraints than traditional regression methods, for example allowing the influence of different factors on overall satisfaction to be explored within different subgroups.
Regression Trees (an example): a tree predicting property value.
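The slides' analysis was run with the Random Forests method; as an illustrative analogue, a forest can be fitted in scikit-learn and its variable importances inspected. The data below are synthetic (pretend agreement scores for three items), not NSS results, and the importance measure here is scikit-learn's impurity-based one rather than the %IncMSE reported later in the deck:

```python
import numpy as np
from sklearn.ensemble import RandomForestRegressor

rng = np.random.default_rng(1)
n = 300
X = rng.uniform(50, 100, size=(n, 3))       # pretend Q15, Q1, Q7 agreement (%)
# Overall satisfaction driven mainly by the first two columns
y = 0.5 * X[:, 0] + 0.3 * X[:, 1] + rng.normal(0, 2, n)

forest = RandomForestRegressor(n_estimators=200, random_state=0).fit(X, y)
for name, imp in zip(["Q15", "Q1", "Q7"], forest.feature_importances_):
    print(f"{name}: {imp:.2f}")
```

The forest recovers the planted structure: the item that does not drive y receives a much smaller importance than the two that do.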
Effectiveness of Q1-21 to predict overall satisfaction (Q22)

Predicting questionnaire item | Inc MSE (%)
Q15 - The course is well organised and is running smoothly | –
Q1 - Staff are good at explaining things | 71.45
Q4 - The course is intellectually stimulating | 66.71
Q14 - Any changes in the course or teaching have been communicated effectively | 60.79
Q10 - I have received sufficient advice and support with my studies | 55.34
Q11 - I have been able to contact staff when I needed to | 43.40
Q3 - Staff are enthusiastic about what they are teaching | 40.08
Q2 - Staff have made the subject interesting | 38.26
Q12 - Good advice was available when I needed to make study choices | 35.27
Subject | 32.35
Q6 - Assessment arrangements and marking have been fair | 20.10
Q17 - I have been able to access general IT resources when I needed to | 18.73
Q19 - The course has helped me present myself with confidence | 17.35
Q18 - I have been able to access specialised equipment, facilities or rooms when I needed to | 15.41
Q16 - The library resources and services are good enough for my needs | 15.34
Q20 - My communication skills have improved | 13.29
Q13 - The timetable works efficiently as far as my activities are concerned | 13.16
Q7 - Feedback on my work has been prompt | 10.49
Q9 - Feedback on my work has helped me clarify things I did not understand | 6.65
Q5 - The criteria used in marking have been clear in advance | 6.60
Q21 - As a result of the course, I feel confident in tackling unfamiliar problems | 3.32
Q8 - I have received detailed comments on my work | 3.04

(Q15's Inc MSE value was not recoverable from the slide.)
Predictive model (Random Forest analysis)

Predictor | 2007 (%) | 2008 (%) | 2009 (%)
Teaching | – | – | –
Fairness | – | – | –
Feedback | – | – | –
Assessment | – | – | –
Support | – | – | –
Organisation | – | – | –
Resources | – | – | –
Personal Development | – | – | –

(Percentage values were not recoverable from the slide.)
Q22 'under-performers'
(Chart: actual vs predicted Q22 with residuals and SE1, SE2, SE3 error bands, points labelled by subject.)
Q22 'as expected from Q1-Q21'
Q22 'over-performers'
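The three classifications above rest on a simple idea: a subject's residual (actual minus model-predicted Q22) is compared against standard-error bands. A sketch of that logic, with invented numbers and a single crude error band in place of the slides' SE1-SE3:

```python
import numpy as np

# Invented actual and model-predicted Q22 satisfaction (%) for some subjects
subjects  = ["Biol", "Chem", "CompSci", "MechEng", "HumGeog"]
actual    = np.array([88.0, 81.0, 70.0, 84.0, 90.0])
predicted = np.array([85.0, 84.0, 78.0, 83.5, 84.0])

residual = actual - predicted
se = residual.std(ddof=1)           # one crude error band

for subj, r in zip(subjects, residual):
    if r < -se:
        label = "under-performer"   # Q22 worse than Q1-21 would suggest
    elif r > se:
        label = "over-performer"    # Q22 better than Q1-21 would suggest
    else:
        label = "as expected"
    print(subj, round(r, 1), label)
```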
University Groupings
Mean overall Q22 for university groups

Group (n) | Mech Eng | Com Sci | Allied Med | Elec Eng | Biol | M&S | EGS | Hum Geog | Chem | All
Million | ...
Alliance | ...
None | ...
Russell | ...
All | ...

(Group sizes (n) and mean Q22 cell values were not recoverable from the slide.)
Conclusions
- Subject differences (e.g. mathematical content)
- Institutional differences
- False assumptions (e.g. that enhancing feedback directly enhances Q22)
- Institutional effects
- Satisfaction is a complex measure related to L&T practices
Top five predictors (best first)
1. Q15 The course is well organised and is running smoothly.
2. Q4 The course is intellectually stimulating.
3. Q1 Staff are good at explaining things.
4. Q21 As a result of the course, I feel confident in tackling unfamiliar problems.
5. Q10 I have received sufficient advice and support with my studies.
Quotes from Ramsden (2007)
“… [The NSS] is not a measure of satisfaction so much as a window into how our designs for learning are experienced by students. From these insights we assemble the practical measures we may take to enhance the quality of their experiences.”
Quotes from Ramsden (2007)
“... it is not simple to know what to do. Current experiences, unlike satisfaction, are a mixture of previous experiences and the environment as it is now so sometimes we will need to adjust expectations or consider altering previous experiences in order to improve quality.”
Quotes from Ramsden (2007)
“I cannot agree with the idea, for example, that because students are slightly less positive about feedback on assessed work in the NSS than about the quality of teaching we should rush to bully academics into providing more feedback more quickly.”
Quotes from Ramsden (2007)
“From this it also follows that students do not have a ‘right’ to be satisfied. They are themselves part of the experience. Students decide their own destinies and we can only add or subtract value at the margins.”
Does anyone have an example of direct change as a result of NSS surveys? How can we use NSS ratings to enhance our practices?