1
Learning from the NSS: can't get no satisfaction...

Can't get no satisfaction: Discrepancies between NSS qualitative and quantitative data. Implications for quality enhancement

"This year's National Student Survey is a wake-up call to university vice-chancellors. They must buck up their ideas and do far more to improve the experience they offer students." Aaron Porter, NUS president

Dr Clare Milsom, Dr Martyn Stewart, Dr Elena Zaitseva
Academic Enhancement Unit, Liverpool JMU
Surveys for Enhancement 2011
2
Purposes of the NSS
1. Quality assurance
2. Student choice
3. Improvement of the student learning experience (quality enhancement)

"This year's National Student Survey is a wake-up call to university vice-chancellors. They must buck up their ideas and do far more to improve the experience they offer students." Aaron Porter, July 2010
3
The structure of the NSS questionnaire is appropriate (dimensionality of the data)
5. Definitely agree
4. Mostly agree
3. Neither agree nor disagree
2. Mostly disagree
1. Definitely disagree
4
'Factor analysis identified the six NSS factors that the NSS was designed to measure but it also considered that assessment and feedback should be considered as separate factors' (Marsh and Cheng 2008: 6)

'broad evidence for the integrity of the six original scales, and there was some evidence for two additional dimensions relating to feedback and workload' (Richardson et al. 2007: 578)
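The dimensionality checks quoted above rest on factor analyses of the item-level responses. Purely as an illustration of that kind of check, and not the analysis used in these studies, the sketch below fits a rotated factor model to Likert-coded NSS items; the `responses` DataFrame and its q01-q21 column names are assumptions for the example.

```python
# Minimal sketch of a dimensionality check on NSS item-level data.
# Assumes `responses` is a pandas DataFrame with one column per NSS item
# (q01 .. q21), coded 1-5; the column names are hypothetical.
import pandas as pd
from sklearn.decomposition import FactorAnalysis

def nss_factor_loadings(responses: pd.DataFrame, n_factors: int = 6) -> pd.DataFrame:
    """Fit a varimax-rotated factor model and return item-by-factor loadings."""
    fa = FactorAnalysis(n_components=n_factors, rotation="varimax", random_state=0)
    fa.fit(responses.dropna())
    return pd.DataFrame(
        fa.components_.T,  # items x factors
        index=responses.columns,
        columns=[f"factor_{i + 1}" for i in range(n_factors)],
    )

# loadings = nss_factor_loadings(responses, n_factors=6)
# Inspecting whether assessment and feedback items load on their own factor
# mirrors the additional dimension reported in the quotes above.
# print(loadings.round(2))
```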
5
The NSS questionnaire shows satisfactory levels of internal consistency and validity
5. Definitely agree
4. Mostly agree
3. Neither agree nor disagree
2. Mostly disagree
1. Definitely disagree
6
'..proved to be remarkably robust'; 'satisfactory levels of internal consistency...construct validity...and concurrent validity' (Richardson et al. 2007: 578)

'our exploration of the NSS questionnaire indicated strong convergent validity..and discriminant validity' (Fielding et al. 2010: 359)

'Analysis of NSS data has repeatedly provided evidence of the robustness of the instrument and conceptual validity of the dimensions of the student experience it assesses' (HEFCE 2010: 9)
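The internal-consistency findings quoted here are conventionally reported as Cronbach's alpha per scale. A minimal sketch, assuming the item responses sit in a pandas DataFrame with hypothetical column names:

```python
# Minimal sketch: Cronbach's alpha for one NSS scale, computed from first
# principles as the ratio of summed item variance to total-score variance.
# The assessment-and-feedback column names used below are hypothetical.
import pandas as pd

def cronbach_alpha(items: pd.DataFrame) -> float:
    """items: respondents x items, Likert-coded 1-5."""
    items = items.dropna()
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1).sum()
    total_var = items.sum(axis=1).var(ddof=1)
    return (k / (k - 1)) * (1 - item_vars / total_var)

# alpha = cronbach_alpha(responses[["q05", "q06", "q07", "q08", "q09"]])
# Values above roughly 0.7 are conventionally read as satisfactory consistency.
```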
7
NSS responses are able to differentiate between institutions
5. Definitely agree
4. Mostly agree
3. Neither agree nor disagree
2. Mostly disagree
1. Definitely disagree
8
'..2.5% of the variation in NSS scores for Overall Satisfaction is due to differences between institutions' (Surridge 2010: 25)

'there is much more variation in the responses by students within each university than there is between the different universities' (Marsh and Cheng 2008: 52)

'we recommend that NSS ratings should only be used with appropriate caution for comparing universities.' (Cheng and Marsh 2010: 709)

Williams and Kane (2008) and HEFCE (2010):
Assessment and feedback: more satisfied = over-25s and international students; less satisfied = other age groups and EU students
Teaching and learning: more satisfied = white ethnic background; less satisfied = all minority ethnic backgrounds
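Figures such as Surridge's 2.5% come from multilevel (variance components) models that split satisfaction variance into within- and between-institution parts. The sketch below shows one way to estimate that share with a null random-intercept model; the dataset and column names are assumptions, and the same decomposition can be run with subject of study as the grouping, as on the next slide.

```python
# Minimal sketch of the variance partitioning behind a "2.5% between
# institutions" figure: a null multilevel model of overall satisfaction
# with a random intercept per institution. Column names are hypothetical.
import pandas as pd
import statsmodels.formula.api as smf

def between_group_share(df: pd.DataFrame, outcome: str, group: str) -> float:
    """Intraclass correlation: share of variance lying between groups."""
    model = smf.mixedlm(f"{outcome} ~ 1", df, groups=df[group])
    result = model.fit()
    between = result.cov_re.iloc[0, 0]  # random-intercept (between-group) variance
    within = result.scale               # residual (student-level) variance
    return between / (between + within)

# icc = between_group_share(nss, outcome="overall_satisfaction", group="institution")
# A value near 0.025 would correspond to a 2.5% between-institution share;
# swapping group="subject" runs the same decomposition across subjects of study.
```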
9
NSS responses are able to differentiate between subjects of study
5. Definitely agree
4. Mostly agree
3. Neither agree nor disagree
2. Mostly disagree
1. Definitely disagree
10
'especially to comparisons of different courses—either different courses within the same university or the same course across universities.' (Cheng and Marsh 2010: 709)

'complex relationship between survey questions and subject studied was found in many areas' (Fielding et al. 2010: 365)

'The students with the highest satisfaction score were those undertaking historical and philosophical studies or physical sciences.... Those studying creative arts and design gave the lowest score...' (HEFCE 2010, Findings and Trends 2006-9)

'because of the differences between subjects, it is not sensible to expect similar profiles of scores across all subjects.' (Vaughan and Yorke 2009: 32)
11
Commenting on the results from the National Student Survey 2010, Universities and Science Minister David Willetts said: 'I want to ensure prospective students have comprehensive, up-to-date and comparable information about what is on offer.'

'It is desirable to make available clear guidance about the risks and issues associated with using NSS results for the purposes of comparison' (HEFCE, July 2010)

'Focus on what steps need to be taken to move forward and improve rather than concentrating on over-analysing the scores'; 'aspirational and evidence-based approaches.' Flint et al. (2009)
12
‘Where a survey is intended to be used formatively..... enhancement activities.... an instrument of reasonable quality may be ‘good enough’ for the purpose’ ‘If the survey is summative in intention (NSS) its technical qualities become more important’ (Yorke 2010:734)
13
LJMU NSS approach: formative intention with summative action

Results by faculty, % agreement in 2010 (difference from 2009 in brackets):

Teaching on my course | Assessment and feedback | Academic support | Organisation and management | Learning resources | Personal development | Overall satisfaction
80 (-4) | 60 (-4) | 71 (-6) | 78 (-2) | 80 (-7) | 75 (-6) | 82 (-5)
75 | 62 (0) | 70 (-4) | 74 (-2) | 72 | 74 (-3) | 79 (-2)
80 (-3) | 65 (-7) | 71 (-5) | 50 (-14) | 73 (-5) | 85 (+2) | 77 (+2)
70 (+2) | 52 (+1) | 69 (+1) | 74 (+6) | 78 (-3) | 73 (+4) | 76 (+6)
79 (0) | 65 (-3) | 71 | 73 (+2) | (-3) | 73 (-2) | 74 (-3)
76 (-5) | 59 (-4) | 67 (+6) | 63 (-3) | 67 (-7) | 78 (-4) | 72 (-6)
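The faculty table pairs each 2010 score with its change since 2009. A minimal sketch of that year-on-year calculation, assuming two DataFrames of percentage agreement (faculties by scales) rather than the unit's actual workflow:

```python
# Minimal sketch: build a "2010 score plus difference from 2009" table.
# `scores_2009` and `scores_2010` are assumed DataFrames: faculties (rows)
# by NSS scales (columns), holding % agreement values.
import pandas as pd

def year_on_year(scores_2009: pd.DataFrame, scores_2010: pd.DataFrame) -> pd.DataFrame:
    """Return the 2010 scores with an extra 'diff 09' column per scale."""
    diff = scores_2010 - scores_2009          # aligned on faculty and scale
    out = scores_2010.copy()
    for col in scores_2010.columns:
        out[f"{col} diff 09"] = diff[col]
    return out

# table = year_on_year(scores_2009, scores_2010)
```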
14
Subject Reporting Group: content and discourse analysis

Each theme is rated for positive frequency, positive strength, negative frequency and negative strength:
Teaching; Assessment; Feedback; Academic support; Timetabling and deadlines; Course organisation and facilities; LRC and access to PCs; PDP and skills; Employability and placement; ICT; Accessibility issues; Value for money

Qualitative analysis of open response (free text) questions, 2011 LJMU Academic Enhancement Unit

'The data constitute another possible source of sector-wide information about quality, and descriptive information.... which is not currently exploited'. (HEFCE 2010: 25)
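A sketch of the frequency side of this coding grid, assuming the open-text comments have already been coded with a theme label and a positive/negative polarity; the strength ratings would come from the coders and are not reproduced here.

```python
# Minimal sketch: tally coded free-text comments by theme and polarity to
# produce the positive/negative frequency columns of the coding grid.
# Assumes `coded_comments` has one row per comment with 'theme' and
# 'polarity' columns (values 'positive' / 'negative'); names are hypothetical.
import pandas as pd

def theme_frequencies(coded: pd.DataFrame) -> pd.DataFrame:
    """Return a themes x polarity table of comment counts."""
    counts = coded.groupby(["theme", "polarity"]).size().unstack(fill_value=0)
    return counts.rename(columns={"positive": "Positive frequency",
                                  "negative": "Negative frequency"})

# grid = theme_frequencies(coded_comments)
# print(grid.loc[["Teaching", "Assessment", "Feedback"]])
```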
15
Meta-analysis

Theme | Summary
Teaching | Positive: enthusiasm / expertise; negative: enthusiasm / skills
Assessment | Deadline bunching
Feedback | Delays
Academic support | Strong positive relationships; marking bias
Course organisation | Negative: disruption, communication and repetition in cross-school programmes
Value for money | Issue for programmes with low contact hours

'students did not generate any categories of comments that were associated with their perceptions of quality of their programmes and not addressed...' (Richardson et al. 2007: 571)
16
Open response (free text) alignment: subject reporting group
Text analytics tool
Conceptual structure
Visual representation
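The slide does not name the text analytics tool, so purely as an illustration the sketch below derives a simple conceptual structure, a term co-occurrence matrix, from free-text comments; such a matrix could then be laid out as the visual representation.

```python
# Illustrative sketch only: build a term co-occurrence matrix from open-text
# comments as one simple form of "conceptual structure". The list of comments
# is an assumed input; this is not the tool used by the authors.
import pandas as pd
from sklearn.feature_extraction.text import CountVectorizer

def cooccurrence(comments: list[str], max_terms: int = 30) -> pd.DataFrame:
    """Binary term presence per comment, then term-by-term co-occurrence counts."""
    vec = CountVectorizer(stop_words="english", max_features=max_terms, binary=True)
    x = vec.fit_transform(comments)   # comments x terms (sparse)
    cooc = (x.T @ x).toarray()        # terms x terms co-occurrence counts
    terms = vec.get_feature_names_out()
    return pd.DataFrame(cooc, index=terms, columns=terms)

# matrix = cooccurrence(open_text_comments)
# The matrix can be passed to a network/graph layout for the visual map.
```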
17
Faculty Case Study

SRG (n) | Course | Module | Lectures
SRG 1 (99) | Favourable | Unfavourable | Favourable
SRG 2 (106) | Favourable | Placement unfavourable | Favourable
SRG 3 (92) | Favourable | Unfavourable | Favourable
SRG 4 (56) | Favourable | Unfavourable | Favourable
18
Course

SRG 1: 'The route section of my course has been fantastic'
SRG 2: 'The course provides us with opportunities to experience new activities and help us address current issues regarding our future profession.'
SRG 3: 'They make the course feel so personal'; 'Strong bond behind course year nurtured by trips away'
SRG 4: 'The course provided much scope and covered a wide range of subjects'
19
Module

SRG 1: 'Module did not always comply to what was expected of the course'
SRG 2: 'deadline dates for different modules can be close together if not on the same day'
SRG 3: '...where it is clear where one module supports another and where information can be transferred'
SRG 4: 'Some of the modules did not link to my degree subject'; 'Timetable not evenly distributed throughout the year'; 'I ended up rushing modules'

Outcomes: curriculum review; new timetable
20
Implications for quality enhancement

HEFCE 2010, Recommendation 8: 'Development of analytical tool to enable institutions to analyse free text area of NSS in a consistent manner'
21
CHERI (2010) Enhancing and Developing the National Student Survey. London: HEFCE.
Cheng, J.H.S. and Marsh, H.W. (2010) 'National Student Survey: are differences between universities and courses reliable and meaningful?' Oxford Review of Education, 36(6): 693-712.
Fielding, A., Dunleavy, P.J. and Langan, A.M. (2010) Interpreting context to the UK's national student (satisfaction) survey data for science subjects. Journal of Further and Higher Education, 34(3): 347-368.
Flint, A., Oxley, A., Helm, P. and Bradley, S. (2009) 'Preparing for success: one institution's aspirational and student focused response to the National Student Survey'. Teaching in Higher Education, 14(6): 607-618.
HEFCE (2010) National Student Survey: Findings and trends 2006 to 2009. London: HEFCE.
Marsh, H.W. and Cheng, J.H.S. (2008) Dimensionality, multilevel structure, and differentiation at the level of university and discipline: Preliminary results. York: Higher Education Academy.
Richardson, J.T.E., Slater, J.B. and Wilson, J. (2007) The National Student Survey: development, findings and implications. Studies in Higher Education, 32(5): 557-580.
Surridge, P. (2009) The National Student Survey three years on: What have we learned? York: Higher Education Academy.
Surridge, P. (2006) The National Student Survey 2005: Findings. London: HEFCE.
Vaughan, D. and Yorke, M. (2009) 'I can't believe it's not better': The paradox of NSS scores for Art & Design. York: Higher Education Academy.
Williams, J. and Kane, D. (2008) Exploring the National Student Survey: Assessment and feedback issues. York: Higher Education Academy.
Yorke, M. (2009) 'Student experience' surveys: some methodological considerations and an empirical investigation. Assessment and Evaluation in Higher Education, 34(6): 721-739.