Learning from the NSS: can’t get no satisfaction…

Presentation transcript:

Learning from the NSS: can’t get no satisfaction…

“This year's National Student Survey is a wake-up call to university vice-chancellors. They must buck up their ideas and do far more to improve the experience they offer students.” Aaron Porter, NUS President

Can’t get no satisfaction: discrepancies between NSS qualitative and quantitative data. Implications for quality enhancement.
Dr Clare Milsom, Dr Martyn Stewart, Dr Elena Zaitseva, Academic Enhancement Unit, Liverpool JMU
Surveys for Enhancement 2011

Purposes of the NSS
1. Quality assurance
2. Student choice
3. Improvement of the student learning experience (quality enhancement)

“This year's National Student Survey is a wake-up call to university vice-chancellors. They must buck up their ideas and do far more to improve the experience they offer students.” Aaron Porter, July 2010

The structure of the NSS questionnaire is appropriate (dimensionality of the data)
5. Definitely agree
4. Mostly agree
3. Neither agree nor disagree
2. Mostly disagree
1. Definitely disagree
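NSS headline results are reported as the percentage of respondents who agree with each statement, i.e. who answer 4 (mostly agree) or 5 (definitely agree). A minimal sketch of that calculation, with invented responses:

```python
def percent_satisfied(responses):
    """NSS-style headline score: the percentage of respondents
    answering 4 (mostly agree) or 5 (definitely agree)."""
    agree = sum(1 for r in responses if r >= 4)
    return round(100 * agree / len(responses))

# Invented responses to a single NSS item, coded 1-5
print(percent_satisfied([5, 4, 3, 2, 5, 4, 4, 1, 5, 4]))  # → 70
```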

‘Factor analysis identified the six NSS factors that the NSS was designed to measure but it also considered that assessment and feedback should be considered as separate factors’ (Marsh and Cheng 2008:6) ‘broad evidence for the integrity of the six original scales, and there was some evidence for two additional dimensions relating to feedback and workload’ (Richardson et al. 2007: 578)

The NSS questionnaire shows satisfactory levels of internal consistency and validity
5. Definitely agree
4. Mostly agree
3. Neither agree nor disagree
2. Mostly disagree
1. Definitely disagree

‘...proved to be remarkably robust’; ‘satisfactory levels of internal consistency... construct validity... and concurrent validity’ (Richardson et al. 2007: 578)
‘our exploration of the NSS questionnaire indicated strong convergent validity... and discriminant validity’ (Fielding et al. 2010: 359)
‘Analysis of NSS data has repeatedly provided evidence of the robustness of the instrument and conceptual validity of the dimensions of the student experience it assesses’ (HEFCE 2010: 9)
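The internal consistency cited here is conventionally measured with Cronbach’s alpha: the more the items of a scale co-vary relative to the spread of total scores, the closer alpha is to 1. A minimal, stdlib-only sketch (the Likert responses are invented for illustration):

```python
from statistics import pvariance

def cronbach_alpha(items):
    """Cronbach's alpha for a multi-item scale.

    items: one list of scores per question, all covering the
    same respondents in the same order.
    """
    k = len(items)
    item_vars = sum(pvariance(question) for question in items)
    totals = [sum(scores) for scores in zip(*items)]  # per-respondent totals
    return k / (k - 1) * (1 - item_vars / pvariance(totals))

# Invented 1-5 Likert responses to a three-item scale
q1 = [5, 4, 4, 3, 5, 2]
q2 = [4, 4, 5, 3, 5, 2]
q3 = [5, 3, 4, 2, 4, 3]
print(round(cronbach_alpha([q1, q2, q3]), 2))  # → 0.89
```

Perfectly correlated items give alpha = 1; items that vary independently drive it towards 0.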

NSS responses are able to differentiate between institutions
5. Definitely agree
4. Mostly agree
3. Neither agree nor disagree
2. Mostly disagree
1. Definitely disagree

‘...2.5% of the variation in NSS scores for Overall Satisfaction is due to differences between institutions’ (Surridge 2010: 25)
‘there is much more variation in the responses by students within each university than there is between the different universities’ (Marsh and Cheng 2008: 52)
‘we recommend that NSS ratings should only be used with appropriate caution for comparing universities.’ (Cheng and Marsh 2010: 709)

Assessment and Feedback
  More satisfied: over-25s; international students
  Less satisfied: others; EU students
Teaching and Learning
  More satisfied: white ethnic background
  Less satisfied: all minority ethnic backgrounds
(Williams and Kane 2008; HEFCE 2010)
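Surridge’s 2.5% figure is the share of total score variance that sits between institutions rather than within them. A small sketch of that decomposition on invented data, where wide within-institution spread dwarfs the gap between institution means:

```python
from statistics import mean, pvariance

def between_group_share(groups):
    """Fraction of total score variance that lies between groups
    (e.g. institutions) rather than within them."""
    scores = [s for group in groups for s in group]
    grand_mean = mean(scores)
    between = sum(len(g) * (mean(g) - grand_mean) ** 2 for g in groups) / len(scores)
    return between / pvariance(scores)

# Two invented institutions: wide spread within each,
# only a small gap between their mean scores
uni_a = [5, 4, 3, 2, 5, 2]   # mean 3.5
uni_b = [4, 3, 2, 1, 5, 3]   # mean 3.0
print(round(between_group_share([uni_a, uni_b]), 3))  # → 0.037
```

Even with a half-point gap in means, under 4% of the variance here separates the two institutions, which is the pattern Marsh and Cheng describe.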

NSS responses are able to differentiate between subjects of study
5. Definitely agree
4. Mostly agree
3. Neither agree nor disagree
2. Mostly disagree
1. Definitely disagree

‘especially to comparisons of different courses—either different courses within the same university or the same course across universities.’ (Cheng and Marsh 2010: 709)
‘complex relationship between survey questions and subject studied was found in many areas’ (Fielding et al. 2010: 365)
‘The students with the highest satisfaction score were those undertaking historical and philosophical studies or physical sciences... Those studying creative arts and design gave the lowest score...’ (HEFCE 2010, Findings and Trends)
‘because of the differences between subjects, it is not sensible to expect similar profiles of scores across all subjects.’ (Vaughan and Yorke 2009: 32)

Commenting on the results from the National Student Survey 2010, Universities and Science Minister David Willetts said: ‘I want to ensure prospective students have comprehensive, up-to-date and comparable information about what is on offer.’

‘It is desirable to make available clear guidance about the risks and issues associated with using NSS results for the purposes of comparison’ (HEFCE, July 2010)

‘Focus on what steps need to be taken to move forward and improve rather than concentrating on over-analysing the scores’; ‘Aspirational and evidence-based approaches.’ (Flint et al. 2009)

‘Where a survey is intended to be used formatively... enhancement activities... an instrument of reasonable quality may be “good enough” for the purpose.’
‘If the survey is summative in intention (NSS) its technical qualities become more important.’ (Yorke 2010: 734)

LJMU NSS approach: formative intention with summative action

Faculty-level NSS 2010 scores (% agree, with change from 2009 in brackets):

Faculty | Teaching on my course | Assessment and feedback | Academic support | Organisation and management | Learning resources | Personal development | Overall satisfaction
1 | 80% (-4) | 60% (-4) | 71% (-6) | 78% (-2) | 80% (-7) | 75% (-6) | 82% (-5)
2 | 75% | 62% (0) | 70% (-4) | 74% (-2) | 72% | 74% (-3) | 79% (-2)
3 | 80% (-3) | 65% (-7) | 71% (-5) | 50% (-14) | 73% (-5) | 85% (+2) | 77% (+2)
4 | 70% (+2) | 52% (+1) | 69% (+1) | 74% (+6) | 78% (-3) | 73% (+4) | 76% (+6)
5 | 79% (0) | 65% (-3) | 71% | 73% (+2) | | 73% (-2) | 74% (-3)
6 | 76% (-5) | 59% (-4) | 67% (+6) | 63% (-3) | 67% (-7) | 78% (-4) | 72% (-6)

Subject Reporting Group: content and discourse analysis
Qualitative analysis of open-response (free-text) questions (LJMU Academic Enhancement Unit, 2011). Each theme was coded for positive frequency, positive strength, negative frequency and negative strength:
- Teaching
- Assessment
- Feedback
- Academic support
- Timetabling and deadlines
- Course organisation and facilities
- LRC and access to PCs
- PDP and skills
- Employability and placement
- ICT
- Accessibility issues
- Value for money

‘The data constitute another possible source of sector-wide information about quality, and descriptive information... which is not currently exploited.’ (HEFCE 2010: 25)
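The authors produced their frequency-and-strength coding with a text analytics tool; as an illustration only, here is a toy keyword tally in Python. The theme lexicon and negation cues are invented, and real coding schemes are far richer:

```python
from collections import Counter

# Invented theme lexicon and negation cues, for illustration only
THEMES = {
    "feedback": ["feedback", "marking"],
    "teaching": ["lecturer", "teaching", "enthusiastic"],
    "organisation": ["timetable", "deadline", "organised"],
}
NEGATIVE_CUES = ["not", "never", "late", "poor", "lack"]

def tally(comments):
    """Count comments per (theme, polarity) pair."""
    counts = Counter()
    for comment in comments:
        words = comment.lower().split()
        polarity = "neg" if any(cue in words for cue in NEGATIVE_CUES) else "pos"
        for theme, cues in THEMES.items():
            if any(cue in words for cue in cues):
                counts[(theme, polarity)] += 1
    return counts

comments = [
    "Feedback is never on time",
    "Enthusiastic lecturer",
    "The timetable is poor",
]
print(tally(comments))
```

A production analysis would add stemming, phrase matching and proper sentiment scoring, but the basic shape (theme x polarity counts) matches the table above.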

Meta-analysis

Theme | Summary
Teaching | Positive: enthusiasm / expertise. Negative: enthusiasm / skills
Assessment | Deadline bunching
Feedback | Delays
Academic support | Strong positive relationships; marking bias
Course organisation | Negative: disruption, communication and repetition in cross-school programmes
Value for money | Issue for programmes with low contact hours

‘students did not generate any categories of comments that were associated with their perceptions of quality of their programmes and not addressed...’ (Richardson et al. 2007: 571)

Open-response (free-text) alignment: subject reporting group
- Text analytics tool
- Conceptual structure
- Visual representation

Faculty case study

SRG (n) | Course | Module | Lectures
SRG 1 (99) | Favourable | Unfavourable | Favourable
SRG 2 (106) | Favourable | Placement unfavourable | Favourable
SRG 3 (92) | Favourable | Unfavourable | Favourable
SRG 4 (56) | Favourable | Unfavourable | Favourable

Course
SRG 1: ‘The route section of my course has been fantastic’
SRG 2: ‘The course provides us with opportunities to experience new activities and help us address current issues regarding our future profession.’
SRG 3: ‘They make the course feel so personal’; ‘Strong bond behind course year nurtured by trips away’
SRG 4: ‘The course provided much scope and covered a wide range of subjects’

Module
SRG 1: ‘Module did not always comply to what was expected of the course’
SRG 2: ‘deadline dates for different modules can be close together if not on the same day’
SRG 3: ‘...where it is clear where one module supports another and where information can be transferred’
SRG 4: ‘Some of the modules did not link to my degree subject’; ‘Timetable not evenly distributed throughout the year’; ‘I ended up rushing modules’

Outcomes: curriculum review; new timetable

Implications for quality enhancement
HEFCE 2010, Recommendation 8: ‘Development of analytical tool to enable institutions to analyse free text area of NSS in a consistent manner’

“Peace is not the absence of conflict, but the presence of justice.”

References

CHERI (2010) Enhancing and Developing the National Student Survey. London: HEFCE.
Cheng, J. H. S. and Marsh, H. W. (2010) ‘National Student Survey: are differences between universities and courses reliable and meaningful?’ Oxford Review of Education, 36(6).
Fielding, A., Dunleavy, P. J. and Langan, A. M. (2010) ‘Interpreting context to the UK’s national student (satisfaction) survey data for science subjects’. Journal of Further and Higher Education, 34(3).
Flint, A., Oxley, A., Helm, P. and Bradley, S. (2009) ‘Preparing for success: one institution’s aspirational and student focused response to the National Student Survey’. Teaching in Higher Education, 14(6).
HEFCE (2010) National Student Survey: Findings and Trends. London: HEFCE.
Marsh, H. W. and Cheng, J. H. S. (2008) Dimensionality, Multilevel Structure, and Differentiation at the Level of University and Discipline: Preliminary Results. York: Higher Education Academy.
Richardson, J. T. E., Slater, J. B. and Wilson, J. (2007) ‘The National Student Survey: development, findings and implications’. Studies in Higher Education, 32(5), 557–580.
Surridge, P. (2009) The National Student Survey Three Years On: What Have We Learned? York: Higher Education Academy.
Surridge, P. (2006) The National Student Survey 2005: Findings. London: HEFCE.
Vaughan, D. and Yorke, M. (2009) ‘I Can’t Believe It’s Not Better’: The Paradox of NSS Scores for Art & Design. York: Higher Education Academy.
Williams, J. and Kane, D. (2008) Exploring the National Student Survey: Assessment and Feedback Issues. York: Higher Education Academy.
Yorke, M. (2009) ‘“Student experience” surveys: some methodological considerations and an empirical investigation’. Assessment and Evaluation in Higher Education, 34(6).