Developing a revised approach to considering subject level NSS data – a Durham example Richard Harrison Head of the Academic Support Office 15 April 2011.



Outline
• Durham’s NSS results and aspirations
• Consideration of results
• Review of the existing approach
• Revised approach, 2010 onwards
• Impact of the revised approach

Durham’s NSS results
[Table: scores by NSS scale – Teaching; Assessment and Feedback; Academic Support; Organisation and Management; Learning Resources (86, 87); Personal Development; Overall Satisfaction; Response rate – the remaining figures were not preserved in the transcript]

Durham’s NSS aspirations
‘To be in the top 5 UK multi-subject universities for student satisfaction, undergraduate qualifications on entry and undergraduate completion rates’
– University Strategy, Education key targets

Durham’s comparative NSS performance, 2009 and …

                                     |        2009         |    (later year)
                                     | All HEIs | Excl. specialist | All HEIs | Excl. specialist
Question 22 – Overall Satisfaction   |   20th   |   15th   |   24th   |   19th
Times Good University Guide          |   23rd   |   17th   |   18th   |   14th
Complete University Guide            |   12th   |    8th   |   16th   |   13th

Follow-up: institutional level
• From NSS 2005: high-level summary of outcomes
• From NSS 2007: institution-level quantitative analysis of results for Questions 1 to 2
• From NSS 2008: institution-level qualitative analysis of answers to the free-text questions

Follow-up: subject level
• Departmental action plans developed in Michaelmas Term, covering:
  o all questions with a mean below 3, or with fewer than 70% agreeing
  o all questions where the mean fell by 0.5 or more, or where the mean or the percentage agreeing fell for two consecutive years
• Departmental action plans submitted to the chairs of Faculty Education Committees for consideration
• Faculty-level overviews of departmental results and action plans submitted to the University Education Committee each January
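The flagging thresholds above amount to a simple filter over each department’s question-level results. A minimal sketch (the field names and data structure are illustrative assumptions, not Durham’s actual systems):

```python
# Sketch of the action-plan flagging rules described above.
# All field names and the data layout are illustrative assumptions.

def flag_questions(results):
    """Return (question, reasons) pairs for questions needing an action plan.

    Each entry in `results` is a dict holding this year's mean and % agree,
    plus the previous two years' values (None where unavailable).
    """
    flagged = []
    for r in results:
        reasons = []
        # Rule 1: mean below 3, or fewer than 70% agreeing
        if r["mean"] < 3 or r["pct_agree"] < 70:
            reasons.append("low absolute score")
        # Rule 2: mean fell by 0.5 or more since the previous year
        if r.get("prev_mean") is not None and r["prev_mean"] - r["mean"] >= 0.5:
            reasons.append("mean fell by 0.5 or more")
        # Rule 3: mean (or % agree) fell for two consecutive years
        for key, prev, older in (("mean", "prev_mean", "older_mean"),
                                 ("pct_agree", "prev_pct_agree", "older_pct_agree")):
            if (r.get(prev) is not None and r.get(older) is not None
                    and r[key] < r[prev] < r[older]):
                reasons.append(f"{key} fell two years running")
        if reasons:
            flagged.append((r["question"], reasons))
    return flagged
```

For example, a question with a mean of 2.8 and 65% agreement would be flagged on the absolute-score rule even if it had not fallen year on year.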

Why did we feel this wasn’t working?
• Levelling-off of results at institutional level
• Lack of departmental engagement
• Personal experience of academic staff that, even when implemented well, the University’s approach was not proving effective
• Cumulative conclusions of the institution-level statistical analyses of NSS 2007 to 2009

Conclusion of institution-level statistical analyses
NSS 2007: ‘Analysis of variability between departments suggests that there is a single underlying factor (effectively “overall experience”); that factor links the different subgroups of questions, which otherwise vary largely independently from each other.’
– Report on analysis of NSS data, January 2008

Conclusion of institution-level statistical analyses
NSS 2008: ‘One interesting feature [of the comparator analysis] is that the shape of the profile for an institution generally changes relatively little but moves up and down from year to year. This corresponds to the observation, made last year and earlier in this report, that for Durham, the main difference between departments and across years is that scores on questions all tend to go up and down together, suggesting that effectively one overall factor (quality of experience) is being measured.’
– Report on analysis of NSS data, January 2009

Conclusion of institution-level statistical analyses
NSS 2009: ‘the shape of the profile for many institutions generally changes relatively little but moves up and down from year to year … the main difference between departments and across years is that scores on questions all tend to go up and down together, suggesting that effectively one overall factor (quality of experience) is being measured. The difference in profile between institutions shows that there do seem to be real differences between institutions in aspects of perceived experience.’
– Report on analysis of NSS data, January 2009

Developing a revised approach
‘there comes a point where modernity begins to parody itself, pursuing answers without any sense of the original questions, proliferating devices for achieving ever greater “efficiency” in education as in other spheres’
– N. Blake, P. Smeyers, R. Smith and P. Standish, Thinking Again: Education After Postmodernism (London, 1998), p. 1

Developing a revised approach
‘incremental improvements on specific NSS questions do not necessarily affect overall satisfaction, while improvements in overall satisfaction correlate with improvements across all other questions. It is therefore clearly a more effective use of time and energy to focus on improving overall student satisfaction as such, rather than working to raise scores on other individual NSS questions.’
– Faculty Overview Report of NSS 2009

Developing a revised approach
Faculty NSS Manifesto:
• Core principles underpinning all NSS-related activities:
  o communicate better with students
  o manage obstacles and flashpoints
  o focus on staff–student contact and high-NSS-impact activities
• Faculty commitments to support departments in meeting these principles
• Departmental commitments to undertake certain activities

The revised approach
• Faculty away-day to discuss key themes from the NSS results
• Development and agreement of a Faculty Manifesto
• Consideration of the Faculty Manifesto for approval by the University Education Committee
• Ongoing discussions between faculties and departments throughout the year
• Reflective, backward-looking commentary in the annual review report

Impact of the new approach
NSS results, 2009 and 2010:

Subject      | Increase in Qu. 22 score, 2009 to 2010 | National ranking 2009 | National ranking 2010
Astrology    |  +5%                                   | 38/88                 | 14/84
Campanology  | +15%                                   | 68/85                 | 13/86
Widgetology  |  +6%                                   | 44/59                 | 26/51
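Because the number of ranked departments differs by year and subject (38/88 versus 14/84, for instance), the positions above are easier to compare as fractions of the field. A quick sketch using the figures from the table:

```python
# Convert the slide's national rankings to fractional positions so that
# years with different numbers of ranked departments are comparable.
# Figures taken directly from the table above.
rankings = {
    "Astrology":   ((38, 88), (14, 84)),
    "Campanology": ((68, 85), (13, 86)),
    "Widgetology": ((44, 59), (26, 51)),
}

for subject, ((rank09, n09), (rank10, n10)) in rankings.items():
    frac09, frac10 = rank09 / n09, rank10 / n10  # lower is better
    print(f"{subject}: top {frac09:.0%} in 2009 -> top {frac10:.0%} in 2010")
```

On this measure all three subjects improved markedly, with Campanology moving from the bottom fifth of the field to the top sixth.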

Impact of the new approach
• Increased engagement from academic departments:
  o quality of discussion and engagement with issues
  o genuinely strategic faculty-level strategies
  o ground-up staff development initiatives

Questions?