Jason Leman Education Researcher Sheffield Hallam University

 The "student experience"?
 Differences in expectations and prior ability
 Differences in course within JACS
 Differences in elective choices within course
 Differences in course-mates and academic groups
 Differences in support needs and goals
 Differences in question interpretation and response
 Differences in what part of their experience the student thinks about when filling in the questionnaire

From Marsh and Cheng (2008): variability of NSS responses at JACS level 3, when not able to control for student characteristics.

Are differences across the sector in NSS scores due to institution type or to actual quality?
[Chart: Teaching, Development, Assessment]

[Chart: % agreeing that teachers are enthusiastic, plotted against average tariff of entry]

 Within a single JACS level 3 code, accounting has an average of three different course titles per institution; computer science has ten.
[Chart: ratio of course titles to institutions]
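A rough sketch of how a figure like that can be derived (the records below are invented placeholders, not real course data): count the distinct course titles each institution runs within a JACS level 3 code, then take the ratio of distinct titles to institutions across the sector.

# Minimal sketch (hypothetical data): distinct course titles per institution
# within a JACS level 3 code, and the sector-wide ratio of titles to institutions.
from collections import defaultdict

# (jacs_code, institution, course_title) records -- illustrative only
records = [
    ("N400", "Inst A", "Accounting"),
    ("N400", "Inst A", "Accounting and Finance"),
    ("N400", "Inst B", "Accountancy"),
    ("I100", "Inst A", "Computer Science"),
    ("I100", "Inst A", "Software Engineering"),
    ("I100", "Inst B", "Computing"),
    ("I100", "Inst B", "Games Programming"),
]

titles_by_code_inst = defaultdict(set)
institutions_by_code = defaultdict(set)
titles_by_code = defaultdict(set)

for code, inst, title in records:
    titles_by_code_inst[(code, inst)].add(title)
    institutions_by_code[code].add(inst)
    titles_by_code[code].add(title)

for code in sorted(institutions_by_code):
    n_inst = len(institutions_by_code[code])
    avg_titles_per_inst = sum(
        len(titles_by_code_inst[(code, inst)]) for inst in institutions_by_code[code]
    ) / n_inst
    # Sector-wide ratio: distinct titles in the code per reporting institution
    ratio = len(titles_by_code[code]) / n_inst
    print(f"{code}: avg titles per institution = {avg_titles_per_inst:.1f}, "
          f"ratio of distinct titles to institutions = {ratio:.1f}")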

Correlation between Teaching and other factors on the NSS (Marsh and Cheng 2008)

 There are consistent differences in how students respond to the NSS between groups of students, institutions, and subject areas.
 What a JACS subject title such as "computer science" refers to varies across the sector.
 Different questions relate to different experiences and pedagogical practices, although not necessarily in ways we can simply interpret.
 When benchmarking we either have to look at institutions that are teaching similar students, similar subjects, in similar ways…
 …or be very aware of the differences.

Difference between one institution's NSS scores and three university groups (percentage points):

Question                                                                    Million+ / Post-92   University Alliance   Russell Group
The Teaching on My Course
1. Staff are good at explaining things.                                           1.8%                -0.7%               -3.7%
2. Staff have made the subject interesting.                                       1.1%                -0.9%               -4.2%
3. Staff are enthusiastic about what they are teaching.                           1.5%                -1.5%               -5.0%
4. The course is intellectually stimulating.                                      0.3%                -2.0%              -10.0%
Assessment and Feedback
5. The criteria used in marking have been clear in advance.                      -0.4%                -1.2%                6.4%
6. Assessment arrangements and marking have been fair.                            0.6%                -0.9%               -1.8%
7. Feedback on my work has been prompt.                                          -0.1%                -2.1%                0.2%
8. I have received detailed comments on my work.                                 -2.3%                -3.9%                7.7%
9. Feedback on my work has helped me clarify things I did not understand.        -3.8%                -3.3%                3.7%

Note the consistent difference around teaching factors and feedback. We can hypothesise that these differences are due to consistent differences in pedagogy, student type, and subjects taught across these groups.
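The comparison above is, at heart, a simple subtraction: the institution's "% agree" for each question minus the corresponding group average. A minimal sketch of that calculation follows; the figures are invented, and weighting the group average by respondent numbers is an assumption rather than something stated on the slide.

# Minimal sketch (invented figures): difference between one institution's
# NSS "% agree" and a mission-group average for the same question.
group_institutions = [
    # (institution, pct_agree, respondents) for one NSS question -- placeholders
    ("Group member 1", 88.0, 250),
    ("Group member 2", 83.0, 180),
    ("Group member 3", 90.0, 320),
]
own_score = 84.5  # the institution being benchmarked

# Group average weighted by respondent numbers (an assumption in this sketch)
group_average = (
    sum(pct * n for _, pct, n in group_institutions)
    / sum(n for _, pct, n in group_institutions)
)
difference = own_score - group_average
print(f"Group average: {group_average:.1f}%  difference: {difference:+.1f} percentage points")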

[Chart: % satisfied with the detail of feedback (NSS)]

Scores compared to a group of competitor institutions (SHU 2010 against a three-year competitor average):

Question                                                                         Competitor average   SHU 2010 vs. competitors   Rank
Organisation and Management
13. The timetable works efficiently as far as my activities are concerned.               81%                  -13%              8 of 8
14. Any changes in the course or teaching have been communicated effectively.            68%                  +10%              3 of 8
15. The course is well organised and is running smoothly.                                64%                   +5%              3 of 8
Overall
22. Overall, I am satisfied with the quality of the course.                              79%                   +7%              2 of 8

Key: significant positive difference or trend / significant negative difference or trend / sample or expected response too low for a robust statistical test.

Institutions have been selected based on levels of 2009 applications and on whether they reported in 2010 for this subject group. The last three years of results from the selected competitor institutions have been used to create the sector comparison score, weighted by response, to provide a relatively stable benchmark against which SHU can be compared over time.

Competitors have been selected on the basis of cross-applications. This guarantees a level of similarity with regard to subject, and also makes it likely that those institutions will report for that particular subject (unlike a university-wide comparator list).
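The slide describes the benchmark as three years of competitor results weighted by response. A minimal sketch under that description (the institutions, scores, and respondent counts below are invented) pools the competitor results into a response-weighted average, then ranks SHU's latest score against the competitors' latest scores:

# Minimal sketch (hypothetical figures): a response-weighted competitor
# benchmark built from three years of results, plus a rank for SHU's latest score.
competitor_results = {
    # institution -> list of (year, pct_agree, respondents) for one NSS question
    "Competitor A": [(2008, 78.0, 120), (2009, 80.0, 110), (2010, 82.0, 130)],
    "Competitor B": [(2008, 70.0, 60),  (2009, 73.0, 55),  (2010, 71.0, 65)],
    "Competitor C": [(2008, 85.0, 200), (2009, 84.0, 190), (2010, 86.0, 210)],
}
shu_2010 = 74.0  # SHU's % agree for the same question in 2010 (invented)

# Benchmark = competitor scores pooled over three years, weighted by respondents
weighted_sum = sum(pct * n for results in competitor_results.values()
                   for _, pct, n in results)
total_respondents = sum(n for results in competitor_results.values()
                        for _, pct, n in results)
benchmark = weighted_sum / total_respondents

# Rank SHU's 2010 score against each competitor's most recent score
latest = {inst: max(results)[1] for inst, results in competitor_results.items()}
scores = sorted(list(latest.values()) + [shu_2010], reverse=True)
rank = scores.index(shu_2010) + 1

print(f"Competitor benchmark: {benchmark:.1f}%")
print(f"SHU 2010 vs benchmark: {shu_2010 - benchmark:+.1f} percentage points")
print(f"Rank: {rank} of {len(scores)}")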

[Charts: Against Competitors / Against Institution]

Trends over time alongside a test of significance:

Question                                                                    2009 to …    … to …    … to …
Assessment and Feedback
5. The criteria used in marking have been clear in advance.                      4%           5%        9%
6. Assessment arrangements and marking have been fair.                           0%          11%
7. Feedback on my work has been prompt.                                          3%           5%        8%
8. I have received detailed comments on my work.                                 1%          14%       15%
9. Feedback on my work has helped me clarify things I did not understand.       -1%          14%       13%
Overall
22. Overall, I am satisfied with the quality of the course.                      3%          14%       17%

Key: significant positive difference or trend / significant negative difference or trend / sample or expected response too low for a robust statistical test.

Trends can be a useful way of gauging performance against yourself. Tests of significance are important in reducing the likelihood that subject areas will react to random variation in student responses. For stable courses, trends may be the most relevant benchmark of all, but in themselves they might not be a motivator for action.
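The slides do not say which test of significance is applied to these trends; a common choice for comparing "% agree" between two years is a two-proportion z-test, sketched below with invented respondent counts rather than real NSS returns.

# Minimal sketch: two-proportion z-test for a change in "% agree" between two
# NSS years. The choice of test and the counts are assumptions, not taken from the slides.
import math

def two_proportion_z_test(agree_1, n_1, agree_2, n_2):
    """Return (z, two-sided p-value) for the difference between two proportions."""
    p1, p2 = agree_1 / n_1, agree_2 / n_2
    pooled = (agree_1 + agree_2) / (n_1 + n_2)
    se = math.sqrt(pooled * (1 - pooled) * (1 / n_1 + 1 / n_2))
    z = (p2 - p1) / se
    # Two-sided p-value from the standard normal CDF
    p_value = 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))
    return z, p_value

# Example: one question, 62/100 agreeing in the first year, 76/100 in the second
z, p = two_proportion_z_test(agree_1=62, n_1=100, agree_2=76, n_2=100)
print(f"change: {76/100 - 62/100:+.0%}, z = {z:.2f}, p = {p:.3f}")
print("flag as significant" if p < 0.05 else "treat as random variation")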

"What best predicts educational gain is measures of educational process: what institutions do with their resources to make the most of whatever students they have … In the UK we have few data about the prevalence of these educational practices because they are not systematically documented through quality assurance systems, nor are they (in the main) the focus of the National Student Survey. Class size, the level of student effort and engagement, who undertakes the teaching, and the quantity and quality of feedback to students on their work are all valid process indicators. There is sufficient evidence to be concerned about all four of these indicators in the U K." Gibbs 2010

 We need to benchmark with similar institutions to identify areas of concern.
 Tests of significance need to be used to reduce the impact of random variation.
 The focus in reporting the NSS should be on raising questions, to be answered through more sophisticated evidence.
 The prime use of the NSS should be as a lever for implementing known good educational practice.