Jason Leman Education Researcher Sheffield Hallam University.


1 Jason Leman Education Researcher Sheffield Hallam University

2 The "student experience"?
- Differences in expectations and prior ability
- Differences in course within JACS
- Differences in elective choices within course
- Differences in course-mates and academic groups
- Differences in support needs and goals
- Differences in question interpretation and response
- Differences in what part of their experience the student thinks about when filling in the questionnaire

3 From Marsh and Cheng (2008): variability of NSS responses at JACS level 3, when not able to control for student characteristics.

4 Are differences across the sector in NSS scores due to institution type or actual quality? (Teaching / Development / Assessment)

5 [Chart: % agree teachers are enthusiastic, plotted against average tariff of entry]

6 Within the JACS 3 code, accounting has an average of three different course titles per institution; computer science has ten. [Chart: ratio of course titles to institutions]

7

8 Correlation between Teaching and other factors on the NSS (Marsh and Cheng 2008)

9

10 Summary so far:
- There are consistent differences in how students respond to the NSS between groups of students, institutions, and subject areas.
- What a JACS subject title such as "computer science" refers to varies across the sector.
- Different questions relate to different experiences and pedagogical practices, although not necessarily in ways we can simply interpret.
- When benchmarking we either have to look at institutions that are teaching similar students and similar subjects in similar ways…
- …or be very aware of the differences.

11

12 Difference between one institution's NSS scores and three university groups.

Question | Million+/Post-92 | University Alliance | Russell Group
The Teaching on My Course
1. Staff are good at explaining things. | +1.8% | -0.7% | -3.7%
2. Staff have made the subject interesting. | +1.1% | -0.9% | -4.2%
3. Staff are enthusiastic about what they are teaching. | +1.5% | -1.5% | -5.0%
4. The course is intellectually stimulating. | +0.3% | -2.0% | -10.0%
Assessment and Feedback
5. The criteria used in marking have been clear in advance. | -0.4% | -1.2% | +6.4%
6. Assessment arrangements and marking have been fair. | +0.6% | -0.9% | -1.8%
7. Feedback on my work has been prompt. | -0.1% | -2.1% | +0.2%
8. I have received detailed comments on my work. | -2.3% | -3.9% | +7.7%
9. Feedback on my work has helped me clarify things I did not understand. | -3.8% | -3.3% | +3.7%

Note the consistent difference around teaching factors and feedback. We can hypothesise that these differences are due to consistent differences in pedagogy, student type, and subjects taught at these different groups.
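The comparison in this table can be sketched as a simple percentage-point difference between one institution's score and each mission-group average. All figures below are illustrative placeholders, not real NSS data (the institution scores and group averages are invented so that the Q1 and Q8 rows happen to mirror the table's pattern).

```python
# Sketch: one institution's % agree per NSS question, expressed as a
# percentage-point difference from each mission-group average.
# All figures are illustrative, not real NSS results.

institution = {"Q1": 85.0, "Q8": 70.0}  # hypothetical % agree scores

group_averages = {  # hypothetical group mean % agree per question
    "Million+/Post-92":    {"Q1": 83.2, "Q8": 72.3},
    "University Alliance": {"Q1": 85.7, "Q8": 73.9},
    "Russell Group":       {"Q1": 88.7, "Q8": 62.3},
}

def differences(inst, groups):
    """Percentage-point gap between the institution and each group, per question."""
    return {group: {q: round(inst[q] - avg, 1) for q, avg in avgs.items()}
            for group, avgs in groups.items()}

for group, diffs in differences(institution, group_averages).items():
    print(f"{group}: {diffs}")
```

A positive value means the institution scores above that group's average on that question, a negative value below it.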

13 [Chart: % satisfied with detail of feedback (NSS)]

14 Scores compared to a group of competitor institutions.

Question | Comp average | SHU 2010 vs comp | Rank
Organisation and Management
13. The timetable works efficiently as far as my activities are concerned. | 81% | -13% | 8 of 8
14. Any changes in the course or teaching have been communicated effectively. | 68% | +10% | 3 of 8
15. The course is well organised and is running smoothly. | 64% | +5% | 3 of 8
22. Overall, I am satisfied with the quality of the course. | 79% | +7% | 2 of 8

Key (cell shading): significant positive difference or trend; significant negative difference or trend; sample or expected response too low for a robust statistical test.

Competitors have been selected on the basis of 2009 cross-applications and on whether they reported in 2010 for this subject group. Selecting on cross-applications guarantees a level of similarity with regard to subject, and also makes it likely that those institutions will report for that particular subject (unlike a university-wide comparator list). The last three years of results from the selected competitor institutions, weighted by response, have been used to create the sector comparison score. This provides a relatively stable benchmark against which SHU can be compared over time.
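The response-weighted pooling described above can be sketched as a weighted average of competitor scores. The `weighted_benchmark` helper and all figures below are hypothetical illustrations, not the actual SHU calculation.

```python
# Sketch: pool several competitor-year results for one NSS question into a
# single benchmark, weighting each score by its response count.
# All figures are hypothetical.

def weighted_benchmark(results):
    """results: iterable of (percent_agree, n_responses) pairs."""
    total_responses = sum(n for _, n in results)
    return sum(score * n for score, n in results) / total_responses

competitor_results = [   # (% agree, responses), competitors x years pooled
    (80.0, 120),
    (84.0, 90),
    (78.0, 150),
]

benchmark = weighted_benchmark(competitor_results)
own_score = 88.0         # hypothetical own score for the same question
print(f"benchmark = {benchmark:.1f}%, difference = {own_score - benchmark:+.1f} points")
```

Weighting by response count stops a small competitor cohort from swinging the benchmark as much as a large one, which is what makes the pooled figure relatively stable year to year.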

15 Against Competitors Against Institution

16 Trends over time alongside a test of significance.

Question | 2009 to 2010 | 2008 to 2009 | 2008 to 2010
Assessment and Feedback
5. The criteria used in marking have been clear in advance. | +4% | +5% | +9%
6. Assessment arrangements and marking have been fair. | 0% | +11% | –
7. Feedback on my work has been prompt. | +3% | +5% | +8%
8. I have received detailed comments on my work. | +1% | +14% | +15%
9. Feedback on my work has helped me clarify things I did not understand. | -1% | +14% | +13%
22. Overall, I am satisfied with the quality of the course. | +3% | +14% | +17%

Key (cell shading): significant positive difference or trend; significant negative difference or trend; sample or expected response too low for a robust statistical test.

Trends can be a useful way of gauging performance against yourself. Tests of significance are important in reducing the likelihood that subject areas will react to random variation in student responses. For stable courses trends may be the most relevant benchmark of all, but in themselves they might not be a motivator for action.
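A significance test for a year-on-year change in a percentage-agree score can be sketched as a two-proportion z-test. This is one common choice, assumed here for illustration; the slide does not say which test was actually used, and the sample sizes below are hypothetical.

```python
from math import erf, sqrt

def two_proportion_z_test(p1, n1, p2, n2):
    """Two-sided z-test for a change between two proportions, using a pooled
    standard error. p1, p2 are proportions (0-1); n1, n2 are sample sizes."""
    pooled = (p1 * n1 + p2 * n2) / (n1 + n2)
    se = sqrt(pooled * (1 - pooled) * (1 / n1 + 1 / n2))
    z = (p2 - p1) / se
    # Two-sided p-value from the standard normal CDF, Phi(x) = (1 + erf(x/sqrt(2)))/2
    p_value = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))
    return z, p_value

# e.g. % agree rising from 64% of 150 respondents to 79% of 160 (hypothetical)
z, p = two_proportion_z_test(0.64, 150, 0.79, 160)
print(f"z = {z:.2f}, p = {p:.4f}")  # treat the trend as significant if p < 0.05
```

With NSS cohorts often numbering only a few dozen respondents per course, a swing of several percentage points can easily fail this test, which is exactly why the slide warns against reacting to random variation.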

17

18

19 "What best predicts educational gain is measures of educational process: what institutions do with their resources to make the most of whatever students they have … In the UK we have few data about the prevalence of these educational practices because they are not systematically documented through quality assurance systems, nor are they (in the main) the focus of the National Student Survey. Class size, the level of student effort and engagement, who undertakes the teaching, and the quantity and quality of feedback to students on their work are all valid process indicators. There is sufficient evidence to be concerned about all four of these indicators in the U K." Gibbs 2010

20 Conclusions:
- We need to benchmark against similar institutions to identify areas of concern.
- Tests of significance need to be used, to reduce the impact of random variation.
- NSS reporting should focus on raising questions, to be answered through more sophisticated evidence.
- The prime use of the NSS should be as a lever for implementing known good educational practice.

