Welcome to today’s teaching and learning conversation with Prof. Mark Langan

Presentation transcript:

Welcome to today’s teaching and learning conversation with Prof. Mark Langan
Title: The S&M of HE: Surveys and Metrics
If you can hear the presenter speaking, please change your status to agree.
While we are waiting to start, please take the opportunity to run through the Audio Setup Wizard (one last time):
1. Click “Meeting” (top left)
2. Select the Audio Setup Wizard option
3. Follow the on-screen instructions
Let us know if you are having a problem by typing “Help” into the Chat box and we’ll see what we can do.
Important note: at this point you may be able to hear and see Rod and Calum in the video pod, but not any of the other participants or yourself. We have done this on purpose. Please bear with us while we get organised and take the time to run the Audio Setup Wizard again.
Tuesday 29th September 2014

We are about to start recording the webinar. Please note that we make TLC recordings publicly available at: https://tlcwebinars.wordpress.com/tlc-archive/

The S&M of HE: Surveys and Metrics
Mark Langan (m.langan@mmu.ac.uk), MMU
24th May 2016

Something of me…
Teacher (compulsively)
Biologist (ecology and behaviour)
HE researcher: L&T design and learner empowerment
Student surveys (NSS, UKES)
Learner engagement/disengagement (SoTL)
Learning gains (RAND)
Benchmarking HE (NHS)
Playful learning (July 13th-15th 2016)
About you – where are you based?
http://www.rand.org/content/dam/rand/pubs/research_reports/RR900/RR996/RAND_RR996.pdf
https://www.heacademy.ac.uk/research/surveys/united-kingdom-engagement-survey-ukes
https://www.escholar.manchester.ac.uk/uk-ac-man-scw:1c7231
http://www.tandfonline.com/doi/pdf/10.3108/beej.11.1

The ‘3P’ model (Biggs, 1993) approaches education as a complex system with ‘Presage’, ‘Process’ and ‘Product’ variables interacting with each other. The ‘3P’ model is essentially the same as that used by large-scale studies in the US (e.g. Astin, 1977, 1993): the ‘Input-Environment-Output’ model.
[Slide diagram: pre-course, in-course and post-course variables, including expectations, A level tariff, pre-entry experiences, recruitment, media, reputation, KIS, engagement, retention, progression, attainment, good honours, experience, NSS, employment, alumni, collaboration, capacity and consolidation]

When you hear the words “National Student Survey” (or “NSS”) – what is your reaction?

Aims of the NSS (original):
1. To inform the choices of future students, alongside other sources of information about teaching quality.
2. To contribute to public accountability by supporting external audits of institutions by the QAA.

What am I trying to achieve in HE?
Students:
Learning goals (achievement, progression, motivation)
Satisfaction with experience (wants and needs)
Future success (employment, further education)
Staff and institution:
Quality assurance/enhancement
Staff satisfaction and well-being (motivation)
Productivity and success
Reputation/league tables
Financial security (resources and reputation)
Am I missing anything?

So what does the NSS ‘measure’?

[Slide: the NSS questionnaire rating scale, 5 4 3 2 1 NA]

Source: Hewson (2011) www.mathstore.ac.uk/headocs/Hewson.pdf
“… attempt to measure quality across disciplines ... you find that some disciplines emerge consistently better than others, across different studies and different institutions. Either one has to accept that certain subjects are always taught less well than others, which seems highly unlikely, or that different measures of quality are better aligned with the consequences of some (disciplinary) pedagogic practices than with others… Comparing quality between disciplines is fraught with difficulties.” (Gibbs, 2010, p.46)
(This text is derived from Mantz Yorke: http://www.gladhe.org.uk/wp-content/uploads/2014/05/GLAD_2012_M.-Yorke_NSS_A_Perfect_Storm.pdf)

Langan, A.M., P.J. Dunleavy and A.F. Fielding (2013). Applying Models to National Surveys of Undergraduate Science Students: What Affects Ratings of Satisfaction? Education Sciences, 3, 193-207; doi:10.3390/educsci3020193. http://www.mdpi.com/journal/education
Source: Langan et al. (2013)

NSS ratings: Pakistani students compared to White British students
Aftab Dean (Leeds Met Uni, 2010), Business subjects
https://www.heacademy.ac.uk/sites/default/files/enhancing_the_student_experience.pdf
[Chart: comparison across the NSS scales - Teaching, Assessment & Feedback, Academic Support, Organisation & Management, Learning Resources, Personal Development]
Note: Pakistani female students had lower NSS scores when living at home.

Jacqueline H.S. Cheng & Herbert W. Marsh (2010) National Student Survey: are differences between universities and courses reliable and meaningful? Oxford Review of Education, 36:6, 693-712. DOI: 10.1080/03054985.2010.491179

Which of the NSS questions (Q1-Q21) best predict the final question (Q22), “Overall I am satisfied with my course”? The question was originally asked in order to improve feedback, with the aim of improving NSS Q22 (Alan Fielding).

[Slide: the NSS questionnaire rating scale, 5 4 3 2 1 NA]

Effectiveness of Q1-21 in predicting overall satisfaction (Q22)
Predicting questionnaire item: Inc MSE (%)
Q15 - The course is well organised and is running smoothly: 119.89
Q1 - Staff are good at explaining things: 71.45
Q4 - The course is intellectually stimulating: 66.71
Q14 - Any changes in the course or teaching have been communicated effectively: 60.79
Q10 - I have received sufficient advice and support with my studies: 55.34
Q11 - I have been able to contact staff when I needed to: 43.40
Q3 - Staff are enthusiastic about what they are teaching: 40.08
Q2 - Staff have made the subject interesting: 38.26
Q12 - Good advice was available when I needed to make study choices: 35.27
Subject: 32.35
Q6 - Assessment arrangements and marking have been fair: 20.10
Q17 - I have been able to access general IT resources when I needed to: 18.73
Q19 - The course has helped me present myself with confidence: 17.35
Q18 - I have been able to access specialised equipment, facilities or rooms when I needed to: 15.41
Q16 - The library resources and services are good enough for my needs: 15.34
Q20 - My communication skills have improved: 13.29
Q13 - The timetable works efficiently as far as my activities are concerned: 13.16
Q7 - Feedback on my work has been prompt: 10.49
Q9 - Feedback on my work has helped me clarify things I did not understand: 6.65
Q5 - The criteria used in marking have been clear in advance: 6.60
Q21 - As a result of the course, I feel confident in tackling unfamiliar problems: 3.32
Q8 - I have received detailed comments on my work: 3.04
Source: Langan et al. (2013), science subjects. A higher Inc MSE (%) indicates a better predictor of Q22. All years are combined for the analysis, which includes subject grouping.
Speaker notes: point out the lack of influence of the feedback questions in explaining residual variation (despite feedback being rated lower than many other questions). Subject is included so it can be accounted for; despite a lot of variability within subjects (remember Alan’s Biology slide), it is still a better predictor than most items.
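
For readers who want to produce this kind of ranking on their own survey data, here is a minimal sketch. It is not the analysis from Langan et al. (2013), which reports the Inc MSE (%) importance measure typically produced by random forest models; the sketch approximates the idea with scikit-learn's permutation importance, and the data, sample sizes and coefficients below are entirely synthetic and invented for illustration.

```python
# Hedged sketch: ranking NSS items Q1-Q21 as predictors of overall satisfaction (Q22).
# NOT the authors' code: synthetic data and a permutation-importance proxy for Inc MSE (%).
import numpy as np
import pandas as pd
from sklearn.ensemble import RandomForestRegressor
from sklearn.inspection import permutation_importance

rng = np.random.default_rng(42)
n = 500  # hypothetical number of course-level records

# Synthetic stand-in data: mean agreement scores (1-5 scale) for items Q1-Q21.
X = pd.DataFrame(rng.uniform(2.5, 5.0, size=(n, 21)),
                 columns=[f"Q{i}" for i in range(1, 22)])
# Synthetic Q22 driven mostly by Q15 (organisation) and Q1 (explaining), plus noise,
# to mimic the pattern reported on the slide.
y = 0.5 * X["Q15"] + 0.3 * X["Q1"] + 0.2 * X["Q4"] + rng.normal(0, 0.3, n)

model = RandomForestRegressor(n_estimators=500, random_state=0).fit(X, y)

# Permutation importance with an MSE-based score, loosely analogous to "Inc MSE (%)".
imp = permutation_importance(model, X, y, scoring="neg_mean_squared_error",
                             n_repeats=20, random_state=0)
ranking = pd.Series(imp.importances_mean, index=X.columns).sort_values(ascending=False)
print(ranking.head(5))  # items whose permutation hurts prediction of Q22 the most
```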

Some questions are more related to each other (e.g. Teaching/Support) than others (e.g. Resources). Assessment and Feedback were split for the analyses.
Source: Fielding, A.F., P.J. Dunleavy and A.M. Langan (2010) Effective use of the UK’s National Student (Satisfaction) Survey (NSS) data in science and engineering subjects. Journal of Further and Higher Education, 34, 347-368.
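
One rough way to see which questions "travel together", in the spirit of the observation above (though not the method used by Fielding et al., 2010), is to cluster the items on their pairwise correlations. The data frame, sample size and number of clusters below are invented placeholders.

```python
# Hedged sketch: hierarchical clustering of NSS items on 1 - correlation, to see which
# questions group together (e.g. teaching vs resources). Illustrative only.
import numpy as np
import pandas as pd
from scipy.cluster.hierarchy import linkage, fcluster
from scipy.spatial.distance import squareform

# Synthetic stand-in for per-respondent item scores, columns Q1..Q21.
rng = np.random.default_rng(1)
X = pd.DataFrame(rng.uniform(1, 5, size=(400, 21)),
                 columns=[f"Q{i}" for i in range(1, 22)])

corr = X.corr()                    # item-by-item Pearson correlations
dist = 1.0 - corr.values           # turn similarity into a distance
np.fill_diagonal(dist, 0.0)        # exact zeros on the diagonal
Z = linkage(squareform(dist, checks=False), method="average")

# Cut the tree into four groups and list which items fall together.
groups = fcluster(Z, t=4, criterion="maxclust")
for g in sorted(set(groups)):
    print(g, [q for q, grp in zip(corr.columns, groups) if grp == g])
```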

NSS Feedback Qs
[Table: correlations (r) and p values for the level of agreement with, as an example, the three feedback questions (Q7 prompt, Q8 detailed, Q9 helpful) and overall satisfaction (Q22), by subject: Biological Sciences, Physical Sciences, Physical Geography, Mathematical Sciences, Computer Sciences, Mechanically-based Engineering, Electrical and Electronic Engineering, Technology, Human Geography]
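
As a rough illustration of how a table like this might be produced (not the authors' actual code), the sketch below computes Pearson r and p for each feedback item against Q22 within each subject group. The data frame, respondent counts and subject labels are all synthetic placeholders.

```python
# Hedged sketch: per-subject correlations between the three NSS feedback items
# (Q7 prompt, Q8 detailed, Q9 helpful) and overall satisfaction (Q22).
# All data here are made up, not real NSS results.
import numpy as np
import pandas as pd
from scipy.stats import pearsonr

rng = np.random.default_rng(0)
subjects = ["Biological Sciences", "Physical Sciences", "Computer Sciences"]
df = pd.DataFrame({
    "subject": rng.choice(subjects, size=300),
    "Q7": rng.integers(1, 6, 300),   # prompt feedback (1-5 agreement)
    "Q8": rng.integers(1, 6, 300),   # detailed comments
    "Q9": rng.integers(1, 6, 300),   # feedback helped me clarify
    "Q22": rng.integers(1, 6, 300),  # overall satisfaction
})

rows = []
for subject, grp in df.groupby("subject"):
    for item in ["Q7", "Q8", "Q9"]:
        r, p = pearsonr(grp[item], grp["Q22"])
        rows.append({"subject": subject, "item": item, "r": round(r, 2), "p": round(p, 3)})
print(pd.DataFrame(rows))
```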

Predicting Q22 from Q1-Q21: winners and losers?

At MMU we run an internal student survey (ISS) based on the ‘best predictor’ items from each dimension of the NSS.

1. Staff on my course are good at explaining things
2. Feedback on my work helped me to clarify things I did not understand
3. I have received sufficient advice and support with my studies
4. The course is well organised and is running smoothly
5. University resources are appropriate to my learning needs
6. The course has helped me develop confidence and skills to succeed

MMU’s Internal Student Survey
CONFIDENCE (18%): ISS score for building confidence & skills
ORGANISATION (16%): ISS score for course organisation
EXPLANATION (14%): ISS score for explanation (teaching)
ADVICE (13%): ISS score for advice & support
RESOURCES (11%): ISS score for learning resources (library, IT, etc.)
FEEDBACK: ISS score for assessment feedback
CLEARING (3%): whether the student entered via clearing
OCC: year and mode of study
JACS: high-level subject code
FACULTY (2%): faculty

MMU’s Internal Student Survey: predictive accuracy and top-ranked factors
Unit (predictive accuracy): factors ranked #1, #2, #3
MMU (73%): CONFIDENCE (18%), ORGANISATION (16%), EXPLANATION (14%)
School of Art (65%): ORGANISATION (17%), ADVICE (15%)
HPSC (72%): CONFIDENCE (19%), ORGANISATION (18%)
Hollings (70%): EXPLANATION (15%)
HLSS (72%): CONFIDENCE (22%), EXPLANATION (18%)
Business & Law (74%)
Sci & Eng (75%): CONFIDENCE (17%), EXPLANATION (17%)
Cheshire (73%): CONFIDENCE (20%), ORGANISATION (19%)

… why? If you run surveys like this, do you respond more to the metrics or to what was written in the comments?

Text comments: ratios of TEACH (staff) vs ORG & MAN
[Chart: frequency of comments for a higher achiever and a lower achiever in the NSS]
Source: Langan, A.M., N. Scott, S.N. Partington and A. Oczujda (2015) Coherence between text comments and the quantitative ratings in the UK’s National Student Survey. Journal of Further and Higher Education, 1-14. DOI: 10.1080/0309877X.2014.1000281.
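
To give a flavour of what a comment-ratio analysis could look like in practice (this is not the coding approach of Langan et al., 2015), here is a toy keyword-based count of teaching-themed versus organisation-themed free-text comments. The keyword lists and most of the comments are invented; only the first two phrases echo the example quotes on a later slide.

```python
# Hedged sketch: a toy ratio of teaching-themed to organisation-themed comments.
# A naive keyword match, purely to illustrate comparing comment frequencies by theme.
TEACH_WORDS = {"lecturer", "lecturers", "teaching", "tutor", "tutors", "staff"}
ORG_WORDS = {"organised", "organisation", "timetable", "communication", "admin"}

comments = [
    "A handful of lecturers are fantastic",       # phrases quoted on the slides
    "Varies from tutor to tutor",
    "The timetable changed at the last minute",   # invented examples
    "Poor communication about room changes",
]

def count_theme(texts, vocab):
    """Count comments containing at least one keyword from the theme vocabulary."""
    return sum(any(word in text.lower().split() for word in vocab) for text in texts)

teach = count_theme(comments, TEACH_WORDS)
org = count_theme(comments, ORG_WORDS)
print(f"TEACH comments: {teach}, ORG & MAN comments: {org}, ratio: {teach / max(org, 1):.2f}")
```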

Faculty-level
[Chart: faculties positioned by comment balance (+TEACH / -ORG & MAN versus -TEACH / +ORG & MAN) against NSS scores below or above the national mean]
Source: Langan et al. (2015).

Casting a shadow? "a handful of lecturers are fantastic"... "varies from tutor to tutor"... "a few lecturers make it difficult…"

Does fear of ‘slipping in the metrics’ stifle L&T innovation?

Final thoughts
“Not everything that can be counted counts, and not everything that counts can be counted” - not Albert Einstein.
Quote Investigator (http://quoteinvestigator.com/2010/05/26/everything-counts-einstein/) suggests crediting William Bruce Cameron instead of Albert Einstein. Cameron’s 1963 text “Informal Sociology: A Casual Introduction to Sociological Thinking” contained the following passage: “It would be nice if all of the data which sociologists require could be enumerated because then we could run them through IBM machines and draw charts as the economists do. However, not everything that can be counted counts, and not everything that counts can be counted.”

If there’s time…

“The best predictors of educational gain are measures of educational process…” http://www.heacademy.ac.uk/assets/documents/evidence_informed_practice/Dimensions_of_Quality.pdf

Source: National Survey of Student Engagement (NSSE), Indiana University: http://nsse.indiana.edu

‘Cuddle factor’: Individuals and masses Institution Processes and people