7th eSTEeM Annual Conference

Presentation transcript:

7th eSTEeM Annual Conference: Critical discussion of Student Evaluation scores and academic performance at the OU. If you want to vote and share, log in at: https://pollev.com/bartrienties552 @DrBartRienties

Background of QAA Study 2015 HE is an increasingly competitive market: student satisfaction has become an important component of Quality Assurance (QA) and Quality Enhancement (QE) (Kember & Ginns, 2012; Rienties, 2014). Measurement of student satisfaction is important to pinpoint strengths and identify areas for improvement (Coffey & Gibbs, 2001; Zerihun, Beishuizen, & Os, 2012). The potential benefits and drawbacks of student evaluations have been well documented in the literature (see for example Bennett & De Bellis, 2010; Crews & Curtis, 2011), and recent research continues to suggest strong resistance amongst academic staff (Crews & Curtis, 2011; Moskal et al., 2015; Rienties, 2014). Most student survey instruments lack focus on key elements of rich learning, such as interaction, assessment and feedback, and an emerging body of literature questions the appropriateness of student satisfaction for measuring teacher effectiveness (Marsh, 2007; Li et al., 2016; Uttl et al., 2017). Rienties, B., Li, N., & Marsh, V. (2015). Modeling and managing student satisfaction: use of student feedback to enhance learning experience. Subscriber Research Series 2015-16. Gloucester: Quality Assurance Agency.

Key Questions of the Project To what extent are institutions using insights from the NSS and institutional surveys to transform their students’ experience? What are the key enablers and barriers for integrating student satisfaction data with QA and QE? How are student experiences influencing quality enhancements? What influences students’ perceptions of overall satisfaction the most? Are student characteristics or module/presentation-related factors more predictive than satisfaction with other aspects of their learning experience? Is the student cohort homogeneous when considering the key drivers of satisfaction? For example, are there systematic differences depending on the level or programme of study? Rienties, B., Li, N., & Marsh, V. (2015). Modeling and managing student satisfaction: use of student feedback to enhance learning experience. Subscriber Research Series 2015-16. Gloucester: Quality Assurance Agency.

Methodology (Logistic Regression) & Validation Step 1: A descriptive analysis was conducted to discount variables that were unsuitable for satisfaction modelling; highly correlated predictors were also identified and the most appropriate of each was selected. Step 2: UG new, UG continuing, PG new and PG continuing students were modelled separately. Each subset of variables was modelled in groups, and the statistically significant variables from each subset were then combined and modelled to identify the final list of key drivers. Step 3 (Validation): all models were verified using subsets of the whole data to ensure the solutions are robust, and a variety of model fit statistics were used to identify the optimum solutions. At this step we found that the combined scale provided the simplest and most interpretable solution for PG students, and the whole scale for UG students; the solution without the KPIs included was much easier to use in terms of identifying clear priorities for action. (Diagram: predictor subsets, covering module, presentation, student, concurrency and study history, modelled against overall satisfaction from the SEaM survey.)
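As a rough illustration of the three steps above, a minimal Python sketch follows. All column names, the correlation threshold and the significance cut-off are assumptions; the slides do not specify the actual SEaM variables or model settings.

```python
# Minimal sketch of the three-step modelling procedure described above.
# Column names, the 0.8 correlation threshold and the 0.05 significance
# cut-off are hypothetical; the slides do not give them.
import pandas as pd
import statsmodels.api as sm

df = pd.read_csv("seam_responses.csv")  # hypothetical survey extract

# Step 1: screen out unsuitable variables, then keep only one of each
# pair of highly correlated predictors.
candidates = ["tutor_support", "materials_quality", "assessment_load",
              "workload", "advice_quality"]
corr = df[candidates].corr().abs()
keep = [c for i, c in enumerate(candidates)
        if not (corr.iloc[i, :i] > 0.8).any()]

# Step 2: model each student group (UG new, UG continuing, PG new,
# PG continuing) separately; refit on the significant predictors to
# obtain the final key drivers for that group.
final_models = {}
for group, sub in df.groupby("student_group"):
    fit = sm.Logit(sub["satisfied"], sm.add_constant(sub[keep])).fit(disp=0)
    drivers = [c for c in keep if fit.pvalues[c] < 0.05]
    final_models[group] = sm.Logit(sub["satisfied"],
                                   sm.add_constant(sub[drivers])).fit(disp=0)

# Step 3 (validation): refit on a random subset of the data and compare
# fit statistics to check that each solution is robust.
for group, fit in final_models.items():
    holdout = df[df["student_group"] == group].sample(frac=0.5, random_state=0)
    drivers = list(fit.params.index[1:])  # drop the constant
    refit = sm.Logit(holdout["satisfied"],
                     sm.add_constant(holdout[drivers])).fit(disp=0)
    print(group, "full AIC:", round(fit.aic, 1),
          "subset AIC:", round(refit.aic, 1))
```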

According to 111,000+ students, what distinguishes excellent from good to not-so-good modules? Good advice from teachers; links well to professional practice; links well to qualifications; quality of teaching materials; quality of tutors. Li, N., Marsh, V., Rienties, B., & Whitelock, D. (2017). Online learning experiences of new versus continuing learners: a large scale replication study. Assessment & Evaluation in Higher Education, 42(4), 657-672. Impact factor: 1.243

Li, N., Marsh, V., Rienties, B., & Whitelock, D. (2017). Online learning experiences of new versus continuing learners: a large scale replication study. Assessment & Evaluation in Higher Education, 42(4), 657-672. Impact factor: 1.243

How does student satisfaction relate to module performance? (Chart: satisfaction plotted against students who successfully completed the module.)

Ullmann, T., Lay, S., & Rienties, B. (2017). Data wranglers’ key metric report. IET Data Wranglers, Open University.

Is satisfaction related to students’ behaviour and performance? Data sets merged and cleaned: learning design data (>300 modules mapped); VLE data (>140 modules with individual data aggregated weekly, >37 modules with fine-grained individual data daily); student feedback data (>140 modules); academic performance data (>140 modules); predictive analytics data (>40 modules). 111,256 students undertook these modules; 5,131 students responded (28%, with response rates between 18% and 76%). Rienties, B., & Toetenel, L. (2016). The impact of learning design on student behaviour, satisfaction and performance: a cross-institutional comparison across 151 modules. Computers in Human Behavior, 60, 333-341.
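A minimal sketch of the kind of merge this describes follows, assuming hypothetical file names and a student-by-module key; the OU’s actual schema is not given in the slides.

```python
# Sketch of merging the data sources listed above. File names and keys
# ("module", "student_id") are assumptions, not the OU's actual schema.
import pandas as pd

design = pd.read_csv("learning_design.csv")        # one row per module
vle = pd.read_csv("vle_engagement_weekly.csv")     # student x week activity
feedback = pd.read_csv("seam_feedback.csv")        # satisfaction responses
performance = pd.read_csv("module_results.csv")    # pass/fail outcomes

# Aggregate weekly VLE clicks to one engagement figure per student.
vle_agg = (vle.groupby(["module", "student_id"])["clicks"]
              .sum().rename("total_clicks").reset_index())

# Merge everything at the student-module level, keeping every student
# present in the performance data.
merged = (performance
          .merge(vle_agg, on=["module", "student_id"], how="left")
          .merge(feedback, on=["module", "student_id"], how="left")
          .merge(design, on="module", how="left"))
```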

Level of study predicts satisfaction. Learning design activities for finding information, being productive, and assessment negatively predict satisfaction. Assimilative learning design (the benchmark category) and interactive learning design positively predict satisfaction.
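Read as signs in the logistic model described in the methodology, these findings would take roughly the form below. This is a sketch only; the exact specification and terms are in Rienties & Toetenel (2016).

\[
\log\frac{P(\text{satisfied})}{1 - P(\text{satisfied})}
  = \beta_0 + \beta_1\,\text{level}
  + \beta_2\,\text{finding info}
  + \beta_3\,\text{productive}
  + \beta_4\,\text{assessment}
  + \beta_5\,\text{interactive}
\]

with \(\beta_2, \beta_3, \beta_4 < 0\) and \(\beta_5 > 0\), assimilative design serving as the benchmark category.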

Size of module and discipline predict completion. Satisfaction is unrelated to pass rates. Learning design (communication activities) predicts completion.

(Diagram, 150+ modules: paths linking learning design types (constructivist, assessment, productive, socio-constructivist), communication, weekly VLE engagement from week 1 to week 30+, student satisfaction and student retention.) Cluster analysis of 40 modules (>19k students) indicates that module teams design four different types of modules: constructivist, assessment-driven, balanced, or socio-constructivist. The LAK paper by Rienties and colleagues indicates that VLE engagement is higher in modules with socio-constructivist or balanced-variety learning designs, and lower for constructivist designs. In terms of learning outcomes, students rate constructivist modules higher and socio-constructivist modules lower; however, in terms of student retention (% of students passed), constructivist modules have lower retention, while socio-constructivist modules have higher retention. Thus, learning design strongly influences behaviour, experience and performance (and we believe we are the first to have mapped this with such a large cohort). Rienties, B., & Toetenel, L. (2016). The impact of learning design on student behaviour, satisfaction and performance: a cross-institutional comparison across 151 modules. Computers in Human Behavior, 60, 333-341.
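The four-way typology above suggests clustering modules by their learning design profiles. A minimal sketch of how such a clustering might look follows; the slides do not name the algorithm, so k-means on the share of study time per activity type is a stand-in, and all column names are hypothetical.

```python
# Sketch of clustering modules by learning design profile. k-means and
# the column names are stand-ins; the slides do not give either.
import pandas as pd
from sklearn.cluster import KMeans
from sklearn.preprocessing import StandardScaler

design = pd.read_csv("learning_design.csv")  # one row per module
activity_cols = ["assimilative", "finding_info", "communication",
                 "productive", "interactive", "assessment"]

# Standardise so no single activity type dominates the distance metric.
X = StandardScaler().fit_transform(design[activity_cols])

# Four clusters, matching the four module types reported on the slide.
kmeans = KMeans(n_clusters=4, n_init=10, random_state=0)
design["cluster"] = kmeans.fit_predict(X)

# Inspect the mean activity profile of each cluster to label it, e.g.
# constructivist, assessment-driven, balanced, socio-constructivist.
print(design.groupby("cluster")[activity_cols].mean().round(2))
```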

Conclusions (Part I) Student satisfaction is important for enhancing teaching and learning practice, but has a limited relation to learning outcomes. Learning design strongly influences student engagement, satisfaction and performance.

Conclusions (Part II) How to improve our understanding of students? Talk to them (e.g., OU Live, discussion forums); ask for frequent feedback (e.g., online post box, discussion forums). How to interpret student evaluation findings? Use them as a developmental tool for your own teaching and learning, and ask what other teachers have learned.

Critical discussion of Student Evaluation scores and academic performance at the OU @DrBartRienties