
7th eSTEeM Annual Conference



Presentation on theme: "7th eSTEeM Annual Conference"— Presentation transcript:

1 7th eSTEeM Annual Conference
Critical discussion of Student Evaluation scores and academic performance at the OU
If you want to vote and share, log in to: @DrBartRienties

2

3

4

5 Background of QAA Study 2015
HE is an increasingly competitive market: student satisfaction has become an important component of Quality Assurance (QA) and Quality Enhancement (QE) (Kember & Ginns, 2012; Rienties, 2014).
Measurement of student satisfaction is important to pinpoint strengths and identify areas for improvement (Coffey & Gibbs, 2001; Zerihun, Beishuizen, & Os, 2012).
The potential benefits and drawbacks of student evaluations are well documented in the literature (see, for example, Bennett & De Bellis, 2010; Crews & Curtis, 2011), and recent research continues to suggest strong resistance amongst academic staff (Crews & Curtis, 2011; Moskal et al., 2015; Rienties, 2014).
Most student survey instruments lack focus on key elements of rich learning, such as interaction, assessment and feedback.
An emerging body of literature questions the appropriateness of student satisfaction for measuring teacher effectiveness (Marsh, 2007; Li et al., 2016; Uttl et al., 2017).
Rienties, B., Li, N., & Marsh, V. (2015). Modeling and managing student satisfaction: use of student feedback to enhance learning experience. Subscriber Research Series. Gloucester: Quality Assurance Agency.

6 Key Questions of the Project
To what extent are institutions using insights from NSS and institutional surveys to transform their students’ experience?
What are the key enablers and barriers for integrating student satisfaction data with QA and QE?
How are student experiences influencing quality enhancements?
What influences students’ perceptions of overall satisfaction the most? Are student characteristics or module/presentation-related factors more predictive than satisfaction with other aspects of their learning experience?
Is the student cohort homogeneous when considering key drivers of satisfaction? For example, are there systematic differences depending on the level or programme of study?
Rienties, B., Li, N., & Marsh, V. (2015). Modeling and managing student satisfaction: use of student feedback to enhance learning experience. Subscriber Research Series. Gloucester: Quality Assurance Agency.

7

8 Methodology (Logistic Regression) & Validation
Step 1: A descriptive analysis was conducted to discount variables that were unsuitable for satisfaction modelling. Step 1 also identified highly correlated predictors and methodically selected the most appropriate.
Step 2: UG new, UG continuing, PG new and PG continuing students were modelled separately. Each subset of variables was modelled in groups; the variables that were statistically significant in each subset were then combined and modelled to identify the final list of key drivers.
Step 3: We found that the combined scale provided the simplest and most interpretable solution for PG students, and the whole scale for UG students. The solution without the KPIs included was much easier to use in terms of identifying clear priorities for action.
Validation: all models were verified using subsets of the whole data to ensure the solutions are robust. A variety of model-fit statistics were also used to identify the optimum solutions.
[Diagram: Module, Presentation, Student, Concurrency, Study history → Overall Satisfaction (SEaM)]
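A minimal sketch of the stepwise approach above, assuming a plain logistic regression with Wald tests: model groups of predictors separately, keep only the statistically significant ones, then refit the survivors as a combined model of key drivers. The data, group names and the Newton-Raphson fitter are invented for illustration; the study's actual variables and software are not specified here.

```python
import math
import numpy as np

def fit_logit(X, y, iters=25):
    """Newton-Raphson logistic regression; returns coefficients and Wald p-values."""
    X1 = np.column_stack([np.ones(len(X)), X])   # prepend an intercept column
    b = np.zeros(X1.shape[1])
    for _ in range(iters):
        p = 1.0 / (1.0 + np.exp(-X1 @ b))
        W = p * (1.0 - p)
        H = X1.T @ (X1 * W[:, None])             # observed Fisher information
        b += np.linalg.solve(H, X1.T @ (y - p))
    se = np.sqrt(np.diag(np.linalg.inv(H)))      # standard errors from the Hessian
    pvals = np.array([math.erfc(abs(z) / math.sqrt(2)) for z in b / se])
    return b, pvals

# Synthetic survey data: four predictors, of which only columns 0 and 2
# actually drive the binary "overall satisfaction" outcome.
rng = np.random.default_rng(1)
n = 2000
X = rng.normal(size=(n, 4))
logit = 1.2 * X[:, 0] - 0.9 * X[:, 2]
y = (rng.random(n) < 1.0 / (1.0 + np.exp(-logit))).astype(float)

# Step 2: model each (hypothetical) subset, keep predictors significant at p < 0.05
groups = {"module-related": [0, 1], "student-related": [2, 3]}
key_drivers = []
for name, cols in groups.items():
    _, pvals = fit_logit(X[:, cols], y)
    key_drivers += [c for c, p in zip(cols, pvals[1:]) if p < 0.05]

# Combine the surviving predictors into the final model of key drivers
b_final, _ = fit_logit(X[:, key_drivers], y)
```

In this sketch the two genuinely predictive columns survive the per-group significance screen and enter the combined model, mirroring the subset-then-combine logic described in Step 2.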

9 What distinguishes excellent from good to not-so-good modules?
According to 111,000+ students:
Good advice from teachers
Links well to professional practice
Links well to qualifications
Quality of teaching materials
Quality of tutors
Li, N., Marsh, V., Rienties, B., & Whitelock, D. (2017). Online learning experiences of new versus continuing learners: a large scale replication study. Assessment & Evaluation in Higher Education, 42(4). Impact factor: 1.243

10 Li, N., Marsh, V., Rienties, B., Whitelock, D. (2017)
Li, N., Marsh, V., Rienties, B., & Whitelock, D. (2017). Online learning experiences of new versus continuing learners: a large scale replication study. Assessment & Evaluation in Higher Education, 42(4). Impact factor: 1.243

11 How does student satisfaction relate to module performance?
[Chart: satisfaction vs. students who successfully completed the module]

12 Ullmann, T., Lay, S., Rienties, B. (2017)
Ullmann, T., Lay, S., & Rienties, B. (2017). Data wranglers’ key metric report. IET Data Wranglers, Open University.

13 Is satisfaction related to students’ behaviour and performance?
Learning design data (>300 modules mapped)
VLE data: >140 modules with individual data aggregated weekly; >37 modules with fine-grained individual data daily
Student feedback data (>140 modules)
Academic performance data (>140 modules)
Predictive analytics data (>40 modules)
Data sets merged and cleaned. 111,256 students undertook these modules; 5,131 students responded (28%, between 18–76%).
Rienties, B., & Toetenel, L. (2016). The impact of learning design on student behaviour, satisfaction and performance: a cross-institutional comparison across 151 modules. Computers in Human Behavior, 60.
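The merge-and-clean step above can be sketched with pandas. The keys (module code, student id) and every column name and value below are invented for illustration; the study's actual schemas are not documented here.

```python
import pandas as pd

# Three hypothetical sources, keyed the way the slide suggests:
# learning design per module, VLE activity per student-module, feedback per student.
design = pd.DataFrame({"module": ["A", "B", "C"],
                       "assimilative_pct": [60, 40, 55]})
vle = pd.DataFrame({"module": ["A", "A", "B", "C"],
                    "student": [1, 2, 3, 4],
                    "weekly_clicks": [120, 80, 200, 50]})
feedback = pd.DataFrame({"student": [1, 2, 3, 4],
                         "overall_satisfaction": [4, 5, 3, 4]})

# Merge on the shared keys, then drop incomplete rows as a simple "clean" step
merged = (vle.merge(design, on="module", how="left")
             .merge(feedback, on="student", how="left")
             .dropna())
print(len(merged))  # 4 complete student-module records
```

Left joins keep every VLE record while attaching the module- and student-level attributes, which is one natural way to line up per-module and per-student sources before modelling.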

14 Level of study predicts satisfaction
Learning design (finding information, productive, assessment) negatively predicts satisfaction
Assimilative learning design (benchmark) and interactive learning design positively predict satisfaction

15 Size of module and discipline predict completion
Satisfaction is unrelated to pass rates
Learning design (communication) predicts completion

16 Communication, Student Satisfaction, VLE Engagement, Student Retention
[Diagram: constructivist, assessment, productive and socio-constructivist learning designs; communication; VLE engagement from Week 1 to Week 30+; student satisfaction; student retention; 150+ modules]
Cluster analysis of 40 modules (>19k students) indicates that module teams design four different types of modules: constructivist, assessment-driven, balanced, or socio-constructivist. The LAK paper by Rienties and colleagues indicates that VLE engagement is higher in modules with socio-constructivist or balanced-variety learning designs, and lower for constructivist designs. In terms of learning outcomes, students rate constructivist modules higher, and socio-constructivist modules lower. However, in terms of student retention (% of students who passed), constructivist modules have lower retention, while socio-constructivist modules have higher. Thus, learning design strongly influences behaviour, experience and performance (and we believe we are the first to have mapped this with such a large cohort).
Rienties, B., & Toetenel, L. (2016). The impact of learning design on student behaviour, satisfaction and performance: a cross-institutional comparison across 151 modules. Computers in Human Behavior, 60.
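A minimal sketch of the kind of cluster analysis described, assuming modules are represented by the proportion of learning time allotted to each activity type and grouped with k-means. The cluster centres, activity columns and the choice of k-means are all illustrative assumptions, not the paper's documented method.

```python
import numpy as np
from sklearn.cluster import KMeans

rng = np.random.default_rng(0)

# Hypothetical module profiles: % of time on assimilative, assessment
# and communication activities, one row per module. Four seed centres
# loosely echo the four design types named on the slide.
centres = np.array([[70, 10, 5],    # "constructivist"-leaning
                    [30, 50, 5],    # "assessment-driven"
                    [40, 25, 20],   # "balanced"
                    [25, 15, 45]])  # "socio-constructivist"
modules = np.vstack([c + rng.normal(0, 3, size=(10, 3)) for c in centres])

# Partition the 40 synthetic modules into four design clusters
km = KMeans(n_clusters=4, n_init=10, random_state=0).fit(modules)
```

With well-separated profiles like these, each module lands in the cluster of its design type; real learning-design data would be noisier and would warrant checking cluster validity (e.g. silhouette scores) before naming the clusters.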

17 Conclusions (Part I)
Student satisfaction is important for enhancing teaching and learning practice, but has limited relation to learning outcomes
Learning design strongly influences student engagement, satisfaction and performance

18 Conclusions (Part II)
How to improve our understanding of students?
Talk to them (e.g., OU Live, discussion forum)
Ask for frequent feedback (e.g., online post box, discussion forum)
How to interpret student evaluation findings?
Use them as a developmental tool for your own teaching and learning
Ask what other teachers learned

19 Critical discussion of Student Evaluation scores and academic performance at the OU
@DrBartRienties

