Developing an early warning system combined with dynamic LMS data


Developing an early warning system combined with dynamic LMS data Danny Green and Le Van Anh RMIT University, Vietnam

Outline
Background and Rationale
The EWS
The General Statistical Model
The Pilot
Combining dynamic LMS data
All first-year courses
Conclusions
Future Research
01/06/2017 EWS in Vietnam

Background and Rationale
RMIT Vietnam is a unique context
Lack of institutional research in this area
The existing at-risk system has drawbacks:
based on fails vs. course load
referrals to support services happen too late in the semester
doesn't involve the teacher

What we wanted to know
What factors influenced achievement?
How accurate could the statistics be?
What % of low scores could we predict?
What impact would the model have on achievement?

The EWS process (Weeks 0–7)
Statistical model applied to current enrolments
Focus lists uploaded to the LMS
Focus lists distributed to relevant teaching teams
Students on focus lists contacted by their teacher and invited to a meeting
Teaching teams select meaningful triggers based on LMS behaviour, early assessment, feedback mechanisms or teacher observations
LMS configured to generate focus lists that combine the statistical prediction with dynamic data from the LMS and/or the classroom
Teaching teams meet with students and refer them to academic support services where appropriate
Teaching teams give feedback on the system

The General Model
n = 78,000; 20 variables
Multiple linear regression and Bayesian Model Averaging (BMA)
R-squared of 0.6815
88% of the students who achieved low scores (≤ 55) were predicted by the model's "focus list"
Key factors related to past academic performance: high school grade average, English level, the amount of credit points achieved in the program, and previous academic performance

The Pilot
Three first-year, high-enrolment, core business courses (n = 1009)
Researchers refined the model to be specifically relevant to the chosen courses
A training dataset was created (train:test ratio = 7:3) and BMA was applied to ascertain the most significant predictors of final grades
Multiple linear regression was then applied using these predictors to develop a predictive model
Multiple R-squared of 0.416 (r ≈ 0.65)
Predictors: average high school score, English proficiency, credit points achieved, current GPA, whether the student was repeating the course, the number of student advisement notes, and their at-risk status

Pilot Model
Volunteers from the teaching teams: 10 teachers
Requested more detailed feedback from teachers
Researched internal triggers and consulted to find good intervention moments
Used smart lists in Blackboard
Composed template emails designed as positive nudges (developed in consultation with Health and Wellbeing)
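A smart-list rule of the kind described combines the static statistical prediction with a dynamic LMS trigger. The sketch below is hypothetical: the field names and trigger conditions are invented for illustration and are not Blackboard's actual API.

```python
# Hedged sketch of a "smart list" escalation rule: a student is escalated
# when the statistical prediction AND a dynamic LMS trigger (e.g. no logins
# in the first two weeks, or a missed early assessment) both fire.
def on_focus_list(student):
    return student["predicted_score"] <= 55

def lms_trigger(student):
    return (student["logins_first_2_weeks"] == 0
            or student["missed_early_assessment"])

students = [
    {"id": "A", "predicted_score": 48, "logins_first_2_weeks": 0, "missed_early_assessment": False},
    {"id": "B", "predicted_score": 72, "logins_first_2_weeks": 5, "missed_early_assessment": True},
    {"id": "C", "predicted_score": 50, "logins_first_2_weeks": 4, "missed_early_assessment": False},
]

# Students who should receive a positive-nudge template email
escalate = [s["id"] for s in students if on_focus_list(s) and lms_trigger(s)]
print(escalate)
```

Requiring both signals keeps the list short and timely, which matches the deck's later finding that the model and triggers were most accurate in combination.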

LMS Triggers

Pilot Model Accuracy
63.22% positive predictions across the three courses
71% of all scores within a 10% error
Average points difference of 8.94
49 of the 110 fails were on the focus lists
Combined with data coming from the LMS, the prediction accuracy increased
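The three accuracy measures reported here can be computed as below. The (predicted, actual) pairs are invented for illustration; only the definitions of the metrics are taken from the slides.

```python
# Sketch of the pilot's accuracy measures on hypothetical score pairs.
pairs = [(50, 47), (54, 62), (45, 40), (52, 80), (55, 51)]

LOW = 55  # at-risk threshold used throughout the deck

# Positive predictive value: of students flagged (predicted <= 55),
# the share who really scored <= 55.
flagged = [(p, a) for p, a in pairs if p <= LOW]
ppv = sum(1 for _, a in flagged if a <= LOW) / len(flagged)

# Share of predictions within 10 points of the actual score
within_10 = sum(1 for p, a in pairs if abs(p - a) <= 10) / len(pairs)

# Average absolute points difference
avg_diff = sum(abs(p - a) for p, a in pairs) / len(pairs)

print(ppv, within_10, avg_diff)
```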

Model and LMS evaluation
For Business Management, the model alone predicted 37% to begin with, already better than the triggers on their own within the course. The real value came from combining the model with the triggers: accuracy rose from 37% to 56%, and higher still later in the semester. Trigger 2 was extremely accurate, at 70%.

Teacher feedback
Teacher feedback uptake: 0%

All first-year courses, S1 2017
n = 4329; focus list of 547 (reduced to 462)
93 teachers and 48 courses
Information sessions held with all relevant teaching teams
Successfully had feedback added as an academic KPI
Simplified feedback: 3 yes/no drop-downs
Internal validation: 71% good prediction, 7.77 average points difference, 73.5% positive predictive value

Actual first-year results
Average points difference: 7.78
Percentage of 'good prediction': 71%

Focus list records
51.63% positive predictive value
Of the 496 records on the focus lists, 142 did not successfully pass their course
97 records had a predicted score ≥ 55 and an actual score ≥ 55, and were not contacted, met or referred by the teacher
The focus list predicted 32% of actual fails across all first-year courses
51.63% of the predicted scores of 55 or less did score 55 or less

Results of students the model predicted would achieve ≤ 55
Of the 153 passes, 104 were 55 or less
Half of the HDs and one third of the DIs came from Business Statistics, traditionally one of our hardest courses, but one that has undergone dramatic change from being exam-based to group projects

Interventions

Predicted vs. actual results against intervention factors
Significant effects: met lecturer +3.13; lecturer referred −5.24; contacted and referred −5.24
73 records were referred to SAS; only 3 came

Teacher feedback
75.27% of teachers gave feedback at the end of the semester, but 183 records still had no feedback

Conclusions
The statistical model is already providing a powerful tool for teachers
It is only a tool, and needs to be combined with in-course data to be meaningful
Further research, modelling, and consultation and collaboration with stakeholders are necessary

The Future: Research and Data
Working with admissions to obtain missing data: geographical, family and socioeconomic status data
Work to achieve greater buy-in from faculty
Machine learning: neural networks on LMS behaviour
Permanent integration into the Student Record System
