Taking Learning Analytics Research to Scale: Lessons & Future Directions
John Whitmer, Ed.D. | Director, Analytics & Research
john.whitmer@blackboard.com | @johncwhitmer

Meta-questions driving our Learning Analytics research @ Blackboard
1. How is student/faculty use of Bb platforms (e.g. Learn, Collab, etc.) related to student achievement? [or satisfaction, or risk, or …]
2. Do these findings apply equally to students 'at promise' due to their academic achievement or background characteristics (e.g. race, class, family education, geography)?
3. What data elements, feature sets, and functionality can we create to integrate these findings into Bb products to help faculty improve student achievement?

Outline
- Campus Research
  - Intro to Religious Studies @ Chico State (single course)
  - Learning Analytics Interventions @ San Diego State (multiple courses)
- Blackboard Research
  - Relationship: LMS Use & Student Achievement (Grade)
  - Relationship: Tool Use in Courses & Student Achievement
  - Course Design Category Archetypes (publishing 10/27)
- Discussion

Campus Research Findings

Study 1: Understanding Low Outcomes in Redesigned Course (Chico State)

Study Overview
- Course redesigned for hybrid delivery in a year-long program
- 54 F grades
- Enrollment: 373 students (54% increase; largest section)
- Highest LMS usage on the entire campus, Fall 2010 (>250k hits)
- Bimodal outcomes: 10% increase in SLO mastery; 7% & 11% increase in DWF
- Why? Can't tell with aggregated reporting data

Grades Significantly Related to LMS Use (a proxy for effort?)
% of variance in course grade explained:
- Total hits: 23%
- Assessment activity hits: 22%
- Content activity hits: 17%
- Engagement activity hits: 16%
- Administrative activity hits: 12%
- Mean value, all significant variables: 18%
Course: "Introduction to Religious Studies," CSU Chico, Fall 2013 (n=373)
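The "% of variance explained" figures above are the kind of numbers produced by bivariate regressions of final grade on each LMS activity count. A minimal sketch of that calculation follows; the data are simulated and the column names are hypothetical, not the study's actual dataset or code.

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm

# Simulated course: one row per student, final grade plus LMS hit counts
# driven by an unobserved "effort" factor (illustrative only).
rng = np.random.default_rng(0)
n = 373
effort = rng.gamma(shape=2.0, scale=1.0, size=n)
df = pd.DataFrame({
    "total_hits": (effort * 600 + rng.normal(0, 150, n)).clip(0),
    "assessment_hits": (effort * 200 + rng.normal(0, 60, n)).clip(0),
    "content_hits": (effort * 250 + rng.normal(0, 120, n)).clip(0),
})
df["final_grade"] = (55 + 8 * effort + rng.normal(0, 10, n)).clip(0, 100)

# Bivariate OLS of grade on each activity variable; R^2 is the
# "% variance explained" figure reported for that variable.
for col in ["total_hits", "assessment_hits", "content_hits"]:
    model = sm.OLS(df["final_grade"], sm.add_constant(df[col])).fit()
    print(f"{col:>16}: R^2 = {model.rsquared:.1%}, p = {model.pvalues[col]:.3g}")
```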

LMS Activity Is a Better Predictor than ANY Demographic/Educational Variable (even HS GPA!)
% of variance in course grade explained:
- URM and Pell-eligibility interaction: 9%
- Under-Represented Minority (URM): 7%
- Enrollment status: 4%
- URM and gender interaction: 3%
- Pell eligible: 2%
- First in family to attend college: 1%
- Mean value, all significant variables
- Not statistically significant: gender, Major-College

At-Promise Students (Race & Class): “Over-working gap”

Study 2: Creating Accurate Learning Analytics Triggers & Effective Interventions (SDSU)

Study Overview
- President-level initiative
- Goal: identify effective interventions driven by Learning Analytics "triggers"
- Multiple "triggers" (e.g., LMS access, grade, online homework/quiz, clicker use)
- At scale & over time: conducted for 5 semesters, 9 courses, ~10,000 students
- "Gold standard" experimental design (control / treatment)
- Study began with two courses (N ≈ 2,000) in Spring 2014
- Weekly reports; triggered students sent email and multimedia "interventions" (low/high intensity)
- Hypothesis: as the number of trigger events increases, so does the likelihood of having to repeat the course

Study Protocol
1. Identify courses and recruit instructors.
2. Prior to course start, review the syllabus and schedule meaningful "triggers" for each course (e.g., attendance, graded items, Blackboard use).
3. Run reports in Blackboard and the online homework/quiz software to identify students with low activity or performance (roughly weekly).
4. Send "flagged" students in the experimental group a notification/intervention.
5. Aggregate the data, add demographic data, and analyze.
Weekly reports; triggered students were sent email "interventions" (low intensity). Attendance (did people show up?) was captured via clickers (participation/attendance points).
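The weekly flagging in steps 3 and 4 can be pictured with a minimal sketch. The column names, thresholds, and notification logic below are illustrative assumptions, not the SDSU study's actual rules.

```python
import pandas as pd

# Illustrative weekly snapshot: one row per student in a course.
students = pd.DataFrame({
    "student_id": [1, 2, 3, 4],
    "group": ["treatment", "control", "treatment", "treatment"],
    "lms_logins_past_week": [0, 1, 5, 2],
    "homework_pct": [40, 55, 90, 35],
    "clicker_attendance_pct": [50, 80, 95, 20],
})

# Hypothetical course-specific trigger thresholds (in the study these were
# scheduled with each instructor before the course started).
TRIGGERS = {
    "lms_logins_past_week": 1,     # fewer than 1 login in the past week
    "homework_pct": 60,            # homework average below 60%
    "clicker_attendance_pct": 70,  # clicker attendance below 70%
}

def trigger_count(row) -> int:
    """Count how many trigger conditions a student currently meets."""
    return sum(row[col] < threshold for col, threshold in TRIGGERS.items())

students["n_triggers"] = students.apply(trigger_count, axis=1)

# Only students in the experimental (treatment) group receive the intervention.
flagged = students[(students["n_triggers"] > 0) & (students["group"] == "treatment")]
for _, s in flagged.iterrows():
    print(f"Notify student {s.student_id}: {s.n_triggers} trigger(s) this week")
```

Because thresholds were negotiated per course, a real implementation would load a per-course trigger configuration rather than hard-coding it as above.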

Frequency of Interventions (Spring 2015)
Talking points:
- Almost three-quarters of students got at least one trigger in each course.
- More PSY students got interventions than Stat students (because they were not completing homework).
- The pattern of the number of interventions in both courses is about the same: high up to 2-3 interventions, then it trails off.
- Interesting findings when you consider that the triggers were very different between courses (e.g., PSY had only 2 graded items; PSY used online homework, Stat used online quizzes).

Poll question: Did triggers predict achievement? At what level of significance? How much variation in student grade was explained?
A. Not significant
B. <10%, significant at the .05 level
C. 20%, significant at the .01 level
D. 30%, significant at the .001 level
E. 50%+, significant at the .0001 level

Answer (Spring 2014, Fall 2014): E. Triggers explained 50%+ of the variation in student grade, significant at the .0001 level.

Spring 2015: Greater Variety of Courses, Greater Range in Results

Behavioral Data Predictions Overtake Demographic Data in Week 2

Blackboard Research Findings

Blackboard's Learning Data Footprint (2015 figures)
- 1.6M unique courses
- 40M course content items
- 4M unique students
- 775M LMS sessions
- Blackboard Learn = ¼ of total data
Blackboard is the largest and most experienced education technology company in the world, a trusted partner to top institutions, with thousands of customers and users across all markets around the world.

Commitment to Privacy & Openness
- Analyze data records that are not only stripped of PII but de-personalized (at both the individual & institutional levels)
- Respect territorial jurisdictions and safe-harbor provisions
- Share results and openly discuss analysis procedures to inform the broader educational community
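As one way to picture de-personalization at the individual and institutional levels, identifiers can be replaced with keyed one-way hashes before analysis. This is a minimal sketch of the idea only, not Blackboard's actual de-identification process.

```python
import hashlib
import hmac

# Secret salt held outside the analysis dataset; without it the hashed IDs
# cannot be reversed or linked back to people or institutions.
SALT = b"rotate-this-secret-per-analysis"

def depersonalize(identifier: str) -> str:
    """Replace a student or institution identifier with a keyed one-way hash."""
    return hmac.new(SALT, identifier.encode("utf-8"), hashlib.sha256).hexdigest()[:16]

record = {"student_id": "stu-12345", "institution_id": "inst-987", "minutes_online": 42}
clean = {
    "student_id": depersonalize(record["student_id"]),
    "institution_id": depersonalize(record["institution_id"]),
    "minutes_online": record["minutes_online"],  # behavioral measure kept as-is
}
print(clean)
```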

Study 1: Relationship between Student Time & Grade
- LMS data is messy and lossy.
- Adoption is hugely varied in terms of the tools used.
- More important, the learning materials/activities within those tools are tremendously varied.

Findings: Relationship between Time in LMS & Grade
- 1.2M students, 34,519 courses, 788 institutions
- Overall effect size < 1%

But a strong effect in some courses (n = 7,648 courses, 22%)
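The contrast between a near-zero pooled effect and strong effects in a subset of courses is what emerges when the time-grade relationship is computed per course rather than across the pooled data. A minimal sketch on simulated data (not Blackboard's pipeline) of that per-course analysis:

```python
import numpy as np
import pandas as pd

rng = np.random.default_rng(1)
rows = []
for course_id in range(200):
    slope = rng.normal(0, 6)       # courses differ in how much time "matters"
    base = rng.uniform(60, 85)     # and in their average grade
    minutes = rng.gamma(2.0, 120, size=60)
    grade = (base + slope * (minutes - minutes.mean()) / minutes.std()
             + rng.normal(0, 8, size=60)).clip(0, 100)
    rows.append(pd.DataFrame({"course_id": course_id, "minutes": minutes, "grade": grade}))
df = pd.concat(rows, ignore_index=True)

# Pooled correlation across all students: course-to-course differences wash out the signal.
pooled_r = df["minutes"].corr(df["grade"])

# Per-course correlation: a sizeable share of courses show a substantial relationship.
per_course_r = df.groupby("course_id").apply(lambda g: g["minutes"].corr(g["grade"]))
strong_share = (per_course_r.abs() > 0.3).mean()

print(f"pooled r^2 = {pooled_r**2:.1%}")
print(f"share of courses with |r| > 0.3: {strong_share:.0%}")
```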

Study 2: Tool Use & Student Grade
Goal: determine the most important tools in Bb Learn by observing:
- Tools that are used the most (in minutes, for instance)
- Tools that have the strongest relationship with final grade
- Tools that are 'underused' the most (by learners & instructors), i.e., tools with the greatest potential to improve learning outcomes
This allows us to see which tools educate students, and are therefore useful, and to reinforce the educational impact of the Blackboard Learn platform.

Data Filtering
- Class size: between 10 and 500 students
- Activity rates: over 1 hour online as a course average
- Grade distribution: average grade between 40% & 95%
These filters decreased the dataset from 3.37 million users in 70,000 courses from 927 institutions to 601,544 users (17% of total) in 18,810 courses (26.8% of total) from 663 institutions (71.5% of total).
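The three filters translate directly into a query over a course-level table. The sketch below only restates the criteria from the slide; the field names are assumptions.

```python
import pandas as pd

def filter_courses(courses: pd.DataFrame) -> pd.DataFrame:
    """Apply the three course-level filters described on the slide.

    Expects one row per course with (hypothetical) columns:
      n_students, avg_hours_online, avg_grade_pct
    """
    mask = (
        courses["n_students"].between(10, 500)        # class size: 10-500 students
        & (courses["avg_hours_online"] > 1)           # activity: > 1 hour online, course average
        & courses["avg_grade_pct"].between(40, 95)    # grade distribution: average grade 40-95%
    )
    return courses[mask]

# Example usage with a toy course table; only course "B" survives all three filters.
courses = pd.DataFrame({
    "course_id": ["A", "B", "C"],
    "n_students": [8, 120, 300],
    "avg_hours_online": [0.4, 3.2, 5.0],
    "avg_grade_pct": [88, 72, 97],
})
print(filter_courses(courses))
```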

Finding: Tool Use & Grade
Tool use and final grade do not have a linear relationship; there is a diminishing marginal effect of tool use on final grade.
Interpretations:
- Students absent from course activity are at greatest risk of low achievement.
- The first time you read/see a PowerPoint presentation, you learn a lot, but the second time you read/see it, you learn less.
- Getting from a 90% to a 95% requires more effort than getting from a 60% to a 65%.

Finding: Tool Use & Grade (log scale): a log transformation of tool use shows a stronger trend.
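A minimal sketch, on simulated data, of why the log transformation shows a stronger trend: when grade rises with the log of activity (diminishing returns), a model fit on log-transformed minutes explains more variance than one fit on raw minutes. This illustrates the pattern described above, not Blackboard's actual analysis.

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(2)
minutes = rng.gamma(2.0, 150, size=5000)
log_minutes = np.log1p(minutes)

# Simulated diminishing marginal effect: grade tracks the (standardized) log of activity.
z = (log_minutes - log_minutes.mean()) / log_minutes.std()
grade = np.clip(70 + 8 * z + rng.normal(0, 8, size=minutes.size), 0, 100)

linear = sm.OLS(grade, sm.add_constant(minutes)).fit()      # grade ~ raw minutes
logged = sm.OLS(grade, sm.add_constant(log_minutes)).fit()  # grade ~ log(minutes + 1)

print(f"R^2, raw minutes:    {linear.rsquared:.2f}")
print(f"R^2, log(minutes+1): {logged.rsquared:.2f}")
```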

Investigating Achievement by Specific Tools Used
Analysis steps:
1. Identify the most frequently used tools.
2. Separate tool use into no use + quartiles of use.
3. Divide students into 3 groups by course grade: High (80+), Passing (60-79), Low/Failing (0-59).
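A minimal sketch of the binning in steps 2 and 3, splitting tool use into "no use" plus quartiles and grades into the three bands. The column names and data are illustrative assumptions, not the study's dataset.

```python
import numpy as np
import pandas as pd

rng = np.random.default_rng(3)
df = pd.DataFrame({
    # hypothetical minutes in one tool; about a quarter of students never use it
    "mygrades_minutes": np.where(rng.random(1000) < 0.25, 0, rng.gamma(2.0, 30, 1000)),
    "final_grade": rng.uniform(0, 100, 1000),
})

# Grade bands from the slide: High (80+), Passing (60-79), Low/Failing (0-59).
df["grade_group"] = pd.cut(df["final_grade"],
                           bins=[0, 60, 80, 101],
                           labels=["Low/Failing", "Passing", "High"],
                           right=False)

# Tool use: "no use" kept as its own category, users split into quartiles of use.
used = df["mygrades_minutes"] > 0
df["use_bin"] = "no use"
df.loc[used, "use_bin"] = pd.qcut(df.loc[used, "mygrades_minutes"], 4,
                                  labels=["Q1", "Q2", "Q3", "Q4"]).astype(str)

# Share of students reaching each grade band within each usage bin.
print(pd.crosstab(df["use_bin"], df["grade_group"], normalize="index").round(2))
```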

Finding: MyGrades. At every level, the probability of a higher grade increases with increased use. Causal? Probably not. A good indicator? Absolutely.

Finding: Course Contents. More is not always better: there is a large jump from no use to some use, then no relationship.

Finding: Assessments/Assignments. Students above the mean level of use have a lower likelihood of achieving a high grade than students below the mean.

Next Steps & Discussion

Questions? John Whitmer, Ed.D. | john.whitmer@blackboard.com | @johncwhitmer