
1 Taking Learning Analytics Research to Scale: Lessons & Future Directions
John Whitmer, Ed.D. Director, Analytics & Research

2 Meta-questions driving our Learning Analytics research @ Blackboard
1. How is student/faculty use of Bb platforms (e.g., Learn, Collab) related to student achievement? [or satisfaction, or risk, or ...]
2. Do these findings apply equally to students 'at promise' due to their academic achievement or background characteristics (e.g., race, class, family education, geography)?
3. What data elements, feature sets, and functionality can we create to integrate these findings into Bb products to help faculty improve student achievement?

3 Outline
Campus Research
- Intro to Religious Studies @ Chico State (single course)
- Learning Analytics @ San Diego State (multiple courses)
Blackboard Research
- Relationship of LMS use & student achievement (grade)
- Relationship of tool use in courses & student achievement
- Course design category archetypes (publishing 10/27)
Discussion

4 Campus Research Findings

5 Study 1: Understanding Low Outcomes in Redesigned Course (Chico State)

6 Study Overview
- Course redesigned for hybrid delivery in a year-long program
- Enrollment: 373 students (54% increase; largest section)
- Highest LMS usage on the entire campus, Fall 2010 (>250k hits)
- Bimodal outcomes: 10% increase in SLO mastery, but 7% & 11% increases in DWF (54 F's)
Why? Can't tell with aggregated reporting data.

7 Grades Significantly Related to LMS Use (Proxy for Effort?)

Variable — % of grade variance explained:
- Total hits: 23%
- Assessment activity hits: 22%
- Content activity hits: 17%
- Engagement activity hits: 16%
- Administrative activity hits: 12%
- Mean of all significant variables: 18%

Course: "Introduction to Religious Studies," CSU Chico, Fall 2013 (n=373)
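A minimal sketch of how a "% variance explained" figure like those above can be computed: the R² of a simple least-squares regression of final grade on an activity count. The data here are invented for illustration; this is not the study's actual model or dataset.

```python
# Illustrative sketch (not the study's actual pipeline): estimate
# "% variance explained" as R^2 from a simple linear regression of
# final grade on an activity count.
from statistics import mean

def r_squared(x, y):
    """R^2 of the least-squares line predicting y from x."""
    mx, my = mean(x), mean(y)
    sxy = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sxx = sum((a - mx) ** 2 for a in x)
    syy = sum((b - my) ** 2 for b in y)
    return (sxy * sxy) / (sxx * syy)

# Hypothetical students: (total LMS hits, final grade on a 0-100 scale)
hits   = [120, 340, 80, 500, 260, 40, 610, 300]
grades = [62, 78, 55, 90, 74, 48, 93, 80]
print(f"Total hits explain {r_squared(hits, grades):.0%} of grade variance")
```

For simple regression with one predictor, R² equals the squared Pearson correlation, which is why a single activity variable can be reported as a standalone "% variance" figure.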

8 LMS Activity a Better Predictor than ANY Demographic/Educational Variable (even HS GPA!)

% of grade variance explained:
- 9%: URM and Pell-eligibility interaction
- 7%: under-represented minority
- 4%: enrollment status
- 3%: URM and gender interaction
- 2%: Pell eligible
- 1%: first in family to attend college
- Mean value, all significant variables
Not statistically significant: gender, major/college

9 At-Promise Students (Race & Class): “Over-working gap”

10 Study 2: Creating Accurate Learning Analytics Triggers & Effective Interventions (SDSU)

11 Study Overview
President-level initiative
Goal: identify effective interventions driven by Learning Analytics "triggers"
- Multiple "triggers" (e.g., LMS access, grade, online homework/quiz, clicker use)
- At scale & over time: conducted for 5 semesters, 9 courses, ~10,000 students
- "Gold standard" experimental design (control/treatment)
- Study began with two courses (N≈2,000) in Spring 2014
- Weekly reports; triggered students sent multimedia "interventions" (low/high intensity)
Hypothesis: as the number of trigger events increases, so does the likelihood of having to repeat the course.

12 Study Protocol
1. Identify courses and recruit instructors
2. Prior to course start, review the syllabus and schedule meaningful "triggers" for each course (e.g., attendance, graded items, Blackboard use)
3. Run reports in Blackboard and the online homework/quiz software to identify students with low activity or performance (~weekly)
4. Send "flagged" students in the experimental group a notification/intervention
5. Aggregate data, add demographic data, analyze
Notes: weekly reports; triggered students sent low-intensity "interventions"; attendance measured via clickers (participation/attendance points).
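The weekly flagging step above might be sketched as follows. The threshold, field names, and `flag_students` helper are assumptions for illustration, not SDSU's actual trigger definitions; the real reports combined several trigger types.

```python
# Sketch of one weekly trigger pass: flag treatment-group students
# whose activity count fell below a course-specific threshold.
def flag_students(activity, threshold, treatment_group):
    """Return IDs of treatment-group students below the trigger threshold."""
    return [sid for sid, count in activity.items()
            if count < threshold and sid in treatment_group]

# Hypothetical weekly report: student id -> Blackboard logins that week
weekly_logins = {"s01": 0, "s02": 5, "s03": 1, "s04": 7}
treatment = {"s01", "s02", "s03"}   # control group is observed but never emailed
flagged = flag_students(weekly_logins, threshold=2, treatment_group=treatment)
print(flagged)  # s01 and s03 are below threshold and in the treatment group
```

Keeping the control group out of the notification list, while still logging its triggers, is what makes the control/treatment comparison described above possible.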

13 Frequency of interventions (Spring 2015)
Talking points:
- Almost 3/4 of students got at least one trigger in each course
- More PSY students got interventions than Stat students (because they weren't completing homework)
- The pattern in the number of interventions is about the same in both courses: high up to 2-3, then it trails off
- Interesting, considering the triggers were very different between courses (e.g., PSY had only 2 graded items; PSY used online homework, Stat used online quizzes, etc.)

14 Poll question
Did triggers predict achievement? At what level of significance? How much variation in student grade was explained?
A. Not significant
B. <10%, significant at the .05 level
C. 20%, significant at the .01 level
D. 30%, significant at the .001 level
E. 50%+, significant

15 Poll question
Did triggers predict achievement? At what level of significance? How much variation in student grade was explained?
A. Not significant
B. <10%, significant at the .05 level
C. 20%, significant at the .01 level
D. 30%, significant at the .001 level
E. 50%+, significant
(Spring 2014, Fall 2014)

16 Spring 2015: Greater Variety of Courses, Greater Range in Results

17 Behavioral Data Predictions Overtake Demographic Data in Week 2

18 Blackboard Research Findings

19 Blackboard's Learning Data Footprint (2015 numbers)
- 1.6M unique courses
- 40M course content items
- 4M unique students
- 775M LMS sessions
- Blackboard Learn ≈ 1/4 of total data
Blackboard is the largest and most experienced education technology company in the world: a trusted partner to top institutions, with thousands of customers and users across all markets around the world.

20 Commitment to Privacy & Openness
Analyze data records that are not only stripped of PII but de-personalized (at both the individual and institutional levels)
Respect territorial jurisdictions and safe-harbor provisions
Share results, and openly discuss analysis procedures, to inform the broader educational community
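One common de-personalization step is replacing student identifiers with salted one-way hashes before analysis, so records can still be joined without exposing who they belong to. A minimal sketch, illustrative only and not Blackboard's documented procedure; the salt value and field names are invented:

```python
# Pseudonymize student IDs with a salted SHA-256 hash before analysis.
# The same ID always maps to the same token, so joins still work,
# but the token cannot be reversed to recover the ID.
import hashlib

SALT = b"rotate-me-per-export"   # hypothetical per-export secret

def pseudonymize(student_id: str) -> str:
    return hashlib.sha256(SALT + student_id.encode()).hexdigest()[:16]

record = {"student_id": "900123456", "hits": 482, "grade": 87}
safe = {**record, "student_id": pseudonymize(record["student_id"])}
print(safe)
```

Rotating the salt per export prevents linking students across unrelated datasets, which is part of what distinguishes de-personalization from merely removing names.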

21 Study 1: Relationship of Student Time & Grade
LMS data is messy and lossy. Adoption varies hugely in terms of which tools are used, and, more importantly, the learning materials/activities within those tools vary tremendously.

22 Findings: Relationship of Time in LMS & Grade
- 1.2M students
- 34,519 courses
- 788 institutions
- Overall effect size < 1%

23 But strong effect in some courses (n=7,648, 22%)

24 Study 2: Tool Use & Student Grade
To determine the most important tools in Bb Learn, by observing:
- Tools that are used the most (in minutes, for instance)
- Tools that have the strongest relationship with final grade
- Tools that are most 'underused' (by learners & instructors), i.e., those with the greatest potential to improve learning outcomes
This lets us see which tools help students learn (and are therefore useful), and reinforces the educational impact of the Blackboard Learn platform.

25 Data Filtering
- Class size: between 10 and 500 students
- Activity rate: over 1 hour online as a course average
- Grade distribution: average grade between 40% & 95%
Filters decreased the sample from 3.37 million users in 70,000 courses from 927 institutions to 601,544 users (17% of total) in 18,810 courses (26.8% of total) from 663 institutions (71.5% of total).
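The three filters above can be sketched as a single predicate over per-course summaries. The dict structure and field names are assumptions for illustration; the real analysis ran over Blackboard's data stores, not a Python list.

```python
# Apply the three course-level filters: class size 10-500, more than
# 1 hour online as a course average, and average grade between 40% and 95%.
def keep(course):
    return (10 <= course["enrollment"] <= 500
            and course["avg_hours_online"] > 1.0
            and 40.0 <= course["avg_grade"] <= 95.0)

courses = [
    {"id": "BIO101", "enrollment": 240, "avg_hours_online": 6.2, "avg_grade": 78.0},
    {"id": "IND001", "enrollment": 4,   "avg_hours_online": 9.0, "avg_grade": 88.0},  # too small
    {"id": "SHELL1", "enrollment": 90,  "avg_hours_online": 0.2, "avg_grade": 91.0},  # unused shell
    {"id": "PASS01", "enrollment": 60,  "avg_hours_online": 3.1, "avg_grade": 98.0},  # grades too compressed
]
analyzable = [c for c in courses if keep(c)]
print([c["id"] for c in analyzable])
```

Each filter targets a different noise source: tiny or huge sections, empty course shells, and grading schemes (e.g., near-uniform high grades) where final grade carries little information.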

26 Finding: Tool Use & Grade
Tool use and final grade do not have a linear relationship; there is a diminishing marginal effect of tool use on final grade.
Interpretations:
- Students absent from course activity are at greatest risk of low achievement.
- The first time you read/see a PowerPoint presentation you learn a lot; the second time, you learn less.
- Getting from 90% to 95% requires more effort than getting from 60% to 65%.
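This diminishing-returns pattern is the motivation for the log transformation on the next slide: on a log scale, equal *ratios* of tool use map to equal steps, so a straight-line fit can capture the curved raw relationship. A minimal sketch using `log1p` (i.e., log(1 + x), so zero-activity students remain defined):

```python
# Compare raw tool-use minutes with their log1p transform.
import math

minutes = [0, 10, 100, 1000]
logged = [math.log1p(m) for m in minutes]
# Raw gaps explode (10, 90, 900 minutes) while the logged gaps stay
# comparable, which is why a linear model fits better on the log scale.
print(list(zip(minutes, [round(v, 2) for v in logged])))
```

The choice of `log1p` over plain `log` is deliberate: students with zero recorded activity, the highest-risk group in the finding above, would otherwise be dropped as undefined.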

27 Log transformation shows stronger trend

28 Investigating Achievement by Specific Tools Used
Analysis steps:
1. Identify the most frequently used tools
2. Separate tool use into "no use" + quartiles
3. Divide students into 3 groups by course grade: High (80+), Passing (60-79), Low/Failing (0-59)
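The grouping steps above can be sketched as follows. Using `statistics.quantiles` for the quartile cut points is an assumption about method; the grade bands are the slide's own.

```python
# Bucket students into "no use" + usage quartiles, and into the
# slide's three grade groups: High (80+), Passing (60-79), Low/Failing (<60).
from statistics import quantiles

def use_bucket(minutes, cuts):
    if minutes == 0:
        return "no use"
    return f"Q{sum(minutes > c for c in cuts) + 1}"   # Q1..Q4

def grade_group(grade):
    return "High" if grade >= 80 else "Passing" if grade >= 60 else "Low/Failing"

users_minutes = [5, 12, 30, 45, 60, 90, 120, 300]   # nonzero users only
cuts = quantiles(users_minutes, n=4)                 # three quartile cut points
print(use_bucket(0, cuts), use_bucket(200, cuts), grade_group(73))
```

Computing the quartile cuts over nonzero users only mirrors the "no use + quartiles" split: the zero-activity students form their own category rather than distorting the quartile boundaries.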

29 Finding: MyGrades
At every level, the probability of a higher grade increases with increased use. Causal? Probably not. A good indicator? Absolutely.

30 Finding: Course contents
More is not always better: a large jump from none to some use, then no relationship.

31 Finding: Assessments/Assignments
Students above the mean have a lower likelihood of achieving a high grade than students below the mean.

32 Next Steps & Discussion

33 Questions?
John Whitmer, Ed.D.
john.whitmer@blackboard.com
@johncwhitmer

