1
Using Analytics to Intervene with Underperforming College Students
Kimberly Arnold (Purdue University) John Fritz (University of Maryland, Baltimore County) Eric Kunnen (Grand Rapids Community College) January 20, 2010
2
OVERVIEW Analytics 101 Five Minutes of Fame
Purdue University’s “Signals” UMBC’s “Check My Activity” (CMA) GRCC’s & Seneca College’s Project ASTRO More Demos (time permitting) Q&A
3
ANALYTICS 101
4
WHAT IF . . . Can the performance and/or backgrounds of past students predict the success of future students? How would we know? If so, how would we communicate (intervene?) with students? With teachers? How would this change teaching & learning?
5
CLASSROOM WALLS THAT TALK
Course or Learning Management Systems are NOT just content delivery or interactive learning environments. The record or “residue” of online learning is a potentially rich data source that needs to be studied further. How are schools thinking about this?
6
Five Stages of Analytics on Campus
1. Extraction and reporting of transaction-level data
2. Analysis and monitoring of operational performance
3. What-if decision support (such as scenario building)
4. Predictive modeling and simulation
5. Automatic triggers of business processes (such as alerts)
Source: ECAR Study, "Academic Analytics: The Uses of Management Information and Technology in Higher Education" (2005).
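Stage 5, automatic triggers, can be illustrated with a minimal sketch. The thresholds, field names, and roster data below are hypothetical assumptions for illustration, not taken from any of the systems presented:

```python
def check_alerts(students, login_threshold=3, grade_threshold=70.0):
    """Return alert messages for students below activity or grade thresholds."""
    alerts = []
    for s in students:
        # Hypothetical rule 1: flag low weekly CMS activity.
        if s["weekly_logins"] < login_threshold:
            alerts.append(f"{s['name']}: only {s['weekly_logins']} login(s) this week")
        # Hypothetical rule 2: flag a sagging average grade.
        if s["avg_grade"] < grade_threshold:
            alerts.append(f"{s['name']}: average grade {s['avg_grade']:.1f} is below {grade_threshold:.1f}")
    return alerts

roster = [
    {"name": "Ada", "weekly_logins": 1, "avg_grade": 82.0},
    {"name": "Ben", "weekly_logins": 5, "avg_grade": 64.5},
]
for msg in check_alerts(roster):
    print(msg)
```

A real stage-5 system would feed such alerts into a business process (email, advisor queue) rather than printing them, but the trigger logic is the same shape.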
7
FIVE MINUTES OF FAME
8
PURDUE’S “SIGNALS”
10
Specific and customizable interventions
SIS data (historic) + CMS and other technologies (real time) + other data = prediction.
SIS data = admissions data such as HS GPA, HS rank, highest science course taken in HS, and SAT scores.
CMS data = time spent in content files, time in discussion boards, time spent on practice quizzes, etc., plus other technologies such as level of engagement in Hotseat, online tutoring sessions, blogs, wikis, etc.
Other data = help-seeking behavior (office hours, help centers), work-study status, on campus/off campus.
By combining all the data and building various predictive models, we can customize the algorithms:
Institution level: PU students are different from UMBC students, UMBC students are different from GRCC students, etc.
College level: ENGR students differ from LA students, so (a) the data points are different and (b) different data elements are more predictive for certain colleges/programs.
Course level: each course has different characteristics (some for majors, some survey level, instructor differences with TAs, etc.) and a different level of "success" as defined by the instructor.
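A toy illustration of the combining step, not Purdue's actual model: the feature names, the weights, and the simple linear form are all assumptions made for this sketch, standing in for whatever predictive model an institution fits to its own data.

```python
def risk_score(student, weights=None):
    """Combine normalized (0-1) features into a 0-1 risk score; higher = more at risk.

    Feature names and weights are invented for illustration; a real model would
    be fit per institution, college, and course, as described in the notes.
    """
    weights = weights or {
        "hs_gpa": -0.4,        # SIS (historic): admissions data
        "content_hours": -0.3, # CMS (real time): time in content files
        "quiz_attempts": -0.2, # CMS (real time): practice quiz use
        "help_visits": -0.1,   # other data: help-seeking behavior
    }
    score = 1.0  # start at maximum risk; each engagement signal reduces it
    for feature, w in weights.items():
        score += w * student.get(feature, 0.0)
    return max(0.0, min(1.0, score))

engaged = {"hs_gpa": 0.9, "content_hours": 0.5, "quiz_attempts": 0.5, "help_visits": 0.0}
print(risk_score(engaged))
```

Customizing the algorithm per institution, college, or course then amounts to swapping in a different `weights` mapping (or a different model entirely).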
11
INTERVENTIONS Customizable Real-time Specific Actionable
Customizable: each instructor creates their own intervention based on the multiple data points.
Real-time: interventions can be updated instantly.
Specific: faculty AND students know exactly where they should focus (e.g., "You are not using the practice quizzes as much as your peers, and you are not going to online tutoring as recommended. Please try; we want you to be successful.")
Actionable: tell the students EXACTLY what to do.
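A "specific and actionable" nudge of the kind quoted above could be generated along these lines; the function name, message wording, and comparison rule are invented for illustration:

```python
def intervention_message(name, student_uses, peer_avg, tool="practice quizzes"):
    """Return a specific, actionable nudge when a student's tool use lags the class.

    A hypothetical sketch: real systems would let each instructor customize the
    rule and the wording per course.
    """
    if student_uses >= peer_avg:
        return None  # on track: no intervention needed
    return (f"{name}: you have used the {tool} {student_uses} times vs. a class "
            f"average of {peer_avg}. We want you to be successful; please try them "
            f"before the next assignment.")

print(intervention_message("Ada", 2, 10))
```

The point of the sketch is the contrast with a generic warning: the message names the specific behavior, the peer benchmark, and the concrete action to take.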
12
FIVE MINUTES OF FAME: RESULTS More Bs and Cs, fewer Ds and Fs
Students are getting more help, earlier and more frequently.
Faculty like that they can give feedback in large courses (150-1,200 students).
Students: direct contact with faculty; motivation; 60% say they got a better grade.
Academic analytics requires patience! We have been doing this research for 7 years. When you are wrangling all these data sources, some dynamic, some static, it takes time! And we have to be mindful of privacy issues as well.
13
UMBC’S “CHECK MY ACTIVITY” (CMA)
17
CURRENT VERSION
19
FUTURE VERSION
20
UMBC BLACKBOARD ACTIVITY BY GRADE DISTRIBUTION (2007-2009)
SEMESTER   COURSES   D/F Avg   >=C Avg   % Diff
FA2009        29      189.50    302.33    37.32
SU2009         9      212.00    275.67    23.10
SP2009        11       92.50    175.00    47.14
FA2008        13      101.50    166.33    38.98
SU2008         7       65.00    167.33    61.16
SP2008        26      136.00    199.67    31.89
FA2007        15      135.00    211.33    36.12
TOTAL        110      112.29    213.95    39.39
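The "% Diff" column is the percentage by which the D/F group's average Blackboard activity trails the C-or-better group's; for example, the FA2009 row gives (302.33 - 189.50) / 302.33 x 100, which is approximately 37.32. A small check of that arithmetic:

```python
def pct_diff(df_avg, c_avg):
    """Percent by which D/F students' average activity trails the >=C group's."""
    return round((c_avg - df_avg) / c_avg * 100, 2)

print(pct_diff(189.50, 302.33))  # FA2009 row: 37.32
print(pct_diff(92.50, 175.00))   # SP2009 row: 47.14
```

(Some rows in the table, such as SU2008, differ in the last decimal place from this formula, presumably because the published averages were themselves rounded.)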
21
WHAT DOES THIS LOOK LIKE?
22
FA2008 SCI100 Findings
How would you describe the CMA's view of your Bb activity compared to your peers?
28% "I was surprised by what it showed me"
12% "It confirmed what I already knew"
42% "I'd have to use it more to determine its usefulness"
16% "I haven't used it."
2% did not respond to this question
23
FA2008 SCI100 Findings
If your instructor published a GDR for past assignments, would you be more or less inclined to use the CMA before future assignments are due?
54% "More inclined"
10% "Less inclined"
36% "Not sure"
24
FA2009 STUDENT USE
25
GRADE DISTRIBUTION Part of our public Blackboard Reports, run after the last day of classes every semester. The final GDR is run after final grades are submitted. Faculty "opt in" by including a final letter grade in their Bb grade book with the column heading "GRADE."
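A sketch of how a grade distribution report might be computed from an exported grade book that uses the "GRADE" column described above; the row format and percentage output are assumptions for illustration, not UMBC's actual report code:

```python
from collections import Counter

def grade_distribution(gradebook_rows):
    """Percent of students at each letter grade, from rows with a 'GRADE' column.

    Rows without a GRADE value (non-opted-in or incomplete) are skipped.
    """
    counts = Counter(row["GRADE"] for row in gradebook_rows if row.get("GRADE"))
    total = sum(counts.values())
    return {grade: round(100 * n / total, 1) for grade, n in sorted(counts.items())}

roster = [{"GRADE": g} for g in ["A", "A", "A", "B", "B", "C", "D", "F"]]
print(grade_distribution(roster))
```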
26
"Project ASTRO" Blackboard Greenhouse Grant
Eric Kunnen Coordinator of Instructional Technologies Grand Rapids Community College Santo Nucifora Manager of Systems Development and Innovation Seneca College
27
Overview: Evaluating Blackboard Use on Your Campus, a review of the "Project ASTRO" Greenhouse Grant.
Session description: Collecting and reporting on system activity information from Blackboard is often a challenge. Come learn how to easily access reports on how Blackboard is being used by faculty, staff, and students. These reports will help you inform stakeholders, improve end-user engagement, increase adoption, and encourage deeper use of Blackboard.
Key functions of Project ASTRO:
Tracking: automatic tracking (via a Building Block) of courses, organizations, users, and tools.
Reporting: easy point-and-click access to advanced reports.
Discovery: ability to measure trends and analyze usage.
Sharing: inform key stakeholders of usage levels.
Acting: identify, target, and engage users using reports.
1) The big idea: evaluating and measuring the use of Blackboard on your campus is becoming more and more important, given budget cuts and the ongoing need to transform education by leveraging technology and its tools.
2) Focus of the presentation: a review of Project ASTRO and its capabilities.
3) Themes: tracking, reporting, discovery, sharing, acting.
Questions these reports can answer:
How many faculty and students are using Bb?
What tools are used in our system?
How can we improve and promote new tools?
Who do we need to communicate with for the next upgrade?
Who are our innovative faculty?
How does faculty use compare to student use?
Which departments are using Bb, and to what degree?
How can we use these data to encourage deeper use and create awareness of Bb?
What is the impact of Bb on teaching and learning?
Goal: to build a community-based advanced reporting tool and Blackboard Building Block that will empower institutions to become more accountable and to use data-driven decision making to enhance, optimize, and advance Blackboard Academic Suite usage in teaching and learning. Awarded in 2007. Code name: Project ASTRO.
28
The Potential of Reporting
Usage Statistics - Ability to monitor which parts of the system are used most and least frequently, enabling targeted promotion and training of specific tools or features that would benefit both faculty and students. (Student/faculty engagement, adoption, retention, and satisfaction)
Accountability - Ability to provide metrics and trend data from the system, giving accurate usage statistics to stakeholders who require data-driven decisions and measures. (Department action plans, continuous quality improvement)
Planning - The data obtained will help institutions prepare and optimize system configuration, manage change and upgrades, monitor performance, roll out new tools, communicate issues, and plan infrastructure hardware purchases for future growth. (Upgrade management and system growth)
Return on Investment - The data gathered can help maximize Blackboard use by faculty, staff, and students. (ROI and accountability)
29
Dashboard At a Glance Views
Active vs Inactive Courses including Departments Top Courses and Organizations for Week Top Tools Used by Instructors and Students Top Portal Modules Used
30
Active Courses Department vs Sub Department
Courses, Percentage, Instructors, Students Semester Trends
31
Activity in Active Courses
Page Views (Hits) by Instructor and Student Courses with Page Views vs Total Page Views Break Down by Day, Week, Month, All
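The instructor-vs-student breakdown by day or month could be tallied roughly as below; the event-log format (a timestamp plus a role per hit) is a hypothetical stand-in for whatever Project ASTRO's Building Block records:

```python
from collections import defaultdict
from datetime import date

def page_views(events, granularity="day"):
    """Tally page-view hits per (period, role), role being 'student' or 'instructor'."""
    buckets = defaultdict(int)
    for ts, role in events:
        # Bucket by ISO day or by year-month, per the requested granularity.
        period = ts.isoformat() if granularity == "day" else ts.strftime("%Y-%m")
        buckets[(period, role)] += 1
    return dict(buckets)

log = [(date(2010, 1, 18), "student"),
       (date(2010, 1, 18), "instructor"),
       (date(2010, 1, 19), "student")]
print(page_views(log))
```

Week and all-time rollups, as on the slide, would just be two more bucketing rules in the same function.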
32
Top Week Tool Page Views
Student vs Instructor
33
Course Tool Items Building Block Tool Tracking
Drill Down by Courses and Instructor Using Tools Trends
34
Activity By Course
35
Activity by User
36
GRCC – STARFISH EARLY ALERT
Identify & Detect: Manual Flags, Automatic Flags, Attendance
Intervene & Track: Instructor, Advisor, Groups of Courses and Students
Improve & Retain: Student Communication and 360° Close the Loop
More info: It costs more to recruit students than it does to retain them.
37
GRCC - Starfish Example Instructor Manually Raises Flag
The instructor can select one or more students from the student list and manually raise a flag on each. When raising a flag, the instructor writes a description of the flagged issue. This information is forwarded to someone who can help the student, as determined by the flag rule set up by the administrator.
38
GRCC - STARFISH EXAMPLE INSTRUCTOR RESPONDS TO A “FLAG SURVEY” EMAIL
Administrators can send survey requests to instructors. Clicking on the request takes the instructor to a flag survey, where they are prompted to flag any of their students who are experiencing the specified problems.
39
GRCC - STARFISH EXAMPLE AUTOMATIC FLAGS BASED ON BLACKBOARD GRADEBOOK/COURSE ACCESS
Administrators can set up auto-generated flags. The system can raise flags based on grades, average scores, and specific grade book columns in Blackboard. Flags can also be raised based on students' access to their courses in Blackboard. Additional customization is available through APIs.
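A rough sketch of such auto-generated flag rules; the thresholds, record format, and function are invented for illustration and are not Starfish's actual API:

```python
from datetime import date

def auto_flags(students, today, min_score=60.0, max_idle_days=7):
    """Raise flags for low grade-book scores or stale course access.

    Hypothetical rules standing in for administrator-configured flag rules.
    """
    flags = []
    for s in students:
        if s["avg_score"] < min_score:
            flags.append((s["name"], "low average score"))
        if (today - s["last_access"]).days > max_idle_days:
            flags.append((s["name"], "no recent course access"))
    return flags

watchlist = [{"name": "Cam", "avg_score": 55.0, "last_access": date(2010, 1, 2)},
             {"name": "Dee", "avg_score": 88.0, "last_access": date(2010, 1, 19)}]
print(auto_flags(watchlist, date(2010, 1, 20)))
```

In a deployed early-alert system each raised flag would then be routed to an instructor or advisor, per the intervention workflow on the preceding slides.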
40
MORE INFORMATION Purdue University Signals Project Site UMBC’s Blackboard Reports & CMA GRCC & Seneca College - Project ASTRO GRCC Starfish Early Alert Project Site
41
THANK YOU Questions?