Leveraging CMS Usage Data to Raise Awareness of Underperforming Students
John Fritz
www.umbc.edu/blackboard/reports (includes online “demo” video)

Problem

Recently, more institutions have pursued "academic analytics" to shed light on and improve student success. Typically associated with business and marketing—Amazon, for example, mines other people's past purchases to suggest books you might want to buy next—academic analytics can be used to profile and even predict students who may be at risk, by comparing their pre-college demographics and performance with those of past students. However, it is still not clear how best to present the lessons institutions learn from their analytics data to the current students who need them, in the form of an intervention. How do we lead students to the water they should drink without raising concerns about privacy or Big Brother watching them? Why shouldn't they think they are the exceptions to the rules our data models suggest? As John Campbell, a leader in academic analytics, has asked: what is an institution's "ethical obligation of knowing"?

UMBC's Blackboard Activity Reports

Like other institutions, UMBC has found that a relationship may exist between student success, as defined by grades, and activity in the campus's online course management system (CMS). Specifically, over two academic years (2007–09), UMBC's "Most Active Blackboard Courses" reports show that students earning a D or F in 81 courses used the CMS about 35 percent less than students earning a grade of C or higher.
- SU2009: 9 courses, 42 percent less
- SP2009: 11 courses, 47 percent less
- FA2008: 13 courses, 40 percent less
- SU2008: 7 courses, 33 percent less
- SP2008: 26 courses, 32 percent less
- FA2007: 15 courses, 36 percent less

Key Questions

While the sample of courses needs to be expanded and studied further, these student CMS activity patterns raise some intriguing questions:

- Does this usage pattern hold true throughout the semester?
- If so, how might students' awareness, motivation and performance change if they could know this information sooner?

To answer these questions, we created the following "self service" tools students can use to compare their own activity against an anonymous summary of their peers:

- Check My Activity (CMA): lets students see how their CMS use compares with that of other students in the same course.
- Grade Distribution Report (GDR): shows students their grade and CMS activity compared to other students in the same course, provided the instructor enters a grade in the grade book.

The purpose of these tools is to provide early and frequent system feedback directly to students, so they are the first to know if and how their CMS activity might be an indicator of their engagement in a course. During Fall 2008 and Spring 2009, we studied how students used the CMA and GDR tools in SCI100, a 200-student lab science course for non-majors enrolled roughly evenly across all class standings (freshmen through seniors). In addition to confirming a national study showing that students value checking their grades more than any other CMS function, we found the following:

- 28 percent of SCI100 students who used the CMA tool were "surprised" by their own data.
- 54 percent said they would be "more inclined" to use the CMA before future assignments were due if they had access to a GDR showing Blackboard activity for past assignments.
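The per-semester grade-gap figures above reduce to a simple comparison of average CMS activity between grade bands. A minimal sketch of that computation, using hypothetical hit counts (the real UMBC reports aggregate actual Blackboard usage logs per course):

```python
# Sketch of the grade-band activity comparison described above.
# The activity lists below are hypothetical examples, not UMBC data.

def percent_less(df_activity, cf_activity):
    """Percent less CMS activity for D/F students vs. students earning C or higher."""
    avg_df = sum(df_activity) / len(df_activity)
    avg_cf = sum(cf_activity) / len(cf_activity)
    return round(100 * (avg_cf - avg_df) / avg_cf, 1)

# Hypothetical per-student course-site hit counts:
df_hits = [120, 140, 130]   # students earning D or F (avg 130)
cf_hits = [180, 220, 200]   # students earning C or higher (avg 200)
print(percent_less(df_hits, cf_hits))  # 35.0
```

With these illustrative numbers the D/F group shows 35 percent less activity, matching the headline two-year figure reported above.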
Student Use of the CMA & GDR Tools

While SCI100 student feedback has been encouraging, adoption by the larger student body has been slow since the tools were announced in Spring 2008 and reported in the student newspaper in Spring 2009. To help, a series of Fall 2009 Blackboard system-wide announcements encouraged students to use the CMA and GDR tools. The SCI100 studies and the FA2009 Blackboard system announcements suggest that students who are not proactively introduced to the CMA and GDR tools may not understand how or why they would use them. Also, since these tools are not part of the delivered Blackboard software, they are not easy to find; they are available as "self service" reports on our main Blackboard Reports site at www.umbc.edu/blackboard/reports.

Future Plans

- Make the CMA & GDR tools easier to find and use by developing a My Activity "Building Block" or direct links from our myUMBC portal (used by 90 percent of undergraduates to access Blackboard).
- Find out why students return to the CMA & GDR tools and what, if anything, their use can mean for changing student awareness, behavior and academic performance.
- Encourage other institutions to download, install and test UMBC's CMA & GDR code, and collaborate with others doing similar work (e.g., Project ASTRO).
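To illustrate the kind of comparison the CMA performs, here is a minimal sketch of showing one student's activity against an anonymous summary of peers in the same course. The data model and function name are hypothetical; the actual CMA reports query Blackboard usage logs:

```python
# Hypothetical sketch of a "Check My Activity" style peer comparison.
# Student IDs and hit counts are made up for illustration.

from statistics import mean

def check_my_activity(activity_by_student, student_id):
    """Return one student's hit count alongside an anonymous course summary."""
    my_hits = activity_by_student[student_id]
    peer_hits = [h for s, h in activity_by_student.items() if s != student_id]
    return {
        "my_hits": my_hits,
        "course_avg": round(mean(activity_by_student.values()), 1),
        "above_peer_avg": my_hits > mean(peer_hits),
    }

course = {"s1": 210, "s2": 95, "s3": 160, "s4": 130}
print(check_my_activity(course, "s2"))
```

Note that the summary exposes only aggregates (a course average and a relative standing), never other students' individual activity, which is how the CMA keeps peer data anonymous.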