OAAI: Deploying an Open Ecosystem for Learning Analytics

Presentation transcript:

OAAI: Deploying an Open Ecosystem for Learning Analytics Use Hash Tags > #EDU13 and #OAAI

Open Academic Analytics Initiative: Deploying an Open Ecosystem for Learning Analytics. Josh Baron, Senior Academic Technology Officer and Principal Investigator, OAAI, Marist College. Josh.Baron@Marist.edu. Use Hash Tags > #EDU13 and #OAAI

36% Reference: Integrated Postsecondary Education Data System (IPEDS)

Four Year Graduation Rate for Savannah State University (2010): 13%. Shockingly, it was as low as 3% in 2002!

Open Academic Analytics Initiative. EDUCAUSE Next Generation Learning Challenges (NGLC), funded by the Bill & Melinda Gates Foundation: $250,000 over a 15-month period. Goal: leverage learning analytics to create an open-source academic early alert system and research "scaling factors." OK, so what is the OAAI and how are we working to address this problem? Our goal is to leverage Big Data to create an open-source academic early alert system that allows us to predict which students are at risk of not completing a course (and to do so early in the semester), and then deploy an intervention to help those students succeed.

OAAI Early Alert System Overview ("Creating an Open Academic Early Alert System"). Step #1: a predictive model is developed using historical data. SIS data: student demographic data (age, gender, etc.) and student aptitude data (SATs, current GPA, etc.). LMS data: Sakai event log data and Sakai gradebook data. Predictive model scoring then produces an Academic Alert Report (AAR) that identifies students "at risk" of not completing the course, after which an intervention is deployed: "Awareness" messaging or the Online Academic Support Environment (OASE). I'll talk about our intervention strategies in a little more detail a bit later on in the presentation…
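To make the scoring step concrete, here is a minimal Python sketch of what producing an AAR could look like, assuming a trained scikit-learn classifier; the feature and column names are illustrative placeholders, not the actual OAAI schema.

import pandas as pd
from sklearn.linear_model import LogisticRegression

# Illustrative features drawn from SIS (demographics, aptitude) and the
# Sakai LMS (event-log counts, gradebook percentage); not the real schema.
FEATURES = ["gpa", "sat_total", "age", "lms_events", "gradebook_pct"]

def build_alert_report(model: LogisticRegression, students: pd.DataFrame) -> pd.DataFrame:
    """Score current-semester students and flag those at risk of non-completion."""
    risk = model.predict_proba(students[FEATURES])[:, 1]
    report = students[["student_id"]].copy()
    report["risk_score"] = risk
    report["at_risk"] = risk >= 0.5  # cutoff would be tuned on historical data
    return report.sort_values("risk_score", ascending=False)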

OAAI Goals and Milestones. Build a learning analytics-based early alert system: the Sakai Collaboration and Learning Environment; a secure data capture process for extracting LMS data; the Pentaho Business Intelligence Suite (open-source data mining, integration, analysis and reporting); and the OAAI predictive model released under an open license in the Predictive Model Markup Language (PMML). Research learning analytics scaling factors: How "portable" are predictive models? What intervention strategies are most effective? OAAI is building on this prior success through two primary "thrusts"… the first is the creation of an "open ecosystem" for academic analytics… this ecosystem contains the Sakai… We will also be open sourcing our predictive model and releasing it using a standard markup language, which will allow… The other major thrust is researching critical scaling factors… so we are looking at this issue of "portability"… how can you effectively take a model developed for one academic context (e.g. a liberal arts school) and "port" it to another context (e.g. a community college)? We are also looking at which intervention strategies are most effective, particularly those which use… A minimal sketch of the PMML step follows below.
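Because the slide mentions releasing the model in PMML, here is a minimal sketch of the export/score round trip, assuming the third-party sklearn2pmml and pypmml Python packages (neither is named by the OAAI team, and the data, feature names and file name are invented; sklearn2pmml also requires a Java runtime).

import pandas as pd
from sklearn.linear_model import LogisticRegression
from sklearn2pmml import sklearn2pmml
from sklearn2pmml.pipeline import PMMLPipeline
from pypmml import Model

# Toy historical data; a real model would train on full SIS + LMS features.
X = pd.DataFrame({"gpa": [2.4, 3.6, 2.1, 3.9], "lms_events": [37, 120, 15, 140]})
y = [1, 0, 1, 0]  # 1 = did not complete the course

pipeline = PMMLPipeline([("clf", LogisticRegression())])
pipeline.fit(X, y)
sklearn2pmml(pipeline, "oaai_model.pmml")  # model is now shareable under an open license

# Any PMML-aware consumer can score new students from the same file:
model = Model.load("oaai_model.pmml")
print(model.predict({"gpa": 2.8, "lms_events": 42}))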

Research Design. Deployed the OAAI system to 2,200 students across four institutions: two community colleges and two Historically Black Colleges and Universities. Participating instructors each taught three sections: one section was the control, the other two were treatment groups.

Research Design. Each instructor received an Academic Alert Report (AAR) three times during the semester: at the 25%, 50% and 75% points of the term.
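For reference, the three checkpoint dates are simple to derive from the term calendar; a small sketch with made-up dates:

from datetime import date, timedelta

def aar_dates(term_start: date, term_end: date):
    """Dates falling 25%, 50% and 75% of the way through the term."""
    term_days = (term_end - term_start).days
    return [term_start + timedelta(days=round(term_days * f)) for f in (0.25, 0.50, 0.75)]

print(aar_dates(date(2012, 1, 17), date(2012, 5, 11)))  # three AAR delivery dates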

OAAI Predictive Model: Development and Portability Research Findings

Predictive Model Training

Predictors of Student Risk LMS predictors were measured relative to course averages. Some predictors were discarded if not enough data was available.
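A minimal sketch of both ideas on this slide, assuming pandas DataFrames with invented column names (student_id, course_id, event_count):

import pandas as pd

def add_relative_activity(events: pd.DataFrame) -> pd.DataFrame:
    """Express each student's LMS event count relative to the course average."""
    out = events.copy()
    course_mean = out.groupby("course_id")["event_count"].transform("mean")
    out["event_ratio"] = out["event_count"] / course_mean
    return out

def drop_sparse_predictors(features: pd.DataFrame, min_coverage: float = 0.8) -> pd.DataFrame:
    """Discard predictor columns with too little data, per the slide."""
    keep = features.columns[features.notna().mean() >= min_coverage]
    return features[keep]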

Predictive Power

Initial Portability Research Findings Compared predictive elements and correlations in the Marist data to what was found by Dr. John Campbell in his dissertation research at Purdue.

Institutional Profiles

Spring ’12 Portability Findings

Fall ’12 Portability Findings

Intervention Strategies Design, Deployment and Research Findings


Awareness Messaging. "Based on your performance on recent graded assignments and exams, as well as other factors that tend to predict academic success, I am becoming worried about your ability to successfully complete this class. I am reaching out to offer some assistance and to encourage you to consider taking steps to improve your performance. Doing so early in the semester will increase the likelihood of your successfully completing the class and avoid negatively impacting your academic standing."

Online Academic Support Environment (OASE)

OASE Design Framework. Follows online course design concepts: Learner-Content Interactions, Learner-Facilitator Interactions, and Learner-Mentor Interactions. Leverages Open Educational Resources. The approach we have taken is to create a "design framework" for the OASE.

Design Frame #1: Learner-Content Interactions. Self-assessment instruments assist students in self-identifying areas of weakness. OER content for remediation focuses on core subjects (math, writing). OER content for learning skills covers time management, test-taking strategies, etc.

Design Frame #2: Learner-Facilitator Interaction. An Academic Learning Specialist connects learners to people and services, promotes services and special events, and moderates discussions on pertinent topics (example: "Your first semester at college"). Note that the goal is not to replace tutoring services or to answer a lot of subject matter questions online.

Design Frame #3: Learner-Mentor Interactions. Online interactions are facilitated by a student "mentor," who leads weekly "student perspective" discussions and an online "student lounge" for informal interactions (letting others know about study groups, etc.).

Intervention Research Findings: Final Course Grades. Analysis showed a statistically significant positive impact on final course grades, with no difference between treatment groups. We saw a larger impact in spring than in fall, with a similar trend among low-income students.

Intervention Research Findings: Content Mastery. Students in the intervention groups were statistically more likely to "master the content" (defined as a grade of C or better) than those in the control groups. Results were similar for low-income students; one way to test such a comparison is sketched below.
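For illustration, the significance test behind a mastery claim like this can be run on a 2x2 contingency table; the counts below are invented placeholders, not OAAI results.

from scipy.stats import chi2_contingency

#        mastered (C or better), did not master
table = [[420, 180],  # intervention sections (invented counts)
         [350, 250]]  # control sections (invented counts)
chi2, p, dof, expected = chi2_contingency(table)
print(f"chi2={chi2:.2f}, p={p:.4g}")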

Intervention Research Findings: Withdrawals. Students in intervention groups withdrew more frequently than controls, possibly because students acted early enough to avoid withdrawal penalties. This is consistent with findings from Purdue University.

Instructor Feedback. "Not only did this project directly assist my students by guiding them to resources to help them succeed, but as an instructor, it changed my pedagogy; I became more vigilant about reaching out to individual students and providing them with outlets to master necessary skills. P.S. I have to say that this semester, I received the highest volume of unsolicited positive feedback from students, who reported that they felt I provided them exceptional individual attention!"

Future Research Interests. Factors that impact intervention effectiveness: intervention immunity (students who do not respond to the first intervention never do) and student engagement (how can we increase the level of engagement with the OASE?). Can predictive models be customized for specific delivery methods and programs/subjects? Can learning analytics identify "at risk" students who would otherwise not be identified?

Social Networks Adapting Pedagogical Practice (SNAPP)

Society for Learning Analytics Research (SOLAR): the Learning Analytics and Knowledge Conference (LAK); the international journal of the Society for Learning Analytics Research; SOLAR Flares (regional conferences); SOLAR Storms (distributed research lab); MOOCs on learning analytics. http://www.solaresearch.org/

Questions? Josh Baron, Josh.Baron@Marist.edu, @JoshBaron

Big Questions about Big Data. Where are Learning Analytics and Big Data in the "hype cycle"? Where will they end up? What do you see as the biggest ethical issues surrounding Learning Analytics? How important is "openness" in Learning Analytics: open standards? Open licensing? Could Learning Analytics end up driving the wrong learning outcomes?

Questions? Josh Baron, Josh.Baron@Marist.edu, @JoshBaron

Learning Analytics Ethics. "The obligation of knowing" (John Campbell): if we have the data and tools to improve student success, are we obligated to use them? Consider this > If a student has a 13% chance of passing a course, should they be dropped? What about 3%? Who owns the data: the student? The institution? Should students be allowed to "opt out"? Consider this > Is it fair to the other students if, by opting out, the predictive model's power drops? What do we reveal to students? To instructors? Consider this > If we tell a student in week three that they have a 9% chance of passing, what will they do? Will instructors begin to "profile" students? If we determine a student is "at risk," are we obligated to intervene?