Richard Baraniuk OpenStax Courseware
courseware vision phase 1 – reinvent the textbook
– open digital content: open education publishing platform established in 1999
– thousands of learning objects; millions of users per month
– library of 25 free and open college textbooks
– addresses the “access gap” for disadvantaged students
– 17 ecosystem partners
– 1,099 adoptions, saving 300,000 students over $30M
courseware vision phase 2 – personalize the course
goals
1. broader access to high-quality courseware
2. improve learning using modern science (machine learning, cognitive science)
3. validation in real classrooms + research
digital assessment
– in use at 12 colleges (Rice, Georgia Tech, Duke, UT El Paso, …)
– built-in research infrastructure
– integrated cognitive science principles (collaborators at Duke, UT-Austin, WashU)
– flexible platform for practice, assessment, and learning research
learning principles
– retrieval practice: retrieving information from memory is not a neutral event; rather, it changes memory
– spacing: distributing practice over time produces better long-term retention than massing practice
– feedback: closes the learning feedback loop; must be timely
applied in the courseware:
– a two-step answer process engages students in retrieval practice
– spaced concept practice
– timely, informative feedback
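To make the spacing and feedback principles concrete, here is a minimal scheduler sketch in Python. It is illustrative only, assuming a simple doubling interval; it is not OpenStax Tutor’s actual scheduling algorithm, and the function name `next_review` is made up for this example.

```python
# Toy spaced-practice scheduler (illustrative; not OpenStax Tutor's algorithm).
# A correct retrieval doubles the practice interval (spacing effect);
# an incorrect one resets it so the student retrieves again soon, with feedback.
from datetime import date, timedelta

def next_review(last_interval_days: int, answered_correctly: bool) -> timedelta:
    """Return the delay before a concept should be practiced again."""
    if answered_correctly:
        return timedelta(days=max(1, last_interval_days) * 2)  # space it out
    return timedelta(days=1)  # missed: schedule a quick, timely retry

print("next practice:", date.today() + next_review(4, answered_correctly=True))
```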
research verification
– experiment at Rice, 2012
– findings: students using cognitive science principles in OpenStax Tutor (OST) scored ½–1 GPA point better than those using standard practice homework
– flexible platform for practice, assessment, and learning research
learning analytics – assess and track student learning progress by analyzing their interactions with content
content analytics – determine relationships among content elements
learning/content analytics
– classical approach: knowledge engineering
  – domain experts pore over content, assessments, and data, tagging and building rules
  – fragile, expensive, not scalable, not transferable
– modern approach: machine learning
  – learn directly from data
  – automatic
  – robust, inexpensive, scalable, transferable
[figure: “standard practice” – the instructor’s grade book as a students × problems matrix of graded responses]
[figure: SPARFA output – questions (with estimated inherent difficulty) linked to underlying concepts, plus the knowledge profile of one student (Patty)]
[diagram: closed loop – student response data feeds ML algorithms + cognitive science, producing a personalized next task, feedback and analytics to the student, and analytics to the instructor]
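The “personalized next task” step can be made concrete with a small sketch: given an estimated knowledge profile over concepts (of the kind SPARFA produces), pick a question exercising the weakest concept. All names here (`concepts`, `profile`, `question_concepts`) are hypothetical; this is one simple policy, not the platform’s actual selection logic.

```python
# Hypothetical next-task picker: target the student's weakest concept.
import numpy as np

concepts = ["limits", "derivatives", "integrals"]
profile = np.array([0.9, 0.2, 0.6])            # estimated knowledge per concept
question_concepts = {                           # concepts each question involves
    "Q1": [0], "Q2": [1], "Q3": [1, 2], "Q4": [2],
}

weakest = int(np.argmin(profile))               # concept needing most practice
candidates = [q for q, cs in question_concepts.items() if weakest in cs]
print("weakest concept:", concepts[weakest], "-> next task:", candidates[0])
```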
cycles of innovation: curriculum (re)design → personalized learning pathways → cognitive science research → machine learning → back to curriculum (re)design
crossing the courseware chasm
[diagram: technology adoption lifecycle – technology enthusiasts and visionaries on one side of the chasm; pragmatists, conservatives, and skeptics (the mainstream market) on the other]
long-term impact
– “There is not such a cradle of democracy upon the earth as the Free Public Library” (Andrew Carnegie)
– building the personalized courseware library of the future
SPARFA
sparse factor analysis (SPARFA)
– goal: using only “grade book” data – a students × problems matrix with white = correct response, black = incorrect response, grey = unobserved – infer:
  1. the concepts underlying the questions (content analytics)
  2. each student’s “knowledge” of each underlying concept (learning analytics)
from grades to concepts
– data: graded student responses to unlabeled questions – a large students × problems matrix with entries white = correct response, black = incorrect response, grey = unobserved (see the data sketch below)
– standard practice: the instructor’s “grade book” = sum/average over each column
– goal: infer underlying concepts and student understanding without question-level metadata
– key observation: each question involves only a small number of “concepts” (low rank)
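A minimal sketch of the kind of matrix SPARFA consumes, assuming NumPy and made-up sizes; the encoding (1 = correct/white, 0 = incorrect/black, NaN = unobserved/grey) follows the slide.

```python
# Build a synthetic "grade book": problems x students, with missing entries.
import numpy as np

rng = np.random.default_rng(0)
n_problems, n_students = 6, 8
Y = rng.integers(0, 2, size=(n_problems, n_students)).astype(float)
Y[rng.random(Y.shape) < 0.3] = np.nan           # ~30% of responses unobserved

observed = ~np.isnan(Y)
print("observed entries:", int(observed.sum()), "of", Y.size)
print("standard practice (per-student column average):",
      np.nanmean(Y, axis=0).round(2))
```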
statistical model: SPARse Factor Analysis
– $Y_{i,j} \sim \mathrm{Ber}(\Phi(Z_{i,j}))$ – a probit or logistic “coin flip” transformation $\Phi$ converts each real-valued entry of $Z$ into a 0/1 graded response
– $Z = WC + \mu\,\mathbf{1}^{T}$ – each problem involves a combination of a small number of key “concepts”
– $W$: question–concept associations; $C$: each student’s knowledge of each “concept”; $\mu$: each problem’s intrinsic “difficulty”
– $Z$ yields an estimate of each student’s ability to solve each problem (even unsolved problems); in the slide’s heat map, red = strong ability, blue = weak ability
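As a sanity check on the model, here is a short sketch that generates synthetic responses from it, with made-up sizes and a logistic link; the variable names (`W`, `C`, `mu`) mirror the slide’s symbols, but the code is illustrative, not SPARFA’s implementation.

```python
# Generate synthetic graded responses from the SPARFA model:
# Y_ij ~ Ber(sigmoid(Z_ij)), with Z = W C + mu 1^T.
import numpy as np

rng = np.random.default_rng(1)
Q, K, N = 20, 3, 50                             # problems, concepts, students

W = rng.exponential(1.0, size=(Q, K))
W[rng.random((Q, K)) < 0.6] = 0.0               # each problem uses few concepts
C = rng.normal(size=(K, N))                     # concept knowledge (can be +/-)
mu = rng.normal(size=(Q, 1))                    # intrinsic problem difficulty

Z = W @ C + mu                                  # ability on every problem
P = 1.0 / (1.0 + np.exp(-Z))                    # logistic "coin flip" probability
Y = (rng.random((Q, N)) < P).astype(float)      # observed 0/1 graded responses
print("mean correctness:", Y.mean().round(2))
```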
solving SPARFA
– factor-analyzing the grade book matrix is a severely ill-posed problem
– significant recent progress in relaxation-based optimization for sparse/low-rank problems (a solver sketch follows below):
  – matrix-based methods (SPARFA-M)
  – Bayesian methods (SPARFA-B)
– similar to compressive sensing
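In the spirit of SPARFA-M, but emphatically not the published algorithm, here is a compact alternating-gradient sketch: fit $Z = WC + \mu\mathbf{1}^{T}$ to the observed entries under a logistic likelihood, using soft thresholding for sparsity and a nonnegativity projection on $W$. The step size, regularization weight, and iteration count are arbitrary placeholders.

```python
# Alternating-minimization sketch for SPARFA (illustrative, not SPARFA-M itself).
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def sparfa_sketch(Y, K=3, iters=500, lr=0.05, lam=0.02, seed=0):
    """Y: problems x students, entries in {0, 1, NaN} (NaN = unobserved)."""
    rng = np.random.default_rng(seed)
    Q, N = Y.shape
    mask = ~np.isnan(Y)                          # grey entries carry no gradient
    R = np.where(mask, np.nan_to_num(Y), 0.0)
    W = np.abs(rng.normal(size=(Q, K))) * 0.1    # sparse, nonnegative target
    C = rng.normal(size=(K, N)) * 0.1
    mu = np.zeros((Q, 1))
    for _ in range(iters):
        E = mask * (sigmoid(W @ C + mu) - R)     # logistic-loss residual
        W -= lr * (E @ C.T)                      # gradient step in W ...
        W = np.sign(W) * np.maximum(np.abs(W) - lr * lam, 0.0)  # L1 shrinkage
        W = np.maximum(W, 0.0)                   # ... projected to nonnegative
        C -= lr * (W.T @ E)                      # gradient step in C
        mu -= lr * E.sum(axis=1, keepdims=True)  # update intrinsic difficulty
    return W, C, mu
```

It can be run directly on a matrix like the synthetic `Y` built earlier; the actual SPARFA-M and SPARFA-B solvers add regularization, identifiability constraints, and convergence analysis that this sketch omits.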
[figures, reprised after SPARFA: the same grade book shown first as the raw students × problems matrix (“standard practice”), then as questions (with estimated inherent difficulty) linked to concepts alongside student knowledge profiles]
technology architecture
marketing and adoption
– research partners will co-develop: Salt Lake Community College, University of Georgia
– pilot partners will field test: The Ohio State University, Auburn University, University System of Georgia Online Courses, Central New Mexico College, South Florida State College, Maricopa CC District, Tarrant County CC
– scale-up, key elements:
  – fit into existing faculty/student workflow
  – build an ecosystem of affiliate partners
  – execute advertising and marketing campaigns
  – employ viral new media approaches
  – employ direct marketing and a customer relationship management system with proven success