1
Richard Baraniuk OpenStax Courseware
2
courseware vision phase 1 – reinvent the textbook $$$
3
open digital content
– open ed publishing platform established in 1999
– 25,000 learning objects
– millions of users per month
library of 25 free and open college textbooks
– addresses the "access gap" for disadvantaged students
– 17 ecosystem partners
– 1,099 adoptions, saving 300,000 students over $30M
5
courseware vision phase 2 – personalize the course
7
goals
1. broader access to high-quality courseware
2. improved learning using modern science (machine learning, cognitive science)
3. validation in real classrooms + research
9
digital assessment
– in use at 12 colleges (Rice, Georgia Tech, Duke, UT El Paso, …)
– built-in research infrastructure
– integrated cognitive science principles (collaborators at Duke, UT-Austin, WashU)
– flexible platform for practice, assessment, and learning research
10
learning principles
retrieval practice – retrieving information from memory is not a neutral event; rather, it changes memory
spacing – distributing practice over time produces better long-term retention than massing practice
feedback – closes the learning feedback loop; must be timely
in the courseware: a two-step answer process engages students in retrieval practice, concept practice is spaced, and feedback is timely and informative (see the sketch below)
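To make the spacing principle concrete, here is a minimal Leitner-style scheduler sketch in Python. It is an illustration only, not the scheduler OpenStax Tutor uses, and the box intervals are assumed values.

```python
from datetime import date, timedelta

# Illustrative Leitner-style spacing: a correct retrieval promotes a concept to
# a longer review interval; an error resets it to the shortest interval.
INTERVALS = [1, 3, 7, 14, 30]  # assumed days between retrieval attempts per box

def schedule_next(box: int, correct: bool, today: date) -> tuple[int, date]:
    """Return the concept's new box index and its next practice date."""
    box = min(box + 1, len(INTERVALS) - 1) if correct else 0
    return box, today + timedelta(days=INTERVALS[box])

# A correct answer in box 1 moves the concept to box 2, reviewed in 7 days:
# schedule_next(1, True, date.today())
```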
11
research verification
– experiment at Rice, 2012
– findings: students using cognitive science principles in OST scored ½–1 GPA point better than those using standard practice homework
– flexible platform for practice, assessment, and learning research
12
learning analytics – assess and track student learning progress by analyzing students' interactions with content
13
content analytics – determine relationships among content elements
14
learning/content analytics
classical approach – knowledge engineering
– domain experts pore over content, assessments, and data, tagging and building rules
– fragile, expensive, not scalable, not transferable
modern approach – machine learning
– learn directly from data
– automatic
– robust, inexpensive, scalable, transferable
15
standard practice
[Figure: grade book matrix of students (Johnny, Eve, Patty, Neelsh, Nora, Nicholas, Barbara, Agnes, Vivek, Bob, Fernando, Sarah, Hillary, Judy, Janet) by problems]
16
[Figure: Patty's student knowledge profile – questions (with estimated inherent difficulty) linked to underlying concepts, with per-concept knowledge scores 87, 55, 23, 93, 62]
17
[Diagram: data feeds ML algorithms and cognitive science models, which produce a personalized next task for each student, analytics for the instructor, and feedback and analytics for the student]
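As a sketch of the "personalized next task" step, one common heuristic (an assumption here, not a documented OST policy) is to pick the unanswered question whose predicted success probability is closest to a target sweet spot:

```python
import numpy as np

def pick_next_task(p_correct: np.ndarray, answered: np.ndarray,
                   target: float = 0.7) -> int:
    """Choose the unanswered question whose predicted probability of a
    correct response (from any learner model) is closest to `target`."""
    score = np.abs(p_correct - target)  # distance from the desired difficulty
    score[answered] = np.inf            # never re-ask an answered question
    return int(np.argmin(score))

# Example: probabilities for three questions, none answered yet -> question 1
# pick_next_task(np.array([0.95, 0.65, 0.20]), np.zeros(3, dtype=bool))
```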
18
cycles of innovation
[Diagram: curriculum (re)design → personalized learning pathways → cognitive science research → machine learning → back to curriculum (re)design]
19
crossing the courseware chasm
[Figure: technology adoption life cycle – technology enthusiasts and visionaries on one side of the chasm; the mainstream market of pragmatists, conservatives, and skeptics on the other]
21
long-term impact
"There is not such a cradle of democracy upon the earth as the Free Public Library"
building the personalized courseware library of the future
22
SPARFA
23
sparse factor analysis
goal: using only "grade book" data (white: correct response, black: incorrect response, grey: unobserved), infer
1. the concepts underlying the questions (content analytics)
2. each student's "knowledge" of each underlying concept (learning analytics)
[Figure: problems × students grade book matrix]
24
from grades to concepts
data – graded student responses to unlabeled questions, a large problems × students matrix with entries white (correct response), black (incorrect response), grey (unobserved)
standard practice – the instructor's "grade book" = sum/average over each column
goal – infer underlying concepts and student understanding without question-level metadata
25
from grades to concepts
data – graded student responses to unlabeled questions, a large matrix with entries white (correct response), black (incorrect response), grey (unobserved)
goal – infer underlying concepts and student understanding without question-level metadata
key observation – each question involves only a small number of "concepts", so the matrix is low rank (see the representation sketch below)
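In code, the grade book is just a partially observed binary matrix. A minimal NumPy representation (the grades here are made up) uses NaN for the grey, unobserved entries:

```python
import numpy as np

# Hypothetical 5-problem x 4-student grade book:
# 1.0 = correct (white), 0.0 = incorrect (black), NaN = unobserved (grey)
Y = np.array([
    [1.0,    0.0,    np.nan, 1.0],
    [0.0,    0.0,    1.0,    np.nan],
    [1.0,    np.nan, 1.0,    0.0],
    [np.nan, 1.0,    0.0,    1.0],
    [1.0,    1.0,    np.nan, 0.0],
])

# "standard practice": one score per student, averaging observed entries only
col_scores = np.nanmean(Y, axis=0)
```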
26
statistical model
each grade book entry is modeled as a Bernoulli draw, $Y_{i,j} \sim \mathrm{Ber}(\Phi(Z_{i,j}))$: the latent matrix $Z$ estimates each student's ability to solve each problem (even unsolved problems), and the link $\Phi$ (a probit or logistic "coin flip" transformation) converts it to a 0/1 response probability
[Figure: problems × students ability matrix; red = strong ability, blue = weak ability]
27
SPARse Factor Analysis
the latent matrix factors as $Z = WC + \mu\mathbf{1}^{T}$, with $Y_{i,j} \sim \mathrm{Ber}(\Phi(Z_{i,j}))$
[Figure: the problems × students matrix decomposes into a product of two smaller factors plus an additive difficulty term]
28
SPARFA
$Y_{i,j} \sim \mathrm{Ber}(\Phi(Z_{i,j})), \quad Z = WC + \mu\mathbf{1}^{T}$
– $W$ (problems × concepts): each problem involves a combination of a small number of key "concepts"
– $C$ (concepts × students): each student's knowledge of each "concept"
– $\mu$ (problems × 1): each problem's intrinsic "difficulty"
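A minimal NumPy sketch of this generative model, with assumed sizes, sparsity level, and priors, and a probit link for $\Phi$:

```python
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(0)
Q, N, K = 20, 15, 3                    # problems, students, concepts (assumed)

# W: sparse, nonnegative problem-concept associations
W = rng.gamma(1.0, 1.0, (Q, K)) * (rng.random((Q, K)) < 0.3)
C = rng.normal(0.0, 1.0, (K, N))       # each student's knowledge of each concept
mu = rng.normal(0.0, 1.0, (Q, 1))      # each problem's intrinsic difficulty offset

Z = W @ C + mu                         # latent ability-to-answer matrix
P = norm.cdf(Z)                        # probit link: probability of a correct answer
Y = (rng.random((Q, N)) < P).astype(float)   # simulated grade book
Y[rng.random((Q, N)) < 0.2] = np.nan         # hide ~20% of entries (unobserved)
```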
29
solving SPARFA
– factor analyzing the grade book matrix is a severely ill-posed problem
– significant recent progress in relaxation-based optimization for sparse/low-rank problems, similar to compressive sensing:
– matrix-based methods (SPARFA-M)
– Bayesian methods (SPARFA-B)
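For flavor, here is a toy alternating proximal-gradient loop in the spirit of SPARFA-M: a logistic link, an L1 penalty on $W$ for sparsity, and a nonnegativity clip. This is a simplified sketch, not the published algorithm; the step size and penalty weight are assumed.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def sparfa_m_sketch(Y, K, lam=0.1, lr=0.05, iters=500, seed=0):
    """Toy SPARFA-M-style fit of Y ~ Ber(sigmoid(W @ C + mu)).
    Y: Q x N array of {1.0, 0.0, np.nan}; returns (W, C, mu)."""
    rng = np.random.default_rng(seed)
    Q, N = Y.shape
    W = np.abs(rng.normal(0.0, 0.1, (Q, K)))
    C = rng.normal(0.0, 0.1, (K, N))
    mu = np.zeros((Q, 1))
    mask = ~np.isnan(Y)
    Yf = np.nan_to_num(Y)
    for _ in range(iters):
        # gradient of the logistic loss w.r.t. Z, on observed entries only
        R = (sigmoid(W @ C + mu) - Yf) * mask
        C -= lr * (W.T @ R)                       # C-step: gradient descent
        R = (sigmoid(W @ C + mu) - Yf) * mask
        W -= lr * (R @ C.T)                       # W-step: gradient step...
        W = np.maximum(W - lr * lam, 0.0)         # ...then L1 prox + clip to >= 0
        mu -= lr * R.sum(axis=1, keepdims=True)   # intrinsic difficulty update
    return W, C, mu
```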
30
standard practice
[Figure: the same students × problems grade book matrix shown earlier]
31
[Figure: student knowledge profile – questions (with estimated inherent difficulty) linked to underlying concepts, with per-concept knowledge scores 87, 55, 23, 93, 62]
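The 0-100 scores in a profile like this can be read as squashed versions of a student's latent concept-knowledge values (a column of $C$). A minimal sketch, assuming a logistic squash, which is a presentation choice and not necessarily what OST does:

```python
import numpy as np

def knowledge_scores(c_j: np.ndarray) -> np.ndarray:
    """Map a student's latent concept-knowledge vector (a column of C)
    to 0-100 display scores via a logistic squash (assumed scaling)."""
    return np.round(100.0 / (1.0 + np.exp(-c_j))).astype(int)

# knowledge_scores(np.array([1.9, 0.2, -1.2, 2.6, 0.5]))
# -> array([87, 55, 23, 93, 62])
```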
32
technology architecture
33
marketing and adoption
research partners will co-develop
– Salt Lake Community College, University of Georgia
pilot partners will field test
– The Ohio State University, Auburn University, University System of Georgia-Online Courses, Central New Mexico College, South Florida State College, Maricopa CC District, Tarrant County CC
scale-up – key elements
– fit into existing faculty/student workflow
– build an ecosystem of affiliate partners
– execute advertising and marketing campaigns
– employ viral new-media approaches
– employ direct marketing and a customer relationship management system
proven success 2012–2014