1
learning and content analytics
Richard Baraniuk, Mr. Lan, Andrew Waters, Christoph Studer
2
learning analytics
Goal: assess and track student learning progress by analyzing their interactions with content
data (massive, rich, personal)
close the learning feedback loop
3
content + learning analytics: assess and track student learning progress by analyzing their interactions with content
4
content + learning analytics: assume content is organized (a “knowledge graph”)
5
http://www.newscientist.com/article/mg21528765.700-the-intelligent-textbook-that-helps-students-learn.html “While such results are promising, perhaps it's a little soon to crown Inquire the future of textbooks. For starters, after two years of work the system is still only half-finished. The team plan to encode the rest of the 1400-page Campbell Biology by the end of 2013, but they expect a team of 18 biologists will be needed to do so. This raises concerns about whether the project could be expanded to cover other areas of science, let alone other subjects.”
6
content + learning analytics → content analytics
7
standard practice
[figure: a class grade book for a roster of 15 students]
8
standard practice
[figure: the class grade book]
Goal: using only “grade book” data, infer:
1. the concepts underlying the questions (content analytics)
2. each student’s “knowledge” of each underlying concept (learning analytics)
10
from grades to concepts
[figure: problems × students grade book matrix]
data
–graded student responses to unlabeled questions
–a large matrix with entries: white = correct response, black = incorrect response, grey = unobserved
standard practice
–instructor’s “grade book” = sum/average over each column
goal
–infer underlying concepts and student understanding without question-level metadata
11
from grades to concepts
[figure: problems × students grade book matrix]
data
–graded student responses to unlabeled questions
–a large matrix with entries: white = correct response, black = incorrect response, grey = unobserved
goal
–infer underlying concepts and student understanding without question-level metadata
key observation
–each question involves only a small number of “concepts” (low rank)
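Concretely, the grade book above can be held as a matrix with an explicit mask for the unobserved (grey) entries. A minimal sketch with made-up grades (all values hypothetical):

```python
import numpy as np

# Hypothetical grade book: rows = problems, columns = students.
# 1.0 = correct (white), 0.0 = incorrect (black), NaN = unobserved (grey).
Y = np.array([
    [1.0,    0.0,    np.nan, 1.0],
    [np.nan, 1.0,    0.0,    1.0],
    [0.0,    np.nan, 0.0,    1.0],
    [1.0,    1.0,    np.nan, 0.0],
    [0.0,    1.0,    1.0,    np.nan],
])
observed = ~np.isnan(Y)          # mask of graded responses

# standard practice: the instructor's grade = average over each column,
# skipping the unobserved entries
grades = np.nanmean(Y, axis=0)
```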
12
statistical model
[figure: problems × students matrix of ability estimates; red = strong ability, blue = weak ability]
Y ~ Ber(Φ(Z)): a probit or logistic “coin flip” transformation converts each real-valued ability score to a 0/1 graded response
Z: estimate of each student’s ability to solve each problem (even unsolved problems)
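The “coin flip” transformation can be sketched as follows: a probit link maps each real-valued ability score to a success probability, and a Bernoulli draw yields the 0/1 grade (matrix sizes and seeds below are made up):

```python
import numpy as np
from math import erf, sqrt

def probit(z):
    """Standard normal CDF: maps an ability score to the probability
    of a correct response (the probit link)."""
    return 0.5 * (1.0 + erf(z / sqrt(2.0)))

rng = np.random.default_rng(0)
Z = rng.normal(size=(5, 8))                  # abilities: 5 problems x 8 students
P = np.vectorize(probit)(Z)                  # success probabilities in (0, 1)
Y = (rng.random(P.shape) < P).astype(int)    # simulated 0/1 grade book
```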
13
SPARse Factor Analysis (SPARFA)
[figure: the problems × students ability matrix factors as Z = WC + M, with Y ~ Ber(Φ(Z))]
14
SPARFA
[figure: Z = WC + M, a (problems × concepts) matrix W times a (concepts × students) matrix C plus a difficulty term M]
W: each problem involves a combination of a small number of key “concepts”
C: each student’s knowledge of each “concept”
M: each problem’s intrinsic “difficulty”
Y ~ Ber(Φ(Z))
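Under the factorization the slide describes, a synthetic grade book can be generated as Y ~ Ber(Φ(WC + M)). A sketch with hypothetical sizes (20 questions, 15 students, 3 concepts) and made-up distributions:

```python
import numpy as np
from math import erf, sqrt

rng = np.random.default_rng(1)
Q, N, K = 20, 15, 3   # questions, students, concepts (hypothetical)

# W: sparse, nonnegative question-concept associations (each question
# involves only a few key concepts)
W = rng.exponential(1.0, size=(Q, K)) * (rng.random((Q, K)) < 0.3)
C = rng.normal(size=(K, N))      # each student's knowledge of each concept
mu = rng.normal(size=(Q, 1))     # each question's intrinsic difficulty

Z = W @ C + mu                   # latent ability to solve each problem
Phi = np.vectorize(lambda z: 0.5 * (1.0 + erf(z / sqrt(2.0))))
Y = (rng.random((Q, N)) < Phi(Z)).astype(int)   # simulated graded responses
```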
15
solving SPARFA
factor analyzing the grade book matrix is a severely ill-posed problem
significant recent progress in relaxation-based optimization for sparse/low-rank problems, similar to compressive sensing:
–matrix-based methods (SPARFA-M)
–Bayesian methods (SPARFA-B)
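In the spirit of the matrix-based approach, a toy alternating gradient scheme shows the shape of such a solver. This is only a sketch, not the published SPARFA-M algorithm: it uses a logistic link, a soft-threshold step for sparsity, and a nonnegativity clamp on W; all parameter values are hypothetical:

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-np.clip(z, -30.0, 30.0)))

def bernoulli_loss(Y, W, C, mu):
    """Average negative log-likelihood of the 0/1 grades under the model."""
    P = sigmoid(W @ C + mu)
    return -np.mean(Y * np.log(P + 1e-12) + (1 - Y) * np.log(1 - P + 1e-12))

def fit_sparfa_sketch(Y, K, iters=500, lr=0.01, lam=0.01, seed=0):
    """Toy alternating gradient descent for Y ~ Ber(sigmoid(W C + mu)).
    Not the published SPARFA-M solver, just its structure: alternate
    steps in W and C, shrink W toward sparsity, keep W nonnegative."""
    rng = np.random.default_rng(seed)
    Q, N = Y.shape
    W = np.abs(rng.normal(0.0, 0.1, size=(Q, K)))
    C = rng.normal(size=(K, N))
    mu = np.zeros((Q, 1))
    for _ in range(iters):
        R = sigmoid(W @ C + mu) - Y          # dLoss/dZ for the logistic link
        W -= lr * (R @ C.T)                  # step in W, holding C fixed
        W = np.maximum(W - lr * lam, 0.0)    # soft-threshold + nonnegativity
        R = sigmoid(W @ C + mu) - Y
        C -= lr * (W.T @ R)                  # step in C, holding W fixed
        mu -= lr * R.sum(axis=1, keepdims=True) / N
    return W, C, mu

# demo on a random 0/1 grade book (hypothetical sizes)
rng = np.random.default_rng(3)
Y = (rng.random((20, 15)) < 0.6).astype(float)
loss0 = bernoulli_loss(Y, *fit_sparfa_sketch(Y, K=3, iters=0))  # untrained
W, C, mu = fit_sparfa_sketch(Y, K=3)
loss1 = bernoulli_loss(Y, W, C, mu)                             # after fitting
```

The fit is to the observed grades only in spirit; handling the unobserved (grey) entries would require masking the residual R, which is omitted here for brevity.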
16
standard practice
[figure: the class grade book]
Grade 8 science: 80 questions, 145 students, 1353 problems solved (sparsely)
learned 5 concepts
17
Grade 8 science: 80 questions, 145 students, 1353 problems solved (sparsely), 5 concepts
18
[figure: learned graph linking questions (with estimated inherent difficulty) to concepts, plus one student’s knowledge profile: 87, 55, 23, 93, 62]
19
summary
scaling up personalized learning requires that we exploit the massive collection of relatively unorganized educational content
we can estimate content analytics on this collection as we estimate learning analytics
related work: Rasch model, IRT
integrating SPARFA into
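For reference, the Rasch model mentioned under related work is the one-parameter special case: a single ability per student, a single difficulty per question, and a logistic link. A minimal sketch (function names are illustrative):

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def rasch_prob(theta, b):
    """Rasch (1PL IRT) model: probability that a student with ability
    theta answers a question of difficulty b correctly."""
    return sigmoid(theta - b)

# a student of average ability on an average-difficulty question
p = rasch_prob(0.0, 0.0)   # 0.5
```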
20
Mr. Lan, Andrew Waters, Christoph Studer