Richard Baraniuk, Andrew Lan, Andrew Waters, Christoph Studer. Learning and content analytics.

Presentation transcript:

Richard Baraniuk, Andrew Lan, Andrew Waters, Christoph Studer. Learning and content analytics

learning analytics
Goal: assess and track student learning progress by analyzing their interactions with content.
The interaction data are massive, rich, and personal; they let us close the learning feedback loop.

content + learning analytics: assess and track student learning progress by analyzing their interactions with content

content + learning analytics: assume the content is organized (a “knowledge graph”)

“While such results are promising, perhaps it's a little soon to crown Inquire the future of textbooks. For starters, after two years of work the system is still only half-finished. The team plan to encode the rest of the 1400-page Campbell Biology by the end of 2013, but they expect a team of 18 biologists will be needed to do so. This raises concerns about whether the project could be expanded to cover other areas of science, let alone other subjects.”

content + learning analytics + content analytics

standard practice [figure: instructor’s grade book, one row per student (Johnny, Eve, Patty, Neelsh, Nora, Nicholas, Barbara, Agnes, Vivek, Bob, Fernando, Sarah, Hillary, Judy, Janet)]

standard practice [figure: instructor’s grade book, one row per student]
Goal: using only “grade book” data, infer:
1. the concepts underlying the questions (content analytics)
2. each student’s “knowledge” of each underlying concept (learning analytics)

from grades to concepts [figure: problems × students response matrix]
data
– graded student responses to unlabeled questions
– large matrix with entries: white = correct response, black = incorrect response, grey = unobserved
standard practice
– instructor’s “grade book” = sum/average over each column
goal
– infer underlying concepts and student understanding without question-level metadata

from grades to concepts [figure: problems × students response matrix]
data
– graded student responses to unlabeled questions
– large matrix with entries: white = correct response, black = incorrect response, grey = unobserved
goal
– infer underlying concepts and student understanding without question-level metadata
key observation
– each question involves only a small number of “concepts” (low rank)
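
A minimal sketch of this data representation, assuming a NumPy-style encoding (the toy numbers and variable names are ours, not the slides’): rows are problems, columns are students, unobserved entries are NaN, and the “grade book” of standard practice is just a per-student average over the observed entries.

    import numpy as np

    # Toy grade book: rows = problems, columns = students.
    # 1.0 = correct, 0.0 = incorrect, NaN = unobserved (not attempted).
    Y = np.array([
        [1.0,    0.0,    np.nan, 1.0],
        [np.nan, 1.0,    0.0,    1.0],
        [0.0,    np.nan, 1.0,    0.0],
    ])

    observed = ~np.isnan(Y)             # mask of graded responses
    grade_book = np.nanmean(Y, axis=0)  # standard practice: average per student (column)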

statistical model [figure: problems × students response matrix ~ Ber of an underlying ability matrix]
– the probit or logistic “coin flip” transformation converts a real-valued ability into a 0/1 response
– yields an estimate of each student’s ability to solve each problem (even unsolved problems)
– red = strong ability, blue = weak ability
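
Stated as a formula (the symbol Z for the ability matrix is our naming; the slide only shows the picture): each graded response Y_ij is a Bernoulli coin flip whose success probability is the ability value passed through an inverse probit or logistic link,

    Pr[ Y_ij = 1 ] = Phi(Z_ij)   (probit)     or     Pr[ Y_ij = 1 ] = 1 / (1 + exp(-Z_ij))   (logistic),

so a large positive Z_ij (red) means student j is very likely to solve problem i, and a large negative Z_ij (blue) means they are likely to miss it, whether or not that response was actually observed.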

SPARse Factor Analysis [figure: the problems × students ability matrix written as a product of two factors plus an offset, observed through Ber]

SPARFA [figure: (problems × students ability) = (problems × concepts) × (concepts × students) + difficulty, observed through Ber]
– each problem involves a combination of a small number of key “concepts”
– each student’s knowledge of each “concept”
– each problem’s intrinsic “difficulty”
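
A minimal generative sketch of the model as described here, assuming the notation used in the SPARFA papers (W = sparse question-concept loadings, C = concept knowledge, mu = intrinsic difficulty offset); the sizes match the Grade 8 example later in the talk, but the random numbers are purely illustrative.

    import numpy as np
    from scipy.stats import norm

    rng = np.random.default_rng(0)
    Q, N, K = 80, 145, 5                     # questions, students, concepts

    # Each question loads on only a few concepts -> sparse, nonnegative rows of W.
    W = rng.gamma(2.0, 1.0, size=(Q, K)) * (rng.random((Q, K)) < 0.2)
    C = rng.normal(size=(K, N))              # each student's knowledge of each concept
    mu = rng.normal(size=(Q, 1))             # each question's intrinsic difficulty offset

    Z = W @ C + mu                           # ability of every student on every question
    Y = (rng.random((Q, N)) < norm.cdf(Z)).astype(float)   # probit link + Bernoulli grades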

solving SPARFA [figure: problems × students matrix factorization]
– factor analyzing the grade book matrix is a severely ill-posed problem
– significant recent progress in relaxation-based optimization for sparse/low-rank problems
  – matrix-based methods (SPARFA-M)
  – Bayesian methods (SPARFA-B)
– similar to compressive sensing
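
For intuition only, a heavily simplified sketch in the spirit of the matrix-based route (this is not the published SPARFA-M algorithm; the step size, penalty weight, and iteration count are made-up illustration values): alternate gradient steps on W, C, and mu under a logistic link, with an L1-style shrinkage that keeps each question loading on only a few concepts.

    import numpy as np

    def fit_sparfa_like(Y, K, iters=500, lr=0.05, lam=0.1, seed=0):
        """Toy sparse logistic factor-model fit.

        Y   : (Q, N) array, 1 = correct, 0 = incorrect, NaN = unobserved
        K   : number of latent concepts
        lam : shrinkage weight encouraging sparse question-concept loadings
        """
        rng = np.random.default_rng(seed)
        Q, N = Y.shape
        obs = ~np.isnan(Y)
        R = np.nan_to_num(Y)                          # observed responses, 0 elsewhere

        W = np.abs(rng.normal(scale=0.1, size=(Q, K)))
        C = rng.normal(scale=0.1, size=(K, N))
        mu = np.zeros((Q, 1))

        for _ in range(iters):
            P = 1.0 / (1.0 + np.exp(-(W @ C + mu)))   # predicted success probabilities
            G = np.where(obs, P - R, 0.0)             # logistic-loss gradient on observed entries

            dW, dC, dmu = G @ C.T, W.T @ G, G.sum(axis=1, keepdims=True)
            W, C, mu = W - lr * dW, C - lr * dC, mu - lr * dmu

            # Soft-threshold and clip: keep loadings sparse and nonnegative
            # (nonnegativity follows the SPARFA papers, not this slide).
            W = np.maximum(W - lr * lam, 0.0)
        return W, C, mu

For real analyses the published SPARFA-M / SPARFA-B implementations should be used; the sketch only illustrates why the problem is tractable despite being ill-posed: the sparsity penalty and the small rank K act as regularizers that pick out one factorization among the many that fit the data.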

standard practice [figure: instructor’s grade book, one row per student]
Grade 8 science: 80 questions, 145 students, 1353 problems solved (sparsely observed); learned 5 concepts

Grade 8 science: 80 questions, 145 students, 1353 problems solved (sparsely observed), 5 concepts

[figure: learned concept graph linking questions (with estimated inherent difficulty) to concepts, alongside a student knowledge profile]
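
The two outputs pictured here can be read off the fitted factors; a short sketch, continuing the variable names from the code above (the student and question indices are arbitrary examples):

    # W: (Q, K) question-concept loadings, C: (K, N) concept knowledge, mu: (Q, 1) difficulty,
    # e.g. W, C, mu = fit_sparfa_like(Y, K=5) from the sketch above.
    knowledge_profile = C[:, 7]                     # student 7's estimated knowledge of each concept
    strongest_concepts = np.argsort(W[12])[::-1]    # concepts most associated with question 12
    P = 1.0 / (1.0 + np.exp(-(W @ C + mu)))         # predicted success on every question,
                                                    # including ones a student never attempted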

summary
– scaling up personalized learning requires that we exploit the massive collection of relatively unorganized educational content
– we can estimate content analytics on this collection as we estimate learning analytics
– related work: Rasch model, IRT (item response theory)
– integrating SPARFA into …

Andrew Lan    Andrew Waters    Christoph Studer    .com