CRESST ONR/NETC Meetings, 17-18 July 2003. ONR Advanced Distributed Learning: Ontologies and Bayesian Networks in Assessment. Greg Chung and Bill Bewley, UCLA/CRESST.

Presentation transcript:

CRESST ONR/NETC Meetings, July 2003, v1
17 July 2003
ONR Advanced Distributed Learning
Ontologies and Bayesian Networks in Assessment
Greg Chung and Bill Bewley
UCLA/CRESST
© 2003 Regents of the University of California

Slide 1: Problem Statement
How do you link information from assessments to individualized instructional recommendations in a DL context?
– Content is going online
– Assessments are going online
– Couple content and assessment

Slide 2: Ontologies
An ontology is a conceptual representation of a domain, expressed in terms of concepts and the relationships among the concepts
– Support knowledge capture, representation, sharing
– Fielded technology
  - Medical, engineering, e-commerce, military…
  - Development tools and APIs available today (e.g., Protégé)
  - Support rapid development and testing of prototypes
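To make the definition concrete, here is a minimal sketch (in Python, purely illustrative and not the project's actual representation) of an ontology fragment as a set of concepts plus typed relations; the concept and relation names are assumed for the example.

```python
# Minimal sketch of an ontology fragment as concepts and typed relations
# (triples). Names are illustrative only, not the CRESST ontology's schema.

concepts = {
    "marksmanship_fundamentals",
    "aiming",
    "breath_control",
    "trigger_control",
}

# (subject, relation, object) triples
relations = [
    ("marksmanship_fundamentals", "has_part", "aiming"),
    ("marksmanship_fundamentals", "has_part", "breath_control"),
    ("marksmanship_fundamentals", "has_part", "trigger_control"),
    ("breath_control", "supports", "aiming"),
]

def related(concept, relation):
    """Return all concepts linked to `concept` by `relation`."""
    return [obj for subj, rel, obj in relations
            if subj == concept and rel == relation]

print(related("marksmanship_fundamentals", "has_part"))
# -> ['aiming', 'breath_control', 'trigger_control']
```

In a tool such as Protégé the same structure would be modeled as classes and properties; the triple form is simply the smallest way to show the idea.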

Slide 3: Bayesian Networks
Graphical modeling of the causal structure of a phenomenon in terms of nodes and relations
– Nodes represent states; links represent the influence relations
– Supports fusion of observable data (e.g., "correct on item 1") into high-level hypotheses (e.g., "understands breath control")
– Fielded technology
  - Development tools available (HUGIN, MSBNx)
  - ONR-funded research
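The "fusion of observable data into high-level hypotheses" can be illustrated with a single Bayes-rule update, a minimal stand-in for the HUGIN/MSBNx models the slide refers to; all probabilities below are assumptions chosen for the example, not estimates from the study.

```python
# Minimal sketch (pure Python): one observable node ("correct on item 1")
# informing one hypothesis node ("understands breath control").
# All probabilities are illustrative.

p_understands = 0.50                 # prior on the hypothesis node
p_correct_given_understands = 0.85   # allows a 0.15 "slip" rate
p_correct_given_not = 0.25           # allows a 0.25 "guess" rate

def posterior_given_correct(prior, p_hit, p_guess):
    """Bayes rule: P(understands | correct on item 1)."""
    p_correct = p_hit * prior + p_guess * (1.0 - prior)
    return p_hit * prior / p_correct

print(posterior_given_correct(p_understands,
                              p_correct_given_understands,
                              p_correct_given_not))
# -> 0.7727..., i.e., belief in the hypothesis rises after a correct response
```

A real network chains many such nodes together, so evidence on several items propagates to several concept-level hypotheses at once.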

Slide 4: Assessment Application Example
Content recommendation
– Deliver individualized instructional content based on assessment results
Approach
– Use a domain ontology to represent content
– Use assessments to measure students' knowledge of the domain
– Use a Bayesian network to model knowledge dependencies

Slide 5: Rifle Marksmanship Ontology
Capture hierarchical structure of a domain
– Field manuals, doctrine, training videos
– Bind content to structure (text, video, graphics)
Capture conceptual representation
– Experts (coaches, snipers, rifle team)
– Upper-level ontology captured using knowledge maps

Slide 6: Hierarchical Representation
– Based on corpus of marksmanship literature and doctrine
– Currently 168 concepts (classes)
– Content directly bound to each node
  - Important if you want to make use of the information

Slide 7: Binding Content to Structure
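As a rough sketch of what binding content to structure could look like in data terms (field names and file paths here are hypothetical, not the project's data model):

```python
# Illustrative only: one way to bind instructional content (text, video,
# graphics) directly to each ontology node.

from dataclasses import dataclass, field
from typing import List, Optional

@dataclass
class ContentItem:
    media_type: str   # "text", "video", or "graphic"
    uri: str          # location of the bound resource

@dataclass
class OntologyNode:
    name: str
    parent: Optional[str] = None                        # hierarchical link
    content: List[ContentItem] = field(default_factory=list)

nodes = {
    "breath_control": OntologyNode(
        name="breath_control",
        parent="marksmanship_fundamentals",
        content=[
            ContentItem("text", "content/breath_control.html"),
            ContentItem("video", "content/breath_control_demo.mp4"),
        ],
    ),
}

def content_for(concept: str) -> List[ContentItem]:
    """Because content hangs off each node, a recommender that flags a
    concept can immediately retrieve something to serve for it."""
    node = nodes.get(concept)
    return node.content if node else []

print([c.uri for c in content_for("breath_control")])
```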

Slide 8: Application of Ontology
Marksmanship ontology serves as testbed to evaluate feasibility of approach
– Pilot test of approach: 2nd Lts. undergoing entry-level marksmanship training
– Design: individualized content recommendation vs. control (no recommendation)
– Examine shooting outcome, learning outcomes, changes in BN due to instruction, Marines' perceptions of learning

Slide 9: Linking Assessment and Instruction
Approach
– Depict knowledge dependencies among marksmanship concepts using a Bayesian network
– Administer assessment to gather information on Marines' understanding of rifle marksmanship
– Take assessment results (item-level data) and update the BN
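A hedged sketch of the update step: item-level results are treated as evidence and used to revise belief in each concept. A fielded implementation would post this evidence to the full Bayesian network; the simplified version below updates each concept independently, so it ignores the knowledge dependencies the real network captures. Item-to-concept mappings and probabilities are illustrative.

```python
# Which concept each item measures, plus illustrative hit/guess probabilities.
ITEM_MODEL = {
    "item_01": {"concept": "breath_control",  "p_hit": 0.85, "p_guess": 0.25},
    "item_02": {"concept": "breath_control",  "p_hit": 0.80, "p_guess": 0.30},
    "item_03": {"concept": "trigger_control", "p_hit": 0.90, "p_guess": 0.20},
}

def update(prior, p_hit, p_guess, correct):
    """One Bayes step: P(knows concept | scored response to one item)."""
    if correct:
        num = p_hit * prior
        den = p_hit * prior + p_guess * (1 - prior)
    else:
        num = (1 - p_hit) * prior
        den = (1 - p_hit) * prior + (1 - p_guess) * (1 - prior)
    return num / den

def score_marine(responses, prior=0.5):
    """responses: {item_id: 1 or 0}. Returns posterior belief per concept."""
    beliefs = {}
    for item_id, correct in responses.items():
        spec = ITEM_MODEL[item_id]
        concept = spec["concept"]
        beliefs[concept] = update(beliefs.get(concept, prior),
                                  spec["p_hit"], spec["p_guess"], correct)
    return beliefs

print(score_marine({"item_01": 1, "item_02": 0, "item_03": 1}))
```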

Slide 10: Linking Assessment and Instruction
Approach (continued)
– Identify concepts that have low probabilities in the BN (interpreted as poor understanding)
– Make use of the cognitive demands of tasks and items to infer the depth of a Marine's understanding
– Deliver different content based on depth of understanding
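One possible reading of the recommendation rule, with an assumed probability cutoff and a hypothetical mapping from concept and depth of understanding to content:

```python
# Illustrative recommendation rule: the threshold and the depth-to-content
# mapping are assumptions for the sketch, not values from the study.

LOW_UNDERSTANDING = 0.40   # assumed cutoff for "poor understanding"

CONTENT_BY_DEPTH = {
    ("breath_control", "basic"):   "content/breath_control_intro.html",
    ("breath_control", "applied"): "content/breath_control_drills.html",
}

def recommend(beliefs, depth_by_concept):
    """beliefs: {concept: posterior}; depth_by_concept: {concept: 'basic'|'applied'}."""
    recs = []
    for concept, p in beliefs.items():
        if p < LOW_UNDERSTANDING:
            depth = depth_by_concept.get(concept, "basic")
            recs.append(CONTENT_BY_DEPTH.get((concept, depth)))
    return [r for r in recs if r is not None]

print(recommend({"breath_control": 0.32, "trigger_control": 0.81},
                {"breath_control": "basic"}))
# -> ['content/breath_control_intro.html']
```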

Slide 11: Content Recommendation

Slide 12: Example of Feedback

Slide 13: Preliminary Results
Within-group analyses
– BN probabilities increased for concepts that had instructional content served up
– BN probabilities did not change for concepts that did not have instructional content
– High-level BN topics correlated with the measure each was derived from, as well as with a reasoning measure
– BN "scores" corresponded with Marines' self-ratings of their level of knowledge (80% agreement)

Slide 14: Preliminary Results
Between-group analyses inconclusive
– Small sample size (n = 16)
– Experimental-condition Marines
  - Qualified in a thunderstorm
  - Learned more from classroom training than expected (i.e., > 70% of topics "correct")
  - Knowledge map scores appear to be increasing at a faster rate than the control group's, but the differences are not statistically significant

Slide 15: Summary
– An important opportunity of online assessments is the potential to measure many aspects of human behavior under a variety of conditions
– An important challenge is extracting meaningful information from (potentially) voluminous amounts of data
– Bayesian networks and ontologies may be one approach to addressing this challenge