Knowledge Engineering Week 3 Video 5

Knowledge Engineering  Where your model is created by a smart human being, rather than an exhaustive computer

Knowledge Engineering  Also called  Rational modeling  Cognitive modeling

Knowledge Engineering at its best
 Knowledge engineering is the art of a human being
   Becoming deeply familiar with the target construct
   Carefully studying the data, including possibly process data (such as think-alouds)
   Understanding the relevant theory and how it applies
   Thoughtfully crafting an excellent model

Knowledge Engineering at its best
 Achieves higher construct validity than data mining
 Achieves comparable performance in data
 And may even transfer better to new data
   I have not actually seen good proof of this, but it seems possible
   Especially if a knowledge-engineered model could get at aspects of the construct that are general, while the data-mined models were limited to finding features specific to a given population or system

Example of excellent knowledge engineering
 Aleven et al.'s (2004, 2006) help-seeking model

A prescriptive model of good help-seeking behavior in an online tutor

With a taxonomy of errors in student help-seeking
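To make the idea concrete, here is a minimal, hypothetical sketch of what a rule-based help-seeking taxonomy might look like in code. The category names, thresholds, and inputs are invented for illustration; they are not Aleven et al.'s published model.

```python
# Hypothetical sketch of a rule-based help-seeking taxonomy, in the spirit of
# a prescriptive help-seeking model. All names and thresholds are illustrative.

def classify_help_seeking(skill_estimate, seconds_before_action, action):
    """Return a help-seeking category for a single student action."""
    if action == "hint":
        if skill_estimate > 0.8:
            return "help abuse"            # asking for hints on a well-known skill
        if seconds_before_action < 2:
            return "hint spam"             # too fast to have read the step
        return "appropriate help use"
    if action == "attempt":
        if skill_estimate < 0.3 and seconds_before_action < 2:
            return "try-step abuse"        # rapid guessing instead of seeking help
        return "appropriate attempt"
    return "other"

print(classify_help_seeking(0.9, 10, "hint"))  # "help abuse"
```

The point is that every branch encodes a human judgment about what good help-seeking looks like, rather than a pattern mined from data.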

Developed based on
 Thorough study of dozens of scientific articles
 Years of experience in designing online learning environments
 Intensive study of log files of student interaction with the learning system
 Plus experience watching kids use educational software in real classrooms

Resultant models
 Predictive of student learning (Aleven et al., 2004, 2006) and preparation for future learning (Baker et al., 2011)
 Specific aspects of the model correlate with data-mined detectors of the same constructs, and improve data-mined models if added to them (Roll et al., 2005)

Knowledge Engineering at its worst
 Knowledge engineering (and the other terms) is sometimes used to refer to
   Someone making up a simple model very quickly
   And then calling the resultant construct by a well-known name
   And not testing it on data in any way
   And asserting that their model is the construct, despite having no evidence

Knowledge Engineering at its worst
 Achieves poorer construct validity than data mining
 Predicts desired constructs poorly, sometimes even worse than chance
   Due to over-simplifying a complex construct
   Or even failing to match it
 Can slow scientific progress by introducing false results
 Can hurt student outcomes by intervening at the wrong times

How can you tell if knowledge engineering is bad?
 If a data mining model is bad, it's usually relatively easy to identify, from the features, the validation procedure, or the goodness metrics
 Telling top-notch knowledge engineering from junk is a little harder
   The hard work is in the researcher's brain, and the process is usually invisible
 But… look for very simple models of complex constructs

Whether You Use Knowledge Engineering or Data Mining…
 You should be testing your models on data in some fashion
 Even if you can't get a direct measure (training labels)
 You can usually get some kind of indirect measure (predicting student learning, for example)
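As a sketch of such indirect testing: with no action-level labels, one can still check whether a detector's output relates to an outcome it ought to predict, such as learning gains. The per-student numbers below are invented for illustration.

```python
# Sketch of indirect validation: a (hypothetical) help-seeking-error detector
# has no ground-truth labels, but if it measures what it claims to, students
# it flags more often should tend to learn less. All data here is invented.

def pearson_r(xs, ys):
    """Pearson correlation between two equal-length sequences."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy)

# Per student: proportion of actions the model flags as help-seeking errors,
# and that student's pre-test to post-test learning gain.
error_rate    = [0.05, 0.40, 0.10, 0.55, 0.20]
learning_gain = [0.30, 0.05, 0.25, 0.02, 0.18]

r = pearson_r(error_rate, learning_gain)
print(round(r, 2))  # -0.98: more flagged errors, lower learning gains
```

A strong negative correlation is not proof the detector is right, but it is the kind of indirect evidence the slide is asking for.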

It’s not an either-or…
 Feature engineering is very closely related to knowledge engineering
 Careful study of a construct will lead to better features and ultimately to better models

It’s not an either-or…
 Using knowledge-engineered models as features in data mining models can be a powerful tool
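A minimal sketch of that pattern, with entirely hypothetical feature names: the output of a hand-built rule is appended to each row of the feature matrix that a data-mined classifier would then be trained on.

```python
# Sketch: feed a knowledge-engineered model's output into a data-mined model
# as one more feature. Feature names and the rule itself are hypothetical.

def ke_gaming_score(row):
    """Hand-built rule: very fast, repeated wrong answers look like gaming."""
    return 1.0 if row["response_time"] < 1.0 and row["wrong_in_a_row"] >= 3 else 0.0

rows = [
    {"response_time": 0.6, "wrong_in_a_row": 4, "hints": 0},
    {"response_time": 5.2, "wrong_in_a_row": 0, "hints": 2},
]

# Feature vectors for a downstream classifier: raw features + the KE output.
X = [[r["response_time"], r["wrong_in_a_row"], r["hints"], ke_gaming_score(r)]
     for r in rows]
print(X)  # [[0.6, 4, 0, 1.0], [5.2, 0, 2, 0.0]]
```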

It’s not an either-or…
 Knowledge-engineered models sometimes depend on cut-offs or numerical parameters that are hard to determine rationally
 These parameters can be empirically fit using data
   Some variants of Aleven et al.’s model do this
   As does Bayesian Knowledge Tracing (next week)
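For instance, a cut-off that is hard to justify rationally can be fit by a simple grid search over a small labeled sample. The rule, candidate thresholds, and data below are invented for illustration.

```python
# Sketch: empirically fit the cut-off in a knowledge-engineered rule
# ("flag actions faster than T seconds as guesses") against human-coded
# labels. Times, labels, and candidate thresholds are invented.

response_times = [0.5, 0.8, 1.2, 2.5, 3.0, 4.1, 6.0, 7.5]
is_guess       = [1,   1,   1,   1,   0,   0,   0,   0  ]  # human-coded labels

def accuracy(threshold):
    """Agreement between the rule (time < threshold) and the labels."""
    preds = [1 if t < threshold else 0 for t in response_times]
    return sum(p == y for p, y in zip(preds, is_guess)) / len(is_guess)

candidates = [0.5, 1.0, 1.5, 2.0, 3.0, 5.0]
best = max(candidates, key=accuracy)
print(best, accuracy(best))  # 3.0 1.0
```

The rule's form still comes from human judgment; only the one free parameter is chosen from data.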

Next Up  Week 4 – knowledge modeling