
1 Knowledge Engineering for Bayesian Networks
Ann Nicholson, School of Computer Science and Software Engineering, Monash University

2 Overview
• Representing uncertainty
• Introduction to Bayesian networks
  » Syntax, semantics, examples
• The knowledge engineering process
• Case studies
  » Seabreeze prediction
  » Intelligent tutoring
• Open research questions

3 Sources of Uncertainty
• Ignorance
• Inexact observations
• Non-determinism
• AI representations
  » Probability theory
  » Dempster-Shafer theory
  » Fuzzy logic

4 Probability theory for representing uncertainty
• Assigns a numerical degree of belief between 0 and 1 to facts
  » e.g. “it will rain today” is T/F
• Prior probability (unconditional)
  » P(“it will rain today”) = 0.2
• Posterior probability (conditional)
  » P(“it will rain today” | “rain is forecast”) = 0.8
• Bayes’ Rule: P(H|E) = P(E|H) × P(H) / P(E)
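A minimal Python sketch of Bayes’ Rule as stated above. The likelihood P(E|H) = 0.9 and evidence probability P(E) = 0.225 are made-up values (not from the slide), chosen so the posterior comes out at the slide’s 0.8:

```python
def bayes_posterior(prior_h, likelihood_e_given_h, prob_e):
    """Bayes' Rule: P(H|E) = P(E|H) * P(H) / P(E)."""
    return likelihood_e_given_h * prior_h / prob_e

# P(rain) = 0.2 from the slide; the other two numbers are illustrative assumptions.
posterior = bayes_posterior(prior_h=0.2, likelihood_e_given_h=0.9, prob_e=0.225)
print(posterior)  # 0.8, matching the slide's P("it will rain today" | "rain is forecast")
```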

5 Bayesian networks
• Directed acyclic graphs
• Nodes: random variables
  » R: “it is raining”, discrete values T/F
  » T: temperature, continuous or discrete variable
  » C: colour, discrete values {red, blue, green}
• Arcs indicate dependencies (can have a causal interpretation)

6 Bayesian networks
• Conditional Probability Distribution (CPD)
  » associated with each variable
  » probability of each state given parent states
• Example network: Flu → Te → Th
  » Flu: “Jane has the flu”
  » Te: “Jane has a high temp” (the Flu → Te arc models the causal relationship)
  » Th: “Thermometer temp reading” (the Te → Th arc models possible sensor error)
• Example parameters:
  » P(Flu=T) = 0.05
  » P(Te=High | Flu=T) = 0.4
  » P(Te=High | Flu=F) = 0.01
  » P(Th=High | Te=High) = 0.95
  » P(Th=High | Te=Low) = 0.1
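For concreteness, here is a sketch of this three-node network and its CPTs built with the pgmpy library. This is an assumption about tooling (pgmpy is not mentioned in the slides), and the class name varies across pgmpy versions (older releases use BayesianModel, newer ones DiscreteBayesianNetwork):

```python
from pgmpy.models import BayesianNetwork          # may be BayesianModel in older pgmpy
from pgmpy.factors.discrete import TabularCPD

# Structure: Flu -> Te -> Th
model = BayesianNetwork([("Flu", "Te"), ("Te", "Th")])

# CPTs copied from the slide; rows are the child's states, columns follow the parent's states.
cpd_flu = TabularCPD("Flu", 2, [[0.95], [0.05]], state_names={"Flu": ["F", "T"]})
cpd_te = TabularCPD("Te", 2, [[0.99, 0.6], [0.01, 0.4]],
                    evidence=["Flu"], evidence_card=[2],
                    state_names={"Te": ["Low", "High"], "Flu": ["F", "T"]})
cpd_th = TabularCPD("Th", 2, [[0.9, 0.05], [0.1, 0.95]],
                    evidence=["Te"], evidence_card=[2],
                    state_names={"Th": ["Low", "High"], "Te": ["Low", "High"]})

model.add_cpds(cpd_flu, cpd_te, cpd_th)
assert model.check_model()  # verifies each CPT sums to 1 and matches the structure
```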

7 BN inference
• Evidence: observation of a specific state
• Task: compute the posterior probabilities for query node(s) given evidence
• [Figure: four inference patterns on the Flu network — diagnostic, causal, intercausal, and mixed inference]
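A self-contained sketch of diagnostic inference on the Flu → Te → Th example, done by brute-force enumeration so the mechanics are visible (real BN engines use more efficient algorithms such as variable elimination):

```python
import itertools

# CPTs from the slide (Flu: T/F; Te, Th: High/Low)
p_flu = {"T": 0.05, "F": 0.95}
p_te_given_flu = {"T": {"High": 0.4, "Low": 0.6}, "F": {"High": 0.01, "Low": 0.99}}
p_th_given_te = {"High": {"High": 0.95, "Low": 0.05}, "Low": {"High": 0.1, "Low": 0.9}}

def joint(flu, te, th):
    """The joint factorises along the chain: P(Flu, Te, Th) = P(Flu) P(Te|Flu) P(Th|Te)."""
    return p_flu[flu] * p_te_given_flu[flu][te] * p_th_given_te[te][th]

# Diagnostic query: P(Flu=T | Th=High), summing out the hidden node Te
numerator = sum(joint("T", te, "High") for te in ("High", "Low"))
denominator = sum(joint(flu, te, "High")
                  for flu, te in itertools.product(("T", "F"), ("High", "Low")))
print(numerator / denominator)  # ~0.176: a high reading raises P(flu) from the 0.05 prior
```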

8 BN software
• Commercial packages: Netica, Hugin, Analytica (all with demo versions)
• Free software: Smile, Genie, JavaBayes, …
  http://HTTP.CS.Berkeley.EDU/~murphyk/Bayes/bnsoft.html
• Examples

9 Decision networks
• Extension to the basic BN for decision making
  » Decision nodes
  » Utility nodes
• EU(Action) = Σ_o P(o | Action, E) × U(o)
  » choose the action with the highest expected utility
• Example
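A small sketch of the maximum-expected-utility rule above. The action/outcome table, probabilities and utilities are made up purely for illustration; they are not from the slides:

```python
# Hypothetical decision problem: P(outcome | action, evidence) and utilities U(action, outcome)
outcome_probs = {
    "take_umbrella":  {"rain": 0.2, "no_rain": 0.8},
    "leave_umbrella": {"rain": 0.2, "no_rain": 0.8},
}
utility = {
    ("take_umbrella", "rain"): 70, ("take_umbrella", "no_rain"): 90,
    ("leave_umbrella", "rain"): 0, ("leave_umbrella", "no_rain"): 100,
}

def expected_utility(action):
    """EU(action) = sum over outcomes o of P(o | action, E) * U(action, o)."""
    return sum(p * utility[(action, o)] for o, p in outcome_probs[action].items())

best = max(outcome_probs, key=expected_utility)
print({a: expected_utility(a) for a in outcome_probs}, "->", best)
# take_umbrella: 0.2*70 + 0.8*90 = 86; leave_umbrella: 0.2*0 + 0.8*100 = 80 -> take_umbrella
```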

10 Elicitation from experts
• Variables
  » important variables? values/states?
• Structure
  » causal relationships?
  » dependencies/independencies?
• Parameters (probabilities)
  » quantify relationships and interactions?
• Preferences (utilities)

11 Expert Elicitation Process
• These stages are done iteratively
• Stops when further expert input is no longer cost-effective
• The process is difficult and time-consuming
• Current BN tools
  » inference engine
  » GUI
• Next generation of BN tools?
• [Diagram: iterative elicitation loop between the domain expert, the BN expert, and the BN tools]

12 Knowledge discovery
• There is much interest in automated methods for learning BNs from data
  » parameters, structure (causal discovery)
• Computationally complex problem, so current methods have practical limitations
  » e.g. limit the number of states, require variable-ordering constraints, do not specify all arc directions
• Evaluation methods
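The parameter-learning half of knowledge discovery can be illustrated with a minimal maximum-likelihood counting sketch for a single CPT. The eight training records below are made up for the example:

```python
from collections import Counter

# Made-up training records for the Flu -> Te fragment: (Flu, Te) pairs
records = [("T", "High"), ("T", "Low"), ("F", "Low"), ("F", "Low"),
           ("F", "High"), ("T", "High"), ("F", "Low"), ("F", "Low")]

# Maximum-likelihood estimate of P(Te | Flu): divide joint counts by parent counts
joint_counts = Counter(records)
parent_counts = Counter(flu for flu, _ in records)

cpt = {flu: {te: joint_counts[(flu, te)] / parent_counts[flu]
             for te in ("High", "Low")}
       for flu in ("T", "F")}
print(cpt)  # e.g. P(Te=High | Flu=T) = 2/3 from these eight toy records
```

Learning structure (causal discovery) is the harder half, which is where programs such as CaMML, mentioned later, come in.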

13 The knowledge engineering process
1. Building the BN
   » variables, structure, parameters, preferences
   » combination of expert elicitation and knowledge discovery
2. Validation/Evaluation
   » case-based evaluation, sensitivity analysis, accuracy testing
3. Field Testing
   » alpha/beta testing, acceptance testing
4. Industrial Use
   » collection of statistics
5. Refinement
   » updating procedures, regression testing
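One of the evaluation activities listed above, sensitivity analysis, can be sketched on the Flu → Te → Th example: sweep a single elicited parameter and watch how a diagnostic query moves. The sweep range is arbitrary and chosen only for illustration:

```python
def posterior_flu_given_high_reading(p_te_high_given_flu_true):
    """P(Flu=T | Th=High) for the Flu -> Te -> Th chain, as a function of one CPT entry."""
    p_flu = 0.05
    p_th_high = {"High": 0.95, "Low": 0.1}                 # P(Th=High | Te)
    p_te_high = {"T": p_te_high_given_flu_true, "F": 0.01}  # P(Te=High | Flu)

    def p_th_high_given_flu(flu):
        return (p_te_high[flu] * p_th_high["High"]
                + (1 - p_te_high[flu]) * p_th_high["Low"])

    num = p_flu * p_th_high_given_flu("T")
    den = num + (1 - p_flu) * p_th_high_given_flu("F")
    return num / den

# Sensitivity analysis: vary P(Te=High | Flu=T) around the elicited 0.4 and observe the query
for value in (0.2, 0.4, 0.6, 0.8):
    print(value, round(posterior_flu_given_high_reading(value), 3))
```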

14 Case Study: Intelligent tutoring
• Tutoring domain: primary and secondary school students’ misconceptions about decimals
• Based on the Decimal Comparison Test (DCT)
  » the student is asked to choose the larger of a pair of decimals
  » different types of pairs reveal different misconceptions
• The ITS involves computer games about decimals
• This research also looks at a combination of expert elicitation and automated methods

15 Expert classification of Decimal Comparison Test (DCT) results

16 The ITS architecture
[Architecture diagram]
• Inputs (all optional): decimal comparison test, classroom diagnostic test results, information about the student (e.g. age)
• Computer games: Hidden Number, Flying Photographer, Decimaliens, Number Between, …
• Adaptive Bayesian network (generic BN model of the student): diagnose misconception, predict outcomes, identify most useful information
• System controller module (sequencing tactics): select next item type, decide to present help, decide change to new game, identify when expertise gained
• Outputs: items, help and feedback to the student; report on the student and classroom teaching activities to the teacher

17 Expert Elicitation
• Variables
  » two classification nodes: fine and coarse (mutually exclusive)
  » item types: (i) H/M/L (ii) 0-N
• Structure
  » arcs from classification to item type
  » item types independent given classification
• Parameters
  » careless mistake (3 different values)
  » expert ignorance: “-” in table (uniform distribution)
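The elicited structure is essentially naive-Bayes-like: a classification node is the parent of each item-type node, and item types are conditionally independent given the classification, with a “careless mistake” probability governing slips. The sketch below shows how such CPT entries might be generated; the misconception names, item types and numbers are hypothetical, not the actual elicited values:

```python
# Hypothetical misconception classes and item types (not the elicited ones)
misconceptions = ["expert", "longer_is_larger", "shorter_is_larger"]
item_types = ["type_1", "type_2"]

# Careless-mistake idea: a student who "should" answer an item type correctly under their
# misconception still slips with probability p_careless; otherwise behaviour is deterministic.
p_careless = 0.05

# expected_correct[m][t]: would a student with misconception m normally get item type t right?
expected_correct = {
    "expert":            {"type_1": True,  "type_2": True},
    "longer_is_larger":  {"type_1": False, "type_2": True},
    "shorter_is_larger": {"type_1": True,  "type_2": False},
}

# P(item type answered correctly | misconception), one CPT column per misconception class
cpt = {t: {m: (1 - p_careless) if expected_correct[m][t] else p_careless
           for m in misconceptions}
       for t in item_types}
print(cpt)
```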

18 Expert Elicited BN

19 Evaluation process
• Case-based evaluation
  » experts checked individual cases
  » sometimes, if the prior was low, the ‘true’ classification did not have the highest posterior (but usually had the biggest change in ratio)
• Adaptiveness evaluation
  » how the priors change after each set of evidence
• Comparison evaluation
  » differences in classification between the BN and the expert rule
  » differences in predictions between different BNs

20 Comparison evaluation
• Development of a measure: same classification, desirable and undesirable re-classification
• Use of item-type predictions
• Investigation of the effect of item-type granularity and the probability of a careless mistake

21 Investigation by automated methods
• Classification (using the SNOB program, based on MML)
• Parameters
• Structure (using CaMML)

22 Results

23 Case Study: Seabreeze prediction
• 2000 Honours project, joint with the Bureau of Meteorology (PAKDD’2001 paper, TR)
• BN built based on an existing simple expert rule
• Several years of data available for Sydney seabreezes
• CaMML and Tetrad-II programs used to learn BNs from data
• Comparative analysis showed the automated methods gave improved predictions

24 Open Research Questions
• Tools needed to support expert elicitation
• Combining expert elicitation and automated methods
• Evaluation measures and methods
• Industry adoption of BN technology

