BNJ 2.03α Beginner Developer Tutorial
Chris H. Meyer (revised by William H. Hsu)
Kansas State University KDD Laboratory


Contents
- Introduction
- Inference Tutorial
- Learning Tutorial
- Coding the Wizards

BNJ 2.0α Tools
- Offers many new tools in GUI form
- This lecture focuses on the Inference and Learning Wizards
- We will also look at components such as evidence, CPTs (conditional probability tables), and the algorithms behind learning

Section: Inference Tutorial

Starting the Inference Wizard (1)
Select Tools → Inference Wizard

Starting the Inference Wizard (2)
- Load an existing network file, or use the network currently open in the GUI
- You may also choose to load an evidence file

Using the Inference Wizard (1)
Exact inference methods:
- LS / Junction Tree
- Variable Elimination (elimbel)
- Loop Cutset Conditioning
- Pearl’s Propagation (tree-structured networks only)

Using the Inference Wizard (2)
The Lauritzen-Spiegelhalter (L-S) algorithm has two main steps:
- Build a tree of cliques (the junction tree) from the Bayesian network
- Compute the probability of each clique, then form single-node marginals from the clique probabilities

Using the Inference Wizard (3)
[Figure: example of cliques in the L-S algorithm; courtesy of Haipeng Guo]
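To make the first step concrete, here is a minimal, self-contained sketch of moralization, the opening operation in junction tree construction: edge directions are dropped, and every pair of parents sharing a child is connected ("married"). This is an illustration, not BNJ source; the adjacency-map representation is assumed.

    import java.util.*;

    // Sketch of moralization: assumes every node appears as a key in the
    // parents map (root nodes map to an empty list).
    public class Moralize {
        public static Map<String, Set<String>> moralize(Map<String, List<String>> parents) {
            Map<String, Set<String>> moral = new HashMap<>();
            for (String node : parents.keySet()) moral.put(node, new HashSet<>());
            for (Map.Entry<String, List<String>> e : parents.entrySet()) {
                String child = e.getKey();
                List<String> ps = e.getValue();
                // Drop edge directions: connect each parent to its child.
                for (String p : ps) {
                    moral.get(child).add(p);
                    moral.get(p).add(child);
                }
                // Marry the parents: connect every pair of parents of this child.
                for (int i = 0; i < ps.size(); i++)
                    for (int j = i + 1; j < ps.size(); j++) {
                        moral.get(ps.get(i)).add(ps.get(j));
                        moral.get(ps.get(j)).add(ps.get(i));
                    }
            }
            return moral;
        }

        public static void main(String[] args) {
            // Toy network A -> C <- B: moralization marries A and B.
            Map<String, List<String>> parents = new HashMap<>();
            parents.put("A", List.of());
            parents.put("B", List.of());
            parents.put("C", List.of("A", "B"));
            System.out.println(moralize(parents)); // A and B are now neighbors
        }
    }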

Using the Inference Wizard (4)
- Variable Elimination: works directly with factors (confactors) and the VE algorithm instead of building a tree
- Loop Cutset Conditioning: finds a minimum loop cutset of the network, conditions on instantiations of the cutset variables, and combines the resulting probabilities
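As a toy illustration of a single variable-elimination step (this is not BNJ's elimbel code; the binary factors below are assumed), the product of P(A) and P(B|A) is formed and A is summed out to obtain the marginal P(B):

    // One VE step on binary variables: multiply factors, then sum out A.
    public class SumOutExample {
        public static void main(String[] args) {
            double[] pA = {0.6, 0.4};                       // P(A=0), P(A=1)
            double[][] pBgivenA = {{0.9, 0.1}, {0.2, 0.8}}; // P(B|A), rows indexed by A
            double[] pB = new double[2];
            for (int b = 0; b < 2; b++)
                for (int a = 0; a < 2; a++)
                    pB[b] += pA[a] * pBgivenA[a][b];        // sum out A
            System.out.printf("P(B=0)=%.3f, P(B=1)=%.3f%n", pB[0], pB[1]); // 0.620, 0.380
        }
    }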

Using the Inference Wizard (5)
- Pearl’s Propagation: uses message passing; information from one vertex propagates to its neighbors, then to the neighbors’ neighbors, and so on
All of these algorithms are useful in the right circumstances, but a full explanation is beyond the scope of this lecture.
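The following schematic sketch shows only the propagation pattern just described, evidence spreading breadth-first from one vertex to its neighbors and onward; it omits Pearl's actual lambda/pi message computations:

    import java.util.*;

    // Skeleton of the propagation pattern: each visited vertex passes a
    // "message" to every not-yet-visited neighbor.
    public class Propagate {
        public static void propagate(Map<String, List<String>> neighbors, String source) {
            Deque<String> queue = new ArrayDeque<>(List.of(source));
            Set<String> visited = new HashSet<>(List.of(source));
            while (!queue.isEmpty()) {
                String v = queue.poll();
                for (String n : neighbors.getOrDefault(v, List.of())) {
                    if (visited.add(n)) {
                        System.out.println("message: " + v + " -> " + n);
                        queue.add(n); // n will in turn message its own neighbors
                    }
                }
            }
        }

        public static void main(String[] args) {
            Map<String, List<String>> tree = Map.of(
                "A", List.of("B", "C"),
                "B", List.of("A"),
                "C", List.of("A", "D"),
                "D", List.of("C"));
            propagate(tree, "A"); // messages reach B and C, then D
        }
    }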

Using the Inference Wizard (6)
Approximate inference methods:
- Logic Sampling, Forward Sampling, Likelihood Weighting, Self-Importance Sampling, Adaptive-Importance Sampling, Pearl MCMC Method, Chavez MCMC Method
Again, each algorithm is useful in the right circumstances. Output your results to a chosen file on completion.
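As one concrete example from the list above, here is a minimal sketch of likelihood weighting on an assumed toy network A -> B with binary variables: the non-evidence node A is sampled from its prior, the evidence B=1 is clamped, and each sample is weighted by the likelihood P(B=1|A):

    import java.util.Random;

    public class LikelihoodWeighting {
        public static void main(String[] args) {
            double pA1 = 0.3;                     // P(A=1)
            double[] pB1givenA = {0.1, 0.7};      // P(B=1|A=0), P(B=1|A=1)
            Random rng = new Random(42);
            double wTotal = 0, wA1 = 0;
            for (int i = 0; i < 100_000; i++) {
                int a = rng.nextDouble() < pA1 ? 1 : 0;  // sample the non-evidence node
                double w = pB1givenA[a];                 // weight by evidence likelihood
                wTotal += w;
                if (a == 1) wA1 += w;
            }
            // Exact answer: 0.3*0.7 / (0.3*0.7 + 0.7*0.1) = 0.750
            System.out.printf("P(A=1|B=1) = %.3f (exact: 0.750)%n", wA1 / wTotal);
        }
    }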

Section: Learning Tutorial

Starting the Learning Wizard (1)
Select Tools → Learning Wizard

Starting the Learning Wizard (2)
- Load a local data file, or load from a database

Using the Learning Wizard (1)
Select your learning algorithm:
- K2
- Genetic Algorithm Wrapper for K2 (GAWK)
- Genetic Algorithm on Structure
- Greedy Structure Learning
- Standard Hill-Climbing
- Hill-Climbing with Adversarial Reweighting
- Hill-Climbing with Dirichlet Prior
- Simulated Annealing
- Stochastic Structural Learning
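For orientation, here is a structural skeleton of K2, the first algorithm in the list. The Bayesian score is left abstract (BNJ's real implementations live in edu.ksu.cis.bnj.bbn.learning.*): nodes are visited in a fixed ordering, and each node greedily adds the predecessor that most improves its score until no addition helps:

    import java.util.*;
    import java.util.function.BiFunction;

    public class K2Skeleton {
        // score.apply(node, parentSet) returns the node's score given those parents.
        public static Map<Integer, Set<Integer>> k2(int n, int maxParents,
                BiFunction<Integer, Set<Integer>, Double> score) {
            Map<Integer, Set<Integer>> parents = new HashMap<>();
            for (int i = 0; i < n; i++) {
                Set<Integer> pa = new HashSet<>();
                double best = score.apply(i, pa);
                boolean improved = true;
                while (improved && pa.size() < maxParents) {
                    improved = false;
                    int bestCand = -1;
                    for (int j = 0; j < i; j++) {   // candidates: earlier nodes in the ordering
                        if (pa.contains(j)) continue;
                        pa.add(j);
                        double s = score.apply(i, pa);
                        pa.remove(j);
                        if (s > best) { best = s; bestCand = j; improved = true; }
                    }
                    if (improved) pa.add(bestCand); // keep the best single addition
                }
                parents.put(i, pa);
            }
            return parents;
        }

        public static void main(String[] args) {
            // Dummy score that penalizes parents, so the result is an empty graph.
            System.out.println(k2(3, 2, (node, pa) -> -(double) pa.size()));
        }
    }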

Using the Learning Wizard (2)
- Depending on the task at hand, different learning algorithms will prove more effective
- Output results to a file with the desired variable ordering

Section: Coding the Wizards

Inference Wizard Coding (1)
GUI:
- The main GUI window is passed as the owner of InferenceWizard.java
- All buttons (JButtons, JRadioButtons, etc.) register the wizard as their listener, and events are handled in InferenceWizard.java’s actionPerformed(ActionEvent e) method
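A minimal sketch of this wiring pattern follows; it is a simplification, not the actual InferenceWizard source. The dialog implements ActionListener, every control registers it, and all events funnel into one actionPerformed method:

    import java.awt.event.ActionEvent;
    import java.awt.event.ActionListener;
    import javax.swing.*;

    public class WizardSketch extends JDialog implements ActionListener {
        private final JButton runButton = new JButton("Run");
        private final JRadioButton exactButton = new JRadioButton("Exact", true);

        public WizardSketch(JFrame owner) {
            super(owner, "Inference Wizard", true);  // main GUI window passed as owner
            runButton.addActionListener(this);       // every control registers the wizard
            exactButton.addActionListener(this);
            getContentPane().add(exactButton, java.awt.BorderLayout.NORTH);
            getContentPane().add(runButton, java.awt.BorderLayout.SOUTH);
            pack();
        }

        @Override
        public void actionPerformed(ActionEvent e) {
            if (e.getSource() == runButton) {
                // dispatch to the selected inference algorithm here
            }
        }
    }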

Inference Wizard Coding (2)
- Secondary windows are built using the BNJFileDialogFactory class
- GUI: roughly the first 500 lines of InferenceWizard.java
- Main logic (“brain”): the last 100 lines

Inference Wizard Coding (3)
Approximate inference is slightly more complicated.

Inference Wizard Coding (4)
- JTree and DefaultTreeModel (from javax.swing.tree.*) are used to display inference results
- InferenceResult (from edu.ksu.cis.bnj.bbn.inference.*) holds the computed values

Inference Wizard Coding (5)
- Inference.getMarginals() returns an InferenceResult, whose keys are used for enumeration
- InferenceWizard parses all nodes and adds them to a DefaultTreeModel; when parsing completes, the JTree is updated with this model
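The following sketch shows the parse-and-display step, with a plain Map standing in for the marginals held by the InferenceResult (the real iteration API may differ): one tree node per network node, one child per state probability, and the finished model installed on the JTree:

    import javax.swing.JTree;
    import javax.swing.tree.DefaultMutableTreeNode;
    import javax.swing.tree.DefaultTreeModel;
    import java.util.Map;

    public class ResultTreeSketch {
        public static JTree buildTree(Map<String, double[]> marginals) {
            DefaultMutableTreeNode root = new DefaultMutableTreeNode("Marginals");
            for (Map.Entry<String, double[]> e : marginals.entrySet()) {
                DefaultMutableTreeNode node = new DefaultMutableTreeNode(e.getKey());
                double[] p = e.getValue();
                for (int state = 0; state < p.length; state++)
                    node.add(new DefaultMutableTreeNode("state " + state + ": " + p[state]));
                root.add(node);
            }
            JTree tree = new JTree();
            tree.setModel(new DefaultTreeModel(root)); // update the JTree once parsing is done
            return tree;
        }
    }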

Inference Wizard Coding (6)
- A JPanel for the results is created once the model is complete
- See edu.ksu.cis.bnj.bbn.inference.* for explanations of the algorithms

Coding the Learning Wizard (1)
GUI:
- Almost exactly the same as InferenceWizard.java, except that database functionality has been implemented
- Currently supports Oracle, MySQL, and PostgreSQL

Coding the Learning Wizard (2)
- Except for the database methods connect() and disconnect(), the Learning Wizard is essentially a mirror of the Inference Wizard
- GUI: roughly the first 400 lines of LearningWizard.java
- Main logic (“brain”): the last 30 lines
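For reference, here is a sketch of what connect() and disconnect() plausibly wrap; the actual BNJ methods may differ, and the JDBC URL, user, and password below are placeholders (the URL shown is MySQL-style; Oracle and PostgreSQL use analogous URLs):

    import java.sql.Connection;
    import java.sql.DriverManager;
    import java.sql.SQLException;

    public class DbSketch {
        private Connection conn;

        public void connect() throws SQLException {
            conn = DriverManager.getConnection(
                    "jdbc:mysql://localhost:3306/bnjdata", "user", "password");
        }

        public void disconnect() throws SQLException {
            if (conn != null) conn.close();
        }
    }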

Coding the Learning Wizard (3)
- The key element of LearningWizard.java is the Learner, imported from edu.ksu.cis.bnj.bbn.learning.*
- The Learner is instantiated with the selected algorithm and the data as parameters, e.g.:
    learner = Learner.load(learningEngines[selectedAlgorithm][1], data);
  or
    learner = new PRMk2((Database) data, null);

Coding the Learning Wizard (4)
- The learned graph is returned via Learner.getGraph(), which yields a BBNGraph suitable for viewing
- See edu.ksu.cis.bnj.bbn.learning.* for explanations of the algorithms
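Putting the last two slides together, a sketch of the overall call sequence might look like the following. Only Learner.load() and getGraph() are taken from the slides; the import paths, the concrete type of the data parameter, and any step needed to actually run the search are assumptions:

    import edu.ksu.cis.bnj.bbn.BBNGraph;           // package path assumed
    import edu.ksu.cis.bnj.bbn.learning.Learner;

    public class LearnSketch {
        public static BBNGraph learnStructure(String engineClass, Object data) {
            // instantiate the selected algorithm with the data, as on slide (3)
            Learner learner = Learner.load(engineClass, data);
            // retrieve the learned network for viewing, as on slide (4)
            return learner.getGraph();
        }
    }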