Slide 1: Real-Time Bayesian Network Inference for Decision Support in Personnel Management: Report on Research Activities

William H. Hsu, Computing and Information Sciences
Haipeng Guo, Computing and Information Sciences
Shing I Chang, Industrial and Manufacturing Systems Engineering
Kansas State University
Project group: http://groups.yahoo.com/group/onr-mpp
This presentation: http://www.kddresearch.org/KSU/CIS/ONR-2002-Jun-04.ppt
Slide 2: Overview: High-Performance KDD Program at K-State

Knowledge Discovery in Databases (KDD)
– Towards scalable data mining
– Applications of KDD: learning and reasoning
Building Causal Models for Decision Support
Time Series and Model Integration
– Prognostic (prediction and monitoring) applications
– Crisis monitoring and simulation
– Anomaly, intrusion, and fraud detection
– Web log analysis
– Applying high-performance neural, genetic, and Bayesian computation
Information Retrieval: Document Categorization, Text Mining
– Business intelligence applications (e.g., patents)
– "Web mining": dynamic indexing and document analysis
Slide 3: High-Performance Database Mining and KDD: Current Research Programs at K-State

Laboratory for Knowledge Discovery in Databases (KDD)
– Research emphases: machine learning, reasoning under uncertainty
– Applications:
    Decision support
    Digital libraries and information retrieval
    Remote sensing, robot vision and control
    Human-Computer Interaction (HCI), e.g., simulation-based training
    Computational science and engineering (CSE)
Curriculum and Research Development
– Real-time automated reasoning (inference)
– Machine learning
– Probabilistic models for multi-objective optimization
– Intelligent displays: visualization of diagrammatic models
– Knowledge-based expert systems, data modeling for KDD
Slide 4: Stages of Data Mining and Knowledge Discovery in Databases
Slide 5: Visual Programming: Java-Based Software Development Platform D2K

© 2002 National Center for Supercomputing Applications (NCSA). Used with permission.
Slide 6: Bayesian Belief Networks (BBNs): Definition

Conditional Independence
– X is conditionally independent (CI) of Y given Z (sometimes written X ⊥ Y | Z) iff P(X | Y, Z) = P(X | Z) for all values of X, Y, and Z
– Example: P(Thunder | Rain, Lightning) = P(Thunder | Lightning), i.e., T ⊥ R | L
Bayesian Network
– Directed graph model of conditional dependence assertions (or CI assumptions)
– Vertices (nodes): denote events (each a random variable)
– Edges (arcs, links): denote conditional dependencies
General Product (Chain) Rule for BBNs
– P(X_1, X_2, …, X_n) = ∏_{i=1..n} P(X_i | parents(X_i))
Example ("Sprinkler" BBN; a code sketch follows below)
– Nodes: X_1 Season (Spring, Summer, Fall, Winter); X_2 Sprinkler (On, Off); X_3 Rain (None, Drizzle, Steady, Downpour); X_4 Ground (Wet, Dry); X_5 Ground (Slippery, Not-Slippery)
– P(Summer, Off, Drizzle, Wet, Not-Slippery) = P(S) · P(O | S) · P(D | S) · P(W | O, D) · P(N | W)
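To make the chain-rule computation above concrete, here is a minimal Java sketch (illustrative only, not BNJ code; all CPT values are made-up placeholders) that evaluates the Sprinkler query by multiplying one CPT entry per variable:

    // Minimal sketch of the BBN chain rule on the Sprinkler example.
    // All probability values are illustrative placeholders, not estimates from data.
    public class SprinklerChainRule {
        public static void main(String[] args) {
            // CPT entries for the query P(Summer, Off, Drizzle, Wet, Not-Slippery)
            double pSummer        = 0.25;  // P(Season = Summer)
            double pOffGivenS     = 0.60;  // P(Sprinkler = Off | Summer)
            double pDrizzleGivenS = 0.10;  // P(Rain = Drizzle | Summer)
            double pWetGivenOD    = 0.80;  // P(Ground = Wet | Off, Drizzle)
            double pNotSlipGivenW = 0.30;  // P(Ground = Not-Slippery | Wet)

            // Chain rule: multiply each variable's CPT entry given its parents
            double joint = pSummer * pOffGivenS * pDrizzleGivenS
                         * pWetGivenOD * pNotSlipGivenW;
            System.out.println("P(Summer, Off, Drizzle, Wet, Not-Slippery) = " + joint);
        }
    }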
Slide 7: Bayesian Networks and Recommender Systems

Current Research
– Efficient BBN inference (parallel, multi-threaded Lauritzen-Spiegelhalter in D2K)
– Hybrid quantitative and qualitative inference ("simulation")
– Continuous variables and hybrid (discrete/continuous) BBNs
– Induction of hidden variables
– Local structure: localized constraints and assumptions, e.g., Noisy-OR BBNs
– Online learning:
    Incrementality (aka lifelong, situated, in vivo learning)
    Ability to change network structure during the inferential process
– Polytree structure learning (tree decomposition): alternatives to Chow-Liu
– Complexity of learning and inference in restricted classes of BBNs
Future Work
– Decision networks aka influence diagrams (BBN + utility)
– Anytime / real-time BBN inference for time-constrained decision support
– Some temporal models: Dynamic Bayesian Networks (DBNs)
Slide 8: Data Mining: Development Cycle

Model Identification
– Queries: classification, assignment
– Specification of data model
– Grouping of attributes by type
Prediction Objective Identification
– Assignment specification
– Identification of metrics
Reduction
– Refinement of data model
– Selection of relevant data (quantitative, qualitative)
Synthesis: New Attributes
Integration: Multiple Data Sources (e.g., Enlisted Master File, Surveys)
[Figure: Environment (Data Model) → Learning Element → Knowledge Base → Decision Support System]
Slide 9: Learning Bayesian Networks: Gradient Ascent Algorithm

Train-BN (D)
– Let w_ijk denote one entry in the CPT for variable Y_i in the network:
    w_ijk = P(Y_i = y_ij | parents(Y_i) = u_ik)
    e.g., if Y_i ≡ Campfire, then u_ik might be <Storm = True, BusTourGroup = True>
– WHILE termination condition not met DO  // perform gradient ascent
    Update all CPT entries w_ijk using training data D:
      w_ijk ← w_ijk + η · Σ_{d ∈ D} P_h(y_ij, u_ik | d) / w_ijk
    Renormalize the w_ijk to assure invariants: Σ_j w_ijk = 1 and 0 ≤ w_ijk ≤ 1
Applying Train-BN (one update step is sketched in code below)
– Learns CPT values
– Useful in case of known structure
– Key problems: learning structure from data, approximate inference
[Figure: example BBN with vertices Storm, BusTourGroup, Lightning, Campfire, Thunder, ForestFire]
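A hedged Java sketch of one Train-BN step, assuming the update rule above; the posterior P_h(y_ij, u_ik | d) is stubbed with a placeholder, since in practice it comes from running inference on the current network:

    import java.util.Random;

    // Sketch of one gradient-ascent step on a single CPT, following the
    // Train-BN update above. The posterior P_h(y_ij, u_ik | d) would come
    // from BN inference; here it is a stub so the step itself is concrete.
    public class CptGradientStep {
        static final Random RNG = new Random(42);

        // Stub for inference: P_h(Y_i = y_ij, U_i = u_ik | d). A real
        // implementation would query an inference engine per training case d.
        static double posterior(int j, int k, int d) {
            return RNG.nextDouble(); // placeholder only
        }

        public static void main(String[] args) {
            double eta = 0.01;              // learning rate
            int numCases = 100;             // |D|
            double[][] w = { {0.5, 0.5}, {0.3, 0.7} }; // w[k][j]: P(y_ij | u_ik)

            // Gradient update: w_ijk += eta * sum_d P_h(y_ij, u_ik | d) / w_ijk
            for (int k = 0; k < w.length; k++)
                for (int j = 0; j < w[k].length; j++) {
                    double grad = 0.0;
                    for (int d = 0; d < numCases; d++)
                        grad += posterior(j, k, d) / w[k][j];
                    w[k][j] += eta * grad;
                }

            // Renormalize so each row sums to 1 and entries stay in [0, 1]
            for (int k = 0; k < w.length; k++) {
                double sum = 0.0;
                for (double v : w[k]) sum += v;
                for (int j = 0; j < w[k].length; j++) w[k][j] /= sum;
            }
            System.out.println(java.util.Arrays.deepToString(w));
        }
    }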
Slide 10: General-Case BBN Structure Learning: Use Inference to Compute Scores

Recall: Bayesian Inference aka Bayesian Reasoning
– Assumption: the h ∈ H are mutually exclusive and exhaustive
– Optimal strategy: combine predictions of hypotheses in proportion to likelihood
    Compute the conditional probability of hypothesis h given observed data D
    i.e., compute the expectation over the unknown h for unseen cases:
    P(x | D) = Σ_{h ∈ H} P(x | h) · P(h | D)
    Let h ≡ (structure, parameters), where the parameters are the CPTs
Scores for Learning Structure: The Role of Inference
– Posterior score: P(h | D) ∝ P(D | h) · P(h)  (marginal likelihood × prior over structures)
– Marginal likelihood: P(D | h) = ∫ P(D | h, Θ) · P(Θ | h) dΘ  (likelihood × prior over parameters)
Slide 11: Learning Structure: K2 Algorithm and ALARM

Algorithm Learn-BBN-Structure-K2 (D, Max-Parents)  // a Java skeleton follows below
  FOR i ← 1 TO n DO                              // given a fixed ordering of variables {x_1, x_2, …, x_n}
    WHILE Parents[x_i].Size < Max-Parents DO     // greedily add the best candidate parent
      Best ← argmax_{j < i} Score(D, Parents[x_i] ∪ {x_j})   // maximize the Dirichlet (K2) score
      IF Score(D, Parents[x_i] ∪ {Best}) > Score(D, Parents[x_i]) THEN
        Parents[x_i] ← Parents[x_i] ∪ {Best}
      ELSE BREAK                                 // no candidate improves the score
  RETURN {Parents[x_i] | i ∈ {1, 2, …, n}}

A Logical Alarm Reduction Mechanism (ALARM) [Beinlich et al., 1989]
– BBN model for patient monitoring in surgical anesthesia
– Vertices (37): findings (e.g., esophageal intubation), intermediates, observables
– K2 found a BBN differing in only one edge from the gold standard (elicited from experts)
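The greedy loop can be sketched in Java as follows (a skeleton under stated assumptions, not BNJ's implementation; k2Score is a stand-in for the Cooper-Herskovits Dirichlet score, which in practice is computed from counts over D):

    import java.util.*;

    // Skeleton of the K2 greedy search over parent sets, given a variable ordering.
    public class K2Sketch {
        // Hypothetical stub: the real K2 score is the Dirichlet marginal
        // likelihood of x_i given a parent set, computed from counts in data.
        static double k2Score(int i, Set<Integer> parents, int[][] data) {
            return -parents.size(); // placeholder only
        }

        static List<Set<Integer>> learnStructure(int n, int maxParents, int[][] data) {
            List<Set<Integer>> parents = new ArrayList<>();
            for (int i = 0; i < n; i++) {
                Set<Integer> pi = new HashSet<>();
                double oldScore = k2Score(i, pi, data);
                boolean improved = true;
                while (improved && pi.size() < maxParents) {
                    improved = false;
                    int best = -1;
                    double bestScore = oldScore;
                    for (int j = 0; j < i; j++) {   // candidates precede x_i in the ordering
                        if (pi.contains(j)) continue;
                        pi.add(j);
                        double s = k2Score(i, pi, data);
                        pi.remove(j);
                        if (s > bestScore) { bestScore = s; best = j; }
                    }
                    if (best >= 0) { pi.add(best); oldScore = bestScore; improved = true; }
                }
                parents.add(pi);
            }
            return parents;
        }
    }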
Slide 12: Major Software Releases, FY 2002

Bayesian Network Tools in Java (BNJ)
– v1.0a released Wed 08 May 2002 to www.Sourceforge.net
– Key features:
    Standardized data format (XML)
    Existing algorithms: inference, structure learning, data generation
– Experimental results:
    Improved structure learning using K2, inference-based validation
    Adaptive importance sampling (AIS) inference competitive with the best published algorithms
Machine Learning in Java (MLJ)
– v1.0a released Fri 10 May 2002 to www.Sourceforge.net
– Key features: three (3) inductive learning algorithms from MLC++, two (2) inductive learning wrappers (1 from MLC++, 1 from the GA literature)
– Experimental results:
    Genetic wrappers for feature subset selection: Jenesis, MLJ-CHC
    Overfitting control in supervised inductive learning for classification
Slide 13: Bayesian Network Tools in Java (BNJ)

About BNJ
– v1.0a, 08 May 2002: 26000+ lines of Java code, GNU Public License (GPL)
– http://www.kddresearch.org/Groups/Probabilistic-Reasoning/BNJ
– Key features [Perry, Stilson, Guo, Hsu, 2002]:
    XML BN Interchange Format (XBN) converter serving 7 client formats (MSBN, Hugin, SPI, IDEAL, Ergo, TETRAD, Bayesware)
    Full exact inference: Lauritzen-Spiegelhalter (Hugin) algorithm
    Five (5) importance sampling algorithms: forward simulation (likelihood weighting) [Shachter and Peot, 1990], probabilistic logic sampling [Henrion, 1986], backward sampling [Fung and del Favero, 1995], self-importance sampling [Shachter and Peot, 1990], adaptive importance sampling [Cheng and Druzdzel, 2000] (a likelihood-weighting sketch follows below)
    Data generator
Published Research with Applications to Personnel Science
– Recent work:
    GA for improved structure learning: results in [HGPS02a; HGPS02b]
    Real-time inference framework: multifractal analysis [GH02b]
– Current work: prediction of migration trends (EMF); Sparse Candidate algorithm
– Planned continuation: (dynamic) decision networks; continuous BNs
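Of the five samplers listed above, forward simulation with likelihood weighting is the simplest to illustrate. The following sketch is not BNJ code; it uses a made-up two-node network Rain → Wet with placeholder CPTs to estimate P(Rain | Wet = true):

    import java.util.Random;

    // Minimal likelihood-weighting sketch on a two-node network Rain -> Wet,
    // with illustrative CPT values. Estimates P(Rain = true | Wet = true).
    public class LikelihoodWeighting {
        public static void main(String[] args) {
            Random rng = new Random(7);
            double pRain = 0.2;                     // P(Rain = true)
            double[] pWetGivenRain = {0.1, 0.9};    // P(Wet = true | Rain = false/true)

            double wRainTrue = 0.0, wTotal = 0.0;
            for (int s = 0; s < 100_000; s++) {
                // Sample the non-evidence variable from its prior
                boolean rain = rng.nextDouble() < pRain;
                // Weight the sample by the likelihood of the evidence Wet = true
                double weight = pWetGivenRain[rain ? 1 : 0];
                wTotal += weight;
                if (rain) wRainTrue += weight;
            }
            System.out.println("P(Rain | Wet = true) ≈ " + wRainTrue / wTotal);
        }
    }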
Slide 14: GA for BN Structure Learning [Hsu, Guo, Perry, Stilson, GECCO-2002]

Change of Representation and Inductive Bias Control
[Figure: genetic wrapper for representation change]
– [A] Genetic Algorithm: emits a candidate representation α, receives the representation fitness f(α), and returns an optimized representation
– [B] Representation Evaluator for Learning Problems: takes training data D (split into D_train for inductive learning and D_val for inference) and an inference specification, and computes f(α)
Slide 15: Model-Based Validation [Hsu, Guo, Perry, Stilson, GECCO-2002]

[Figure: [B] Representation Evaluator for Input Specifications]
– [i] Inductive learning (parameter estimation from training data D_train) produces a hypothesis h from a candidate input specification α
– [ii] Validation (measurement of inferential loss) runs inference with h on D_val under an evidence specification, yielding the specification fitness f(α)
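A minimal sketch of the wrapper's fitness evaluation as the two figures describe it (train on D_train, then validate by inference on D_val); all interfaces and names here are hypothetical, not BNJ or MLJ APIs:

    // Hypothetical sketch of the wrapper fitness f(alpha): step [i] estimates
    // a model under candidate specification alpha, step [ii] measures its
    // inferential loss on held-out data. Model and Evaluator are assumed types.
    interface Model {}

    interface Evaluator {
        Model train(boolean[] alpha, double[][] dTrain);   // [i] parameter estimation
        double inferenceLoss(Model h, double[][] dVal);    // [ii] validation by inference
    }

    public class WrapperFitness {
        // Higher fitness corresponds to lower inferential loss on D_val.
        static double fitness(boolean[] alpha, double[][] dTrain, double[][] dVal,
                              Evaluator eval) {
            Model h = eval.train(alpha, dTrain);
            return -eval.inferenceLoss(h, dVal);
        }
    }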
Slide 16: BNJ: Integrated Tool for Bayesian Network Learning and Inference

[Figure: XML Bayesian network learned from data using K2 in BNJ]
Slide 17: Machine Learning in Java (MLJ)

About MLJ
– v1.0a, 10 May 2002: 24000+ lines of Java code, GNU Public License (GPL)
– http://www.kddresearch.org/Groups/Machine-Learning/MLJ
– Key features [Hsu, Schmidt, Louis, 2002]:
    Conformant to the MLC++ input-output specification
    Three (3) inductive learning algorithms: ID3, C4.5, discrete Naïve Bayes
    Two (2) wrapper inducers: feature subset selection [Kohavi and John, 1997] (sketched below), CHC [Eshelman, 1990; Guerra-Salcedo and Whitley, 1999]
Published Research with Applications to Personnel Science
– Recent work:
    Multi-agent learning [GH01, GH02a]
    Genetic feature selection wrappers [HSL02, HWRC02, HS02]
– Current work: WEKA compatibility, parallel online continuous arcing
– Planned continuations:
    New inducers: instance-based (k-nearest-neighbor), sequential rule covering, feedforward artificial neural network (multi-layer perceptron)
    New wrappers: theory-guided constructive induction; boosting (Arc-x4, AdaBoost.M1, POCA)
    Integration of reinforcement learning (RL) inducers
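As an illustration of how a feature-subset-selection wrapper in the style of Kohavi and John operates, here is a hedged Java skeleton (Inducer and crossValAccuracy are assumed stand-ins, not MLJ classes): the wrapper greedily grows a feature set, scoring each candidate subset by the wrapped inducer's validation accuracy.

    import java.util.*;

    // Greedy forward-selection wrapper: add the feature that most improves
    // cross-validated accuracy until no single addition helps.
    interface Inducer {
        double crossValAccuracy(double[][] data, int[] labels, Set<Integer> features);
    }

    public class FssWrapper {
        static Set<Integer> select(int numFeatures, double[][] data, int[] labels,
                                   Inducer inducer) {
            Set<Integer> chosen = new HashSet<>();
            double best = inducer.crossValAccuracy(data, labels, chosen);
            boolean improved = true;
            while (improved) {
                improved = false;
                int bestF = -1;
                for (int f = 0; f < numFeatures; f++) {  // try each unused feature
                    if (chosen.contains(f)) continue;
                    chosen.add(f);
                    double acc = inducer.crossValAccuracy(data, labels, chosen);
                    chosen.remove(f);
                    if (acc > best) { best = acc; bestF = f; }
                }
                if (bestF >= 0) { chosen.add(bestF); improved = true; }
            }
            return chosen;
        }
    }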
Slide 18: Rapid KDD Development Environment: Operational Overview

[Figure: infrastructure for high-performance computation in data mining]
Slide 19: National Center for Supercomputing Applications (NCSA) D2K
Slide 20: Visual Programming Interface (Java): Parallel Genetic Algorithms
Slide 21: Time Series Modeling and Prediction: Integration with Information Visualization

[Figure: new time series visualization system (Java3D)]
Slide 22: Demographics-Based Clustering for Prediction (Continuing Research)

Cluster Formation and Segmentation Algorithm (Sketch)
[Figure: dimensionality-reducing projection (x′) → clusters of similar records → Delaunay triangulation → Voronoi (nearest neighbor) diagram (y)]
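The Voronoi (nearest-neighbor) stage of the sketch amounts to assigning each projected record to the cell of its closest cluster representative. A minimal illustration (hypothetical names; squared Euclidean distance assumed):

    // Nearest-representative assignment: each projected record x' falls in the
    // Voronoi cell of the closest cluster centroid. Illustrative sketch only.
    public class VoronoiAssign {
        static int assign(double[] x, double[][] centroids) {
            int best = 0;
            double bestDist = Double.MAX_VALUE;
            for (int c = 0; c < centroids.length; c++) {
                double d = 0.0;
                for (int j = 0; j < x.length; j++) {
                    double diff = x[j] - centroids[c][j];
                    d += diff * diff;           // squared Euclidean distance
                }
                if (d < bestDist) { bestDist = d; best = c; }
            }
            return best;                        // index of the Voronoi cell
        }
    }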
Slide 23: Data Clustering in Interactive Real-Time Decision Support

15 × 15 Self-Organizing Map (U-Matrix Output)
[Figure: cluster map over the personnel database]
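For reference, the core update of a self-organizing map like the 15 × 15 one above can be sketched as follows (grid size aside, all constants and data here are made-up placeholders; this is not the system's code):

    import java.util.Random;

    // One training pass of a self-organizing map: find the best-matching unit
    // (BMU) for an input, then pull the BMU and its grid neighbors toward it.
    public class SomSketch {
        public static void main(String[] args) {
            int grid = 15, dim = 4;
            double eta = 0.1, sigma = 2.0;      // learning rate, neighborhood radius
            double[][][] w = new double[grid][grid][dim];
            Random rng = new Random(1);
            for (double[][] row : w)
                for (double[] unit : row)
                    for (int j = 0; j < dim; j++) unit[j] = rng.nextDouble();

            double[] x = {0.2, 0.7, 0.1, 0.9};  // one made-up input record

            // Find the best-matching unit
            int bi = 0, bj = 0; double bestD = Double.MAX_VALUE;
            for (int i = 0; i < grid; i++)
                for (int j = 0; j < grid; j++) {
                    double d = 0;
                    for (int k = 0; k < dim; k++)
                        d += (x[k] - w[i][j][k]) * (x[k] - w[i][j][k]);
                    if (d < bestD) { bestD = d; bi = i; bj = j; }
                }

            // Update all units, weighted by a Gaussian over grid distance to the BMU
            for (int i = 0; i < grid; i++)
                for (int j = 0; j < grid; j++) {
                    double g = Math.exp(-((i-bi)*(i-bi) + (j-bj)*(j-bj))
                                        / (2 * sigma * sigma));
                    for (int k = 0; k < dim; k++)
                        w[i][j][k] += eta * g * (x[k] - w[i][j][k]);
                }
        }
    }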
Slide 24: Summary: State of High-Performance KDD at KSU-CIS

Laboratory for Knowledge Discovery in Databases (KDD)
– Applications: interdisciplinary research programs at K-State, FY 2002:
    Decision support, optimization (Hsu, CIS; Chang, IMSE) (NSF EPSCoR)
    Bioinformatics: gene expression modeling (Hsu, CIS; Welch, Agronomy; Roe, Biology; Das, EECE)
    Digital libraries, information retrieval (Hsu, CIS; Zollman, Physics; Math; Art)
    Human-Computer Interaction (HCI), e.g., simulation-based training
Curriculum Development
– Real-time intelligent systems (Chang, Hsu, Neilsen, Singh)
– Machine learning and artificial intelligence; information visualization (Hsu)
– Other: bioinformatics, digital libraries, robotics, DBMS
Research Partnerships
– NCSA: National Computational Science Alliance, National Center for Supercomputing Applications
– Defense (ONR, ARL, DARPA); industry (Raytheon)
Publications and more info: http://www.kddresearch.org