A Generalized Version Space Learning Algorithm for Noisy and Uncertain Data T.-P. Hong, S.-S. Tseng IEEE Transactions on Knowledge and Data Engineering,


A Generalized Version Space Learning Algorithm for Noisy and Uncertain Data. T.-P. Hong, S.-S. Tseng. IEEE Transactions on Knowledge and Data Engineering, Vol. 9, No. 2. Presenter: 임희웅

Introduction
- A generalized learning strategy for the version space (VS) approach
- Handles noisy and uncertain training data
- Works by searching and pruning
- Trade-off between including positive training instances and excluding negative ones
- Trade-off between computation time and accuracy, controlled by pruning factors

New Definition of S/G
- Additional information: count — the sum of the positive/negative information implicit in the training instances presented so far
- S boundary: a set of at most i maximally consistent hypotheses; a hypothesis is not kept in S if another member is more specific than it and has an equal or larger count
- G boundary: a set of at most j maximally consistent hypotheses; a hypothesis is not kept in G if another member is more general than it and has an equal or larger count

FIPI: Factor of Including Positive Instances
- Controls the trade-off between including positive training instances and excluding negative ones
- A real number in [0, 1]
- 1: consider only the inclusion of positive training examples
- 0: consider only the exclusion of negative training examples
- 0.5: equal importance

Certainty Factor (CF)
- A measure of how positive a training instance is
- A real number in [−1, 1]
- −1: a certainly negative example; 1: a certainly positive example
- A new training instance with certainty CF contributes weight (1+CF)/2 as positive evidence to S and (1−CF)/2 as negative evidence to G
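The CF-to-evidence mapping above can be sketched in Python (a minimal illustration; `cf_weights` is a hypothetical helper name, not from the paper):

```python
def cf_weights(cf: float) -> tuple[float, float]:
    """Split a certainty factor in [-1, 1] into the positive weight
    credited to the S side and the negative weight credited to the G side."""
    if not -1.0 <= cf <= 1.0:
        raise ValueError("CF must lie in [-1, 1]")
    return (1 + cf) / 2, (1 - cf) / 2
```

A certainly positive example (CF = 1) contributes weight 1 to S and 0 to G; a fully uncertain one (CF = 0) contributes 0.5 to each side.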

Learning Process: Searching & Pruning
- Searching: generate and collect all possible candidate hypotheses into a large set
- Pruning: prune that set according to the degree of consistency of the hypotheses

Learning Process

Input & Output
- Input:
  - a set of n training instances, each with a certainty factor CF
  - FIPI
  - i: the maximum number of hypotheses kept in S
  - j: the maximum number of hypotheses kept in G
- Output: the hypotheses in the sets S and G that are maximally consistent with the training instances

Step 1 & 2
- Step 1: Initialize S to the maximally specific hypothesis and G to the maximally general hypothesis, each with count 0
- Step 2: For each training instance with uncertainty CF, do Steps 3 to 7

Step 3 – Search 1
- Generalize each hypothesis in S / specialize each hypothesis in G to account for the new instance
- c_k: the current count of hypothesis k in S/G
- Attach the new count c_k + (1+CF)/2 to each resulting hypothesis in S′, and c_k + (1−CF)/2 to each in G′

Step 4 – Search 2
- Form the sets S″/G″ of hypotheses that include/exclude only the new training instance itself
- Set the count of each hypothesis in S″ to (1+CF)/2 and of each hypothesis in G″ to (1−CF)/2

Step 5 – Pruning 1
- Combine S, S′, and S″ (likewise G, G′, and G″)
- Among identical hypotheses, retain only the one with the maximum count
- If a hypothesis is both more general / more specific than another (in S / G respectively) and has an equal or smaller count, discard it
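Assuming hypotheses are attribute tuples with '?' as a wildcard (Mitchell-style conjunctive hypotheses; the paper's representation may differ), the S-side pruning rule can be sketched as follows — the G side is symmetric with the generality test reversed. `prune_s` and `more_general` are hypothetical names:

```python
from typing import Dict, Tuple

Hypothesis = Tuple[str, ...]  # attribute values; '?' matches any value

def more_general(h1: Hypothesis, h2: Hypothesis) -> bool:
    """True if h1 is strictly more general than h2 (covers everything h2 covers)."""
    covers = all(a == '?' or a == b for a, b in zip(h1, h2))
    return covers and h1 != h2

def prune_s(candidates: Dict[Hypothesis, float]) -> Dict[Hypothesis, float]:
    """S-side of Step 5. Identical hypotheses are assumed already merged by
    maximum count (dict keys are unique); drop any hypothesis that is more
    general than another candidate with an equal or larger count."""
    kept = {}
    for h, c in candidates.items():
        dominated = any(
            more_general(h, other) and c <= c_other
            for other, c_other in candidates.items() if other != h
        )
        if not dominated:
            kept[h] = c
    return kept
```

For instance, ('?', 'red') is discarded when ('big', 'red') has the same count, because the more specific hypothesis dominates it in S.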

Step 6 – Confidence Calculation
Compute the confidence of each new hypothesis:
- For each hypothesis s with count c_s in the new S, find the hypothesis g in the new G that is more general than s and has the maximum count c_g; then confidence(s) = FIPI × c_s + (1 − FIPI) × c_g
- For each hypothesis g with count c_g in the new G, proceed symmetrically: find the hypothesis s in the new S that is more specific than g and has the maximum count c_s; then confidence(g) = FIPI × c_s + (1 − FIPI) × c_g

Illustration (specific → general ordering): s with count c_s in S; g with count c_g in G; g is more general than s.
- confidence(s) = FIPI × c_s + (1 − FIPI) × max(c_g)
- confidence(g) = FIPI × max(c_s) + (1 − FIPI) × c_g
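The confidence formula can be sketched as a small Python helper (illustrative only; `confidence_of_s` is a hypothetical name, and the G-side version is symmetric, blending c_g with the largest count among the more specific hypotheses in S):

```python
def confidence_of_s(c_s: float, general_counts: list[float], fipi: float) -> float:
    """Confidence of a hypothesis s in S: its own count c_s blended with the
    largest count among the hypotheses in G that are more general than s."""
    c_g = max(general_counts, default=0.0)
    return fipi * c_s + (1 - fipi) * c_g
```

With FIPI = 0.5 both kinds of evidence weigh equally; with FIPI = 1 a hypothesis is scored purely by the positive instances it includes.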

Step 7 – Pruning 2
- Keep only the i hypotheses with the highest confidence in the new S, and the j hypotheses with the highest confidence in the new G
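Step 7 then reduces to a top-k selection over the confidence scores; a minimal sketch (`select_top` is a hypothetical helper name):

```python
import heapq

def select_top(confidences: dict, k: int) -> dict:
    """Keep only the k hypotheses with the highest confidence
    (k = i for the S boundary, k = j for the G boundary)."""
    return dict(heapq.nlargest(k, confidences.items(), key=lambda kv: kv[1]))
```

This is what bounds the boundary sets to i and j members per iteration, trading accuracy for computation time.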

Related Papers
GA:
- L. De Raedt et al., "A Unifying Framework for Concept-Learning Algorithms," Knowledge Engineering Rev., vol. 7, no. 3, 1989.
- R. G. Reynolds et al., "The Use of Version Space Controlled Genetic Algorithms to Solve the Boole Problem," Int'l J. Artificial Intelligence Tools, vol. 2, no. 2, 1993.
Fuzzy:
- C. C. Lee, "Fuzzy Logic in Control Systems: Fuzzy Logic Controller, Parts 1 & 2," IEEE Trans. Systems, Man, and Cybernetics, vol. 20, no. 2, 1990.
- L. X. Wang et al., "Generating Fuzzy Rules by Learning from Examples," Proc. IEEE Conf. Fuzzy Systems, 1992.