IEEE World Conference on Computational Intelligence – WCCI 2010


IEEE World Conference on Computational Intelligence – WCCI 2010
International Joint Conference on Neural Networks – IJCNN 2010

Semi-Supervised Learning from Imperfect Data through Particle Cooperation and Competition

Fabricio A. Breve¹ (fabricio@icmc.usp.br), Liang Zhao¹ (zhao@icmc.usp.br), Marcos G. Quiles² (quiles@unifesp.br)

¹ Department of Computer Science, Institute of Mathematics and Computer Science (ICMC), University of São Paulo (USP), São Carlos, SP, Brazil
² Department of Science and Technology (DCT), Federal University of São Paulo (Unifesp), São José dos Campos, SP, Brazil

Outline
- Learning from Imperfect Data
- Proposed Method
- Node and Particle Dynamics
- Random-Deterministic Walk
- Computer Simulations
- Conclusions

Learning from Imperfect Data
In supervised learning, the quality of the training data is very important, yet most algorithms assume that the input label information is completely reliable. In practice, mislabeled samples are common in real data sets.

Learning from Imperfect Data
In semi-supervised learning the problem is even more critical: since only a small subset of the data is labeled, labeling errors are easily propagated to a large portion of the data set. Despite its importance and broad impact on classification, the problem has received little attention from researchers.

[33] P. Hartono and S. Hashimoto, “Learning from imperfect data,” Appl. Soft Comput., vol. 7, no. 1, pp. 353–363, 2007.
[34] D. K. Slonim, “Learning from imperfect data in theory and practice,” Cambridge, MA, USA, Tech. Rep., 1996.
[35] T. Krishnan, “Efficiency of learning with imperfect supervision,” Pattern Recogn., vol. 21, no. 2, pp. 183–188, 1988.
[36] M.-R. Amini and P. Gallinari, “Semi-supervised learning with an imperfect supervisor,” Knowl. Inf. Syst., vol. 8, no. 4, pp. 385–413, 2005.
[37] M.-R. Amini and P. Gallinari, “Semi-supervised learning with explicit misclassification modeling,” in IJCAI’03: Proceedings of the 18th International Joint Conference on Artificial Intelligence. San Francisco, CA, USA: Morgan Kaufmann Publishers Inc., 2003, pp. 555–560.

Proposed Method
Particle competition and cooperation in networks:
- Cooperation among particles from the same team (label / class)
- Competition for possession of the nodes of the network
Each team of particles:
- Tries to dominate as many nodes as possible in a cooperative way
- Prevents intrusion of particles from other teams

Initial Configuration
- A particle is generated for each labeled node of the network; that node is called the particle’s home node.
- Each particle’s initial position is set to its home node.
- Particles with the same label play for the same team.
- Each node has a domination vector:
  - Labeled nodes have ownership fixed to their respective teams, e.g. [1 0 0 0] (4 classes, node labeled as class A).
  - Unlabeled nodes have their levels set equally for each team, e.g. [0.25 0.25 0.25 0.25] (4 classes, unlabeled node).
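The initial configuration above can be sketched in a few lines. The data layout (a NumPy domination matrix plus a list of particle dicts) is a hypothetical choice for illustration, not the authors’ implementation:

```python
import numpy as np

def initialize(labels, c):
    """Build domination vectors and one particle per labeled node.

    labels: per-node class index, or -1 for unlabeled nodes.
    c: number of classes (teams).
    """
    n = len(labels)
    # Unlabeled nodes start with equal levels for every team, e.g. [0.25]*4.
    domination = np.full((n, c), 1.0 / c)
    particles = []
    for node, label in enumerate(labels):
        if label >= 0:
            # Labeled nodes have ownership set to their team, e.g. [1, 0, 0, 0].
            domination[node] = np.eye(c)[label]
            # The particle starts at its home node, playing for its label's team.
            particles.append({"home": node, "pos": node,
                              "team": label, "strength": 1.0})
    return domination, particles
```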

Node Dynamics
When a particle selects a neighbor to visit:
- It decreases the domination levels of the other teams on that node.
- It increases the domination level of its own team.
- Exception: labeled nodes have fixed domination levels.
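A minimal sketch of this update, assuming a hypothetical decrement rate `delta` (the slide does not give the exact amounts). The total subtracted from the rival teams is added to the visiting team, so each domination vector keeps summing to 1:

```python
import numpy as np

def update_node(domination, labeled, node, team, strength, delta=0.1):
    """Shift domination levels of `node` toward the visiting particle's team."""
    if labeled[node]:
        return  # exception: labeled nodes have fixed domination levels
    c = domination.shape[1]
    for k in range(c):
        if k == team:
            continue
        # Decrease each rival team's level, never below zero...
        dec = min(domination[node, k], delta * strength / (c - 1))
        domination[node, k] -= dec
        # ...and give the same amount to the visiting team.
        domination[node, team] += dec
```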

Particle Dynamics
A particle gets:
- stronger when it selects a node dominated by its own team
- weaker when it selects a node dominated by other teams
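One simple way to realize this rule (an assumption; the slide does not state the exact formula) is to reset the particle’s strength to its team’s domination level at the node it just visited:

```python
def update_particle(particle, domination):
    """Strength tracks how well the particle's team dominates its current node."""
    particle["strength"] = domination[particle["pos"]][particle["team"]]
```

With this choice, a particle that wanders into enemy territory weakens immediately, which limits how far it can push into another team’s region.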

Random-Deterministic Walk
- Random walk: the particle randomly chooses any neighbor to visit, with no concern for domination levels or distance.
- Deterministic walk: the particle prefers to visit nodes that its team already dominates and nodes that are closer to its home node.
Particles must exhibit both movements in order to achieve an equilibrium between exploratory and defensive behavior.

[Figure: example moving probabilities over the neighbors v1–v4 of the current node, comparing the uniform distribution of the random rule with the distribution of the deterministic rule, which is biased toward dominated and closer nodes.]
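The two movement rules can be sketched as probability distributions over the current node’s neighbors. The distance weighting 1/(1+d)² is an assumed form, chosen only to illustrate the preference for nodes closer to the home node:

```python
import numpy as np

def random_probs(neighbors):
    """Random walk: uniform over neighbors, ignoring domination and distance."""
    return np.full(len(neighbors), 1.0 / len(neighbors))

def deterministic_probs(neighbors, domination, team, dist):
    """Deterministic walk: favor nodes the team dominates and nodes near home.

    dist[v] is node v's distance to the particle's home node.
    """
    w = np.array([domination[v, team] / (1.0 + dist[v]) ** 2
                  for v in neighbors])
    return w / w.sum()
```

At each step a particle would then apply one rule or the other (e.g. chosen with fixed probabilities), balancing exploration of the network against defense of its team’s territory.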

Computer Simulations
Networks are generated with:
- Different sizes and mixtures
- Elements divided into 4 classes
- A set of nodes N, a labeled subset L ⊆ N, and a mislabeled subset Q ⊆ L ⊆ N

Conclusions
- A biologically inspired method for semi-supervised classification, based on cooperation and competition among particles.
- It provides a natural way of preventing error propagation from mislabeled nodes.
- Analysis of the results reveals critical points in the performance curve as the subset of mislabeled samples grows.

Future Work
- Different types of networks
- Real-world data sets
- Comparison with other methods

Acknowledgements
This work was supported by the State of São Paulo Research Foundation (FAPESP) and the Brazilian National Council for Scientific and Technological Development (CNPq).
