
Methods for analysis and enhancement of neural network classification of remotely sensed images
Norbert Kopčo, Peter Sinčák, and Rudolf Jakša
Department of Cognitive and Neural Systems, Boston University
Computational Intelligence Group, Technical University of Košice, Slovakia

Abstract

This poster presents the results of three studies on the application of ARTMAP neural networks to the classification of remotely sensed multispectral images.

1. A comparison of the performance of ARTMAP classifiers with different types of cluster representation. In this study, the best results were obtained using Extended Gaussian ARTMAP.

2. A method for computing a classification-accuracy index for the Gaussian ARTMAP neural network. This method can be used to generate maps containing only pixels with a prescribed minimum classification accuracy.

3. Preliminary results obtained with a hierarchical ARTMAP architecture based on a set of dichotomous classifiers. This neural network is suitable for parallel processing of large data sets.

References

Sinčák, P., Veregin, H., and Kopčo, N. (1998) Computational intelligence for classification of remotely sensed images. Neural Network World, v. 5.
Sinčák, P., Kopčo, N., and Veregin, H. (unpublished) Conflation Techniques to Improve Image Classification Accuracy. Submitted to Photogrammetric Engineering and Remote Sensing.
Carpenter, G. and Grossberg, S. (1992) Fuzzy ARTMAP: A Neural Network Architecture for Incremental Supervised Learning of Analog Multidimensional Maps. IEEE Trans. on Neural Networks, 3.
Williamson, J.R. (1996) Gaussian ARTMAP: A Neural Network for Fast Incremental Learning of Noisy Multidimensional Maps. Neural Networks.
Cunningham, R.K. (1998) Learning and recognizing patterns of visual motion, color, and form. Unpublished Ph.D. thesis, Boston University.

1. Evaluation of the dependence of ARTMAP classifiers on cluster representation

A large number of classification algorithms is available, often without detailed knowledge of their properties and performance. Neural networks are considered "assumption-less" classifiers: successful classification does not require explicit assumptions about the data distribution. However, these systems do have implicit assumptions built into them, related to the data representation and to the algorithmic properties of the system. In the present study, the dependence of ARTMAP classifier performance on the internal cluster representation is analyzed for image data from remote sensing.

Data set

- Seven-dimensional Landsat TM image of the city of Košice (Figure 1)
- Size of image: 368,125 pixels, of which 6,331 were classified by an expert into seven categories (A: urban area, B: barren fields, C: bushes, D: agricultural fields, E: meadows, F: woods, G: water)

Method of analysis

Performance is compared in terms of weighted PCC (Percent of Correctly Classified) and contingency tables.

Compared systems

- Fuzzy ARTMAP (FA; Carpenter et al., 1992)
- Gaussian ARTMAP (GA; Williamson, 1996)
- Extended Gaussian ARTMAP (EGA; Cunningham, 1998)
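The weighted PCC measure used above can be sketched as follows. This is a minimal illustration of one common definition (per-class accuracy averaged with class weights); the function name and the default weighting by reference-class frequency are our assumptions, not the poster's implementation:

```python
import numpy as np

def weighted_pcc(contingency, class_weights=None):
    """Weighted Percent of Correctly Classified (PCC).

    `contingency` is a (true class x predicted class) count matrix.
    If no weights are given, classes are weighted by their frequency
    in the reference data (an assumption, not the poster's definition).
    """
    contingency = np.asarray(contingency, dtype=float)
    class_totals = contingency.sum(axis=1)        # reference pixels per class
    per_class_acc = np.diag(contingency) / class_totals
    if class_weights is None:
        class_weights = class_totals / class_totals.sum()
    return 100.0 * float(np.dot(class_weights, per_class_acc))

# Toy 3-class contingency table (rows: true class, columns: predicted class).
table = [[90, 5, 5],
         [10, 80, 10],
         [0, 20, 80]]
pcc = weighted_pcc(table)   # equal class sizes, so this equals the mean per-class accuracy
```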

Figure 1: Original image

ARTMAP classifier topology

Layers: input (F0), comparison (F1), recognition/clustering (F2), map field (labeling), output.

Cluster representation:
- Fuzzy ARTMAP (FA): hyper-rectangles
- Gaussian ARTMAP (GA): Gaussian distributions without covariance
- Extended Gaussian ARTMAP (EGA): Gaussian distributions with covariance
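The three cluster geometries can be illustrated with their (unnormalized) category activation functions. This is a hedged sketch: the function names and signatures are ours, normalization constants are omitted, and only the geometric distinction between the representations is shown:

```python
import numpy as np

def fuzzy_art_choice(x, w, alpha=0.001):
    # Fuzzy ART choice function: category w encodes a hyper-rectangle
    # (complement-coded input assumed); standard Fuzzy ART form.
    return np.minimum(x, w).sum() / (alpha + w.sum())

def gaussian_log_activation(x, mu, sigma):
    # Gaussian ARTMAP: axis-aligned Gaussian (no covariance terms),
    # log-density up to an additive constant.
    return -0.5 * (((x - mu) / sigma) ** 2).sum() - np.log(sigma).sum()

def extended_gaussian_log_activation(x, mu, cov):
    # Extended Gaussian ARTMAP: full covariance matrix per cluster.
    d = x - mu
    return -0.5 * d @ np.linalg.solve(cov, d) - 0.5 * np.log(np.linalg.det(cov))
```

With a diagonal covariance matrix, the extended form reduces to the plain Gaussian form, which is exactly the relationship between GA and EGA described above.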

Results

- Weighted PCC for five permutations of the training set and voting.
- Gaussian distributions are more suitable as cluster representations in image classification tasks (see Fig. 2).
- Best performance was achieved by Extended Gaussian ARTMAP (although the differences were not very significant).

Contingency table for Extended Gaussian ARTMAP: predicted class vs. true class.

Figure 2: Image classified by Extended Gaussian ARTMAP

- Sensitivity to the ordering of the training set is smaller for GA and EGA than for FA.
- During learning, FA always reached 100% accuracy; GA/EGA accuracy was around 97.7%, indicating stronger generalization.

2. Confidence Index

Remote sensing produces large amounts of data that need to be classified quickly and reliably, and an explicit assessment of classification reliability is often needed.

Goals:
- Develop a method for simple and fast assessment of classification accuracy.
- For the accuracy assessment, exploit computations already done during the classification process.
- Develop the assessment method for the Gaussian ARTMAP algorithm, which achieved the best performance in the previous study.

Method

- GA uses the Bayes discrimination function: (1)
- The pattern is classified into the category with the largest probability measure: (2)
- If voting is used, the probability measures are evaluated over all the networks: (3)
- The confidence index is then defined as: (4)
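Since equations (1)-(4) are not reproduced in this transcript, the following is only a plausible sketch of such a voting-based confidence index, assuming the index is the winning class's share of the total probability mass summed over the voting networks; the poster's exact definitions may differ:

```python
import numpy as np

def confidence_index(vote_probs):
    """vote_probs: (n_networks, n_classes) per-network probability measures.

    Sketch: sum the measures over the voting networks (eq. 3 analogue),
    pick the class with the largest total (eq. 2 analogue), and report
    its share of the total mass as the confidence (eq. 4 analogue).
    All of these formulas are assumptions, not the poster's equations.
    """
    totals = np.asarray(vote_probs, dtype=float).sum(axis=0)
    winner = int(totals.argmax())
    return winner, float(totals[winner] / totals.sum())

# Three voting networks, three classes: all favor class 0.
votes = [[0.7, 0.2, 0.1],
         [0.6, 0.3, 0.1],
         [0.5, 0.4, 0.1]]
cls, conf = confidence_index(votes)
```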

Figure 3: Confidence map for Gaussian ARTMAP classification (expressed in %)

Confidence threshold

A confidence threshold can be defined which, in combination with the confidence index, can be used to generate maps with arbitrary accuracy. Two counteracting aspects of classification are influenced by the choice of threshold: (1) the accuracy of classification, and (2) the number of unclassified patterns (see graph below).
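The trade-off can be illustrated with a hedged sketch: thresholding a classified map on the confidence index keeps high-confidence labels and marks the rest unclassified, so raising the threshold raises accuracy but lowers coverage. The function name and the -1 sentinel for unclassified pixels are our choices for illustration:

```python
import numpy as np

def threshold_map(labels, confidence, theta):
    """Mark pixels whose confidence falls below the threshold as unclassified (-1)."""
    out = np.where(confidence >= theta, labels, -1)
    coverage = float((out != -1).mean())   # fraction of pixels that remain classified
    return out, coverage

labels = np.array([0, 1, 2, 1])
conf = np.array([0.95, 0.80, 0.99, 0.60])
masked, coverage = threshold_map(labels, conf, theta=0.9)
```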

Figure 4: Thresholded classification map for Gaussian ARTMAP (accuracy 99%, threshold = 0.921)

Results and discussion

- The presented method offers a simple and fast way to assess the quality of Gaussian ARTMAP neural network classification.
- The method can also be used with other neural networks employing the voting strategy.
- The method offers a tool for producing maps with arbitrary classification accuracy.
- Because of the way the method exploits the computations done during classification, the confidence assigned to some classified points may be incorrect.

3. Hierarchical classifier

Often, the complexity of the data is too high, or the size of the data set is too large (in terms of memory or time requirements), for standard versions of classifiers. To overcome this problem, hierarchical/modular classification systems are usually applied. Here, a hierarchical structure based on ARTMAP networks is proposed and its properties are analyzed. The system is called Parallel ARTMAP because it consists of ART sub-nets, each trained independently on a single category (i.e., each sub-net learns to detect data from one category). In the following simulations, a fuzzy ART/ARTMAP system is used in the sub-net modules of the network.

Parallel processing system

Every sub-net in the clustering layer is trained on data belonging to a single class, i.e., it learns to detect data from that class. A conflict-resolution module serves to resolve conflicts when two or more clustering nets identify a pattern as "theirs". Two design choices: (1) a rule for determining the optimal ρ for each clustering sub-net, and (2) a conflict-resolving rule.

Diagram: Parallel ARTMAP system, with a clustering subsystem of ART class detectors (one per class, trained on Class 1 ... Class n data) feeding a conflict-resolution stage.

Optimal  determination rule Each sub-net is trained on patterns from a single class using cross-validation: - 9/10 - estimation set, - 1/10 - validation set. Training is repeated for different values of . Rule: Choose  which corresponds to the the point just before the first dip in the “true positive” graph. This rule assures that each detector will correctly identify almost all of the patterns belonging to it, while minimizing the false alarm rate Graph of performance of clustering sub-net for class #7 Parameter  Percent of correctly classified Other classes True “–” rate My class (#7) True “+” rate Optimal 

Class 1  PCC Class 2  PCC Class 3  PCC Class 4  PCC Class 5  PCC Class 6  PCC Class 7  PCC Optimal  for all sub-nets Other classes True negative rate My class True positive rate Optimal 

For the previously defined ρ-determination rule, all the sub-nets will have an almost 100% true positive rate, but their false alarm (false positive) rate will be non-zero. So the nets are biased towards identifying patterns as "theirs". Moreover, the false alarm rate will be different for each sub-net, so the conflict-resolving rule can be based on the true negative rate.

Conflict-resolving rule

Rule: If there is a conflict between two or more sub-nets, assign the pattern to the class whose detector has the largest true negative rate. This rule was chosen because the parameters needed for conflict resolution (true "–" rates) are easily computed during the ρ-determination process.
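The conflict-resolving rule translates directly into code; the data layout (a dict of per-class true-negative rates) is our choice for illustration:

```python
def resolve_conflict(claiming_classes, true_negative_rates):
    """Among sub-nets that all claim a pattern, pick the class whose
    detector has the largest true-negative rate, i.e., the detector
    least prone to false alarms."""
    return max(claiming_classes, key=lambda c: true_negative_rates[c])

# Three detectors claim the same pattern; class 6 has the highest TN rate.
tn_rates = {1: 0.97, 3: 0.91, 6: 0.99}
winner = resolve_conflict([1, 3, 6], tn_rates)
```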

Results and discussion

- Overall classification accuracy is 78.89%. The decreased performance is caused mainly by misclassification of patterns from classes C and F.
- Despite the worse performance, the system can be usefully applied to large data sets; its speed can also be advantageous.
- Alternative rules for ρ-determination and conflict resolution could improve performance.

Contingency table for Parallel ARTMAP: predicted class vs. true class.