Organizing a spectral image database by using Self-Organizing Maps. Research Seminar, 7.10.2005. Oili Kohonen.


Motivation? Image retrieval from conventional databases has been studied since the 1990s, and many efficient techniques have been developed. However, efficient techniques for querying images from a spectral image database do not exist. Because of the large amount of data in spectral images, such techniques will be needed.

Spectral imaging? Metameric imaging: a cheap and practical way to achieve a color match. Spectral imaging: needed to achieve a color match for all observers across changes in the illumination.

Principle of SOM: The Self-Organizing Map (SOM) algorithm is an unsupervised learning algorithm that defines a mapping from high-dimensional data into a lower-dimensional space. A SOM consists of arranged units (or neurons), each represented by a weight vector. The units are connected to each other by a neighborhood relation.

Principle of SOM: SOM Algorithm:
begin
  initialize the SOM
  for i = 1 : number of epochs
    take an input vector x randomly from the training data;
    find the BMU for x;
    update the weight vectors of the map;
    decrease the learning rate & neighborhood function;
end
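The loop above can be sketched in Python/NumPy. This is a minimal illustration, not the presenter's implementation: the linear decay schedules, the random initialization, and the rectangular grid are assumptions, since the slides do not specify them.

```python
import numpy as np

def train_som(data, rows=7, cols=7, epochs=10, lr0=0.5, sigma0=3.0):
    """Train a 2-D SOM on `data` (n_samples x n_features)."""
    rng = np.random.default_rng(0)
    # Initialize weight vectors randomly within the data range (an assumption)
    weights = rng.uniform(data.min(), data.max(), (rows * cols, data.shape[1]))
    # Grid coordinates of each unit, used by the neighborhood function
    grid = np.array([(r, c) for r in range(rows) for c in range(cols)], dtype=float)
    n_steps = epochs * len(data)
    for step in range(n_steps):
        x = data[rng.integers(len(data))]                     # random training vector
        bmu = np.argmin(np.linalg.norm(weights - x, axis=1))  # best-matching unit
        frac = step / n_steps
        lr = lr0 * (1 - frac)                # decaying learning rate (linear, assumed)
        sigma = sigma0 * (1 - frac) + 1e-3   # shrinking neighborhood radius
        d2 = np.sum((grid - grid[bmu]) ** 2, axis=1)
        h = np.exp(-d2 / (2 * sigma ** 2))   # Gaussian neighborhood function
        weights += lr * h[:, None] * (x - weights)  # pull units toward x
    return weights
```

Usage with the deck's data would be `weights = train_som(spectra)`, where `spectra` is a pixels-by-61 array of spectra.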

Principle of SOM: finding the BMU. Mathematically the BMU (best-matching unit), indexed c, is defined for an input data vector x as c = arg min_i ||x - w_i||, where the w_i are the weight vectors of the units. Euclidean distance is the typically used distance measure.

Principle of SOM: updating the weight vectors. The weight vectors are updated as w_i(t+1) = w_i(t) + a(t) h_ci(t) [x(t) - w_i(t)], i.e. the effective learning rate is the product of the learning-rate parameter a(t) and the neighborhood function h_ci(t).

Principle of SOM: neighborhood function. The neighborhood function h(t) has to fulfill the following two requirements: it has to be symmetric about the maximum point (the BMU), and its amplitude has to decrease monotonically with increasing distance from the BMU. A Gaussian function is a typical choice for h(t).

Principle of SOM: Lattice structure Lattice structures: hexagonal & rectangular

Searching Technique: Constructing the histogram database
1. Train the SOM.
2. Find the BMU for each pixel in an image.
3. Generate the BMU histogram & normalize it by the number of pixels in the image.
4. Repeat steps 2 & 3 for all images in the spectral image database.
5. Save the histogram database together with the information of the SOM map.
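Steps 2 and 3 can be sketched as follows. The array shapes are assumptions (pixels flattened to rows, one column per spectral band); the slides do not give an implementation.

```python
import numpy as np

def bmu_histogram(image, weights):
    """Normalized BMU histogram for one spectral image.

    image:   (n_pixels, n_bands) array of pixel spectra
    weights: (n_units, n_bands) trained SOM weight vectors
    """
    # Distance from every pixel spectrum to every SOM unit
    dists = np.linalg.norm(image[:, None, :] - weights[None, :, :], axis=2)
    bmus = np.argmin(dists, axis=1)  # BMU index for each pixel
    hist = np.bincount(bmus, minlength=len(weights)).astype(float)
    return hist / len(image)         # normalize by the number of pixels
```

Repeating this over all images and saving the histograms (with the SOM weights) yields the histogram database of step 5.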

Searching Technique: making a search
1. Choose an image and generate its histogram.
2. Calculate the distances between the generated histogram and the histograms in the database.
3. Order the images by these distances.
The results of the search are shown to the user as RGB images.

Searching techniques: One-dimensional SOM:

Searching techniques: Two-dimensional histogram-trained SOM

Distance Calculations: H1 & H2 are the compared histograms; L1 & L2 are the indices of their maximum values; H3 = (H1 + H2)/2.
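The formulas for the measures appeared as images on the original slide. As an illustration, here are sketches of two of them under common definitions: the Jeffrey divergence built from H3 as defined above, and a peak distance using the maximum indices L1 & L2. These follow standard formulations and may differ in detail from the presenter's.

```python
import numpy as np

def jeffrey_divergence(h1, h2, eps=1e-12):
    """Jeffrey divergence between two normalized histograms,
    with H3 = (H1 + H2) / 2 as on the slide. eps guards log(0)."""
    h3 = (h1 + h2) / 2
    return np.sum(h1 * np.log((h1 + eps) / (h3 + eps))
                  + h2 * np.log((h2 + eps) / (h3 + eps)))

def peak_distance(h1, h2):
    """Distance between the indices of the histogram maxima (L1, L2);
    assumed interpretation of the slide's 'Peak' measure."""
    return abs(int(np.argmax(h1)) - int(np.argmax(h2)))
```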

Experiments: One-dimensional SOM for unweighted images; one-dimensional SOM for images weighted by the HVS (Human Visual Sensitivity) function; two-dimensional SOM trained from histogram data (unweighted images) and from spectral data (unweighted and weighted images).

The Used Database: 106 images with 61 components each; spectral range from 400 nm to 700 nm at 5 nm intervals.

Training of the SOMs: spectra were selected randomly from each image, & epochs in the ordering & fine-tuning phases, respectively. Unit sizes: 50 (chosen empirically); 49 (to have results comparable with the 1D SOM); a 14x14 map in the case of the histogram-trained SOM.

Results: 1D SOM, unweighted images. Pure data vs. multiplied data; distance measure: Euclidean distance.

Results: 1D, unweighted images (figures for the Energy, K-L, Peak, DPD & JD measures).

Results: 1D, weighted images (figures for the Energy, K-L, Peak, DPD & JD measures).

Conclusions I: The "structure" of the database is different for weighted and unweighted images. The best results were obtained using Euclidean distance and Jeffrey divergence. Importance of normalization? Better results with Euclidean distance & DPD; worse results with Jeffrey divergence.

Results: 2D, unweighted spectral data (figures for the Euclidean, Energy, K-L, Peak, DPD & JD measures).

Results: 2D, weighted spectral data (figures for the Euclidean, Energy, K-L, Peak, DPD & JD measures).

Conclusions II: In the case of the two-dimensional SOM, better results are achieved by using non-weighted images. When weighted images are used, the 1D SOM seems to be more reasonable.

Results: histogram-trained 2D SOM (figures for the Euclidean, Energy, K-L, Peak, DPD & JD measures).

Connections between images and histograms: non-weighted weighted

Past, Present & Future: Past: what you have seen so far. Present: texture features in addition to color features. Future: testing the effect of different metrics in the ordering and fine-tuning phases (during the training of the SOM).

Questions: ? Thank you for not asking any... =)