Color Clustering and Learning for Image Segmentation Based on Neural Networks. Guo Dong, Member, IEEE, and Ming Xie, Member, IEEE. Presented 2009.02.23 by 최지혜.

Presentation transcript:

Color Clustering and Learning for Image Segmentation Based on Neural Networks Guo Dong, Member, IEEE, and Ming Xie, Member, IEEE 최지혜

Topic: Color Clustering and Learning for Image Segmentation Based on Neural Networks. Keywords: SOM (constructs the map adaptively), SA (for globally optimal clustering), HPL (hierarchical prototype learning).

Summary: To measure color differences accurately, the modified (perceptually uniform) color space L*u*v* is used. The segmentation system consists of supervised and unsupervised segmentation; the unsupervised path targets color reduction and color clustering. Color clustering takes the advantages of both SOM and SA, achieving optimal segmentation at low computational cost. HPL supplies color prototypes that serve as a good approximation of the target colors.

Image segmentation system based on neural networks. Unsupervised: for images whose content is hard to know in advance. Supervised: for cases where the object colors are known.

Unsupervised segmentation: for images whose content is hard to know in advance. Desirable properties: spatial compactness, color homogeneity. Segmentation techniques: image-domain, feature-space.

Unsupervised segmentation approaches: watershed transform; splitting and merging phases; self-organizing map (SOM).

The SOM is trained to generate a primitive clustering of the dominant colors of the image.

Description of Problem To ensure a proper measure of color differences, image colors must be represented in a uniform color space. In unsupervised segmentation, color reduction is indispensable to the segmentation of a large color image. In supervised segmentation, color learning is crucial to build up an accurate classifier for the segmentation of the object of interest.

Flow of this paper: (1) choose an appropriate color space; (2) color reduction is performed by SOM learning; (3) SA seeks the optimal clusters from the SOM prototypes; (4) a new procedure of supervised learning.

Appropriate color space: L* is the luminance component; u* and v* are the chromatic components. The u* axis varies from green to red; the v* axis varies from blue to yellow.

RGB to L*u*v*
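The slide names the RGB to L*u*v* conversion without showing it. As a reference sketch, the standard route goes through CIE XYZ with a D65 white point (the sRGB primaries and D65 values below are standard constants, not taken from the paper):

```python
import numpy as np

# D65 reference white in XYZ, and its chromaticity (u'_n, v'_n)
XN, YN, ZN = 0.95047, 1.0, 1.08883
UN = 4 * XN / (XN + 15 * YN + 3 * ZN)
VN = 9 * YN / (XN + 15 * YN + 3 * ZN)

def rgb_to_luv(rgb):
    """Convert one sRGB triple (0-255) to CIE L*u*v* via XYZ."""
    c = np.asarray(rgb, dtype=float) / 255.0
    # undo sRGB gamma
    c = np.where(c <= 0.04045, c / 12.92, ((c + 0.055) / 1.055) ** 2.4)
    x, y, z = np.array([[0.4124, 0.3576, 0.1805],
                        [0.2126, 0.7152, 0.0722],
                        [0.0193, 0.1192, 0.9505]]) @ c
    t = y / YN
    L = 116.0 * t ** (1 / 3) - 16.0 if t > (6 / 29) ** 3 else (29 / 3) ** 3 * t
    denom = x + 15 * y + 3 * z
    if denom == 0:                        # pure black
        return 0.0, 0.0, 0.0
    u = 13.0 * L * (4 * x / denom - UN)   # u* axis: green to red
    v = 13.0 * L * (9 * y / denom - VN)   # v* axis: blue to yellow
    return L, u, v
```

For example, white (255, 255, 255) maps to approximately (100, 0, 0), matching the interpretation of L* as luminance and u*, v* as chromatic offsets from the reference white.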

Color reduction: SOM learning. The SOM is a two-layer neural network with a rectangular topology. The three inputs (color components) are fully connected to the neurons on a 2-D plane. Each neuron is a cell containing a weight vector.

SOM Training
Initialization: 16x16 rectangular grid; neighborhood type is Gaussian; weight vectors are randomly initialized; radius r = 16, 5; learning rate = 0.05, 0.02
Input: each color point
Competitive process: find the "winning neuron"

SOM Training
Cooperative process: the topological neighbors are determined by a Gaussian function centered at the winning neuron.
Adaptive process: the weights of the winning neuron and its neighbor neurons are updated within the neighborhood; the effective scope is given by the neighborhood function.
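The competitive, cooperative, and adaptive steps above can be sketched as a plain training loop. This is an illustrative sketch, not the paper's code: it assumes the radius (16, 5) and learning rate (0.05, 0.02) from the slides are initial and final values decayed linearly over the epochs.

```python
import numpy as np

def train_som(pixels, grid=16, epochs=5, r0=16.0, r1=5.0, lr0=0.05, lr1=0.02):
    """Sketch of SOM training on color points (assumed linear decay of
    the neighborhood radius and learning rate over the epochs)."""
    rng = np.random.default_rng(0)
    w = rng.random((grid, grid, 3))                       # random weight init
    ii, jj = np.meshgrid(np.arange(grid), np.arange(grid), indexing="ij")
    for e in range(epochs):
        frac = e / max(epochs - 1, 1)
        r = r0 + (r1 - r0) * frac                         # neighborhood radius
        lr = lr0 + (lr1 - lr0) * frac                     # learning rate
        for x in pixels:
            # competitive process: the "winning neuron" is closest to x
            d = np.linalg.norm(w - x, axis=2)
            bi, bj = np.unravel_index(np.argmin(d), d.shape)
            # cooperative process: Gaussian neighborhood centered on winner
            h = np.exp(-((ii - bi) ** 2 + (jj - bj) ** 2) / (2.0 * r * r))
            # adaptive process: pull winner and neighbors toward the input
            w += lr * h[:, :, None] * (x - w)
    return w
```

After training, the 16x16 weight vectors are the color prototypes that the SA stage clusters next.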

Sammon mapping of 16x16 weight vectors after SOM training

SA seeks the optimal clusters from SOM prototypes
Simulated annealing (SA) is a general stochastic heuristic for global optimization that seeks a good approximation to the global optimum of a given function over a large search space.
There is a close analogy between the physical annealing of solids and combinatorial optimization problems with very many configurations; this can be used to modify the learning process of various neural networks.
Learning can be viewed as a minimization process that moves downward on an energy (error) function. A poor choice of initial weights can leave it trapped in a local minimum, which motivates introducing the SA concept.

SA
The optimal solution is obtained by randomly perturbing the system and gradually decreasing the randomness to a low final level. Let the cluster centers be given; the criterion is the sum-of-squared-error. The procedure of SA clustering is to search for the appropriate cluster centers, i.e., to minimize the energy function.
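The perturb-and-cool procedure can be sketched with a Metropolis acceptance rule. The temperature schedule and perturbation scale below are illustrative assumptions, not the paper's settings; the energy is the sum-of-squared-error named on the slide.

```python
import numpy as np

def sa_clustering(points, centers, t0=1.0, t_min=1e-3, alpha=0.95):
    """Sketch of SA clustering: randomly perturb the cluster centers and
    accept moves by the Metropolis criterion while the temperature falls
    (t0, alpha, and the perturbation scale are illustrative assumptions)."""
    rng = np.random.default_rng(0)

    def energy(c):
        # sum-of-squared-error: each point charged to its nearest center
        d2 = ((points[:, None, :] - c[None, :, :]) ** 2).sum(axis=2)
        return d2.min(axis=1).sum()

    cur, e_cur = centers.copy(), energy(centers)
    best, e_best = cur.copy(), e_cur
    t = t0
    while t > t_min:
        cand = cur + rng.normal(scale=t, size=cur.shape)   # random perturbation
        e_cand = energy(cand)
        # accept downhill moves always, uphill moves with prob exp(-dE/t)
        if e_cand < e_cur or rng.random() < np.exp((e_cur - e_cand) / t):
            cur, e_cur = cand, e_cand
            if e_cur < e_best:
                best, e_best = cur.copy(), e_cur
        t *= alpha                                         # gradual cooling
    return best, e_best
```

Because worse moves are occasionally accepted while the temperature is high, the search can escape the local minima that trap a purely downhill method.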

SA Clustering

Clustering

Segmentation result by SOM-SA color clustering
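The final segmentation step implied by this slide, mapping every pixel to its nearest cluster center, can be sketched as follows (a generic nearest-center assignment, assumed rather than taken from the paper):

```python
import numpy as np

def segment(image, centers):
    """Label each pixel with the index of its nearest cluster center
    (Euclidean distance in color space); returns the label map and the
    color-quantized image."""
    h, w, _ = image.shape
    flat = image.reshape(-1, 3).astype(float)
    d2 = ((flat[:, None, :] - centers[None, :, :]) ** 2).sum(axis=2)
    labels = d2.argmin(axis=1)
    return labels.reshape(h, w), centers[labels].reshape(h, w, 3)
```

Feeding it the SOM-SA cluster centers reduces the image to one representative color per cluster, which is the segmented result shown.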

New procedure of supervised learning
The RCE neural network is a supervised pattern classifier used for the estimation of a feature region by hyperspherical windows.

Drawback of RCE learning: it requires a complete sample set for all classes. To segment the object of interest from the image background, it needs samples of both the object and the image background.

Hierarchical Prototype Learning
In some regions a small prototype size is appropriate; in other regions a large prototype size is more suitable. The proper way to estimate a region is therefore to use color prototypes of different sizes.
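The slides do not spell out the HPL algorithm, so the following is only one plausible reading of "different sizes of prototypes": try to cover the training colors with large hyperspheres first, and fall back to smaller radii wherever a large sphere would mix classes. Every detail here (the greedy covering, the radius schedule, the function names) is a hypothetical illustration, not the paper's method.

```python
import numpy as np

def hierarchical_prototypes(samples, labels, radii=(100.0, 10.0, 2.0)):
    """Hypothetical coarse-to-fine sketch: keep a hypersphere as a
    prototype only if every training sample inside it shares one class."""
    prototypes = []                                  # (center, radius, label)
    uncovered = np.ones(len(samples), dtype=bool)
    for r in radii:                                  # large radii first
        for i in np.where(uncovered)[0]:
            if not uncovered[i]:
                continue
            c = samples[i]
            inside = np.linalg.norm(samples - c, axis=1) <= r
            if np.all(labels[inside] == labels[i]):  # class-pure sphere
                prototypes.append((c, r, labels[i]))
                uncovered &= ~inside
    return prototypes

def classify(x, prototypes, default=-1):
    """Assign x the label of the first prototype sphere containing it."""
    for c, r, lab in prototypes:
        if np.linalg.norm(x - c) <= r:
            return lab
    return default
```

Under this reading, homogeneous color regions end up covered by a few large prototypes while regions near class boundaries are approximated by many small ones, matching the slide's motivation.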

Hierarchical Prototype Learning

EXPERIMENTAL EVALUATIONS
(a) Original color image. (b) SOM color clustering, Q = . (c) SA clustering, Q = . (d) SOM-SA color clustering, Q = . (e) CL-SA color clustering, Q = .

Supervised segmentation
(a) Original gesture image. (b) HPL learning. (c) Color threshold. (d) Color histogram.
(a) Original hand gesture images. (b) Segmentation of hand gestures.