Random Neural Network Texture Model

Presentation transcript:

Random Neural Network Texture Model
Erol Gelenbe, Khaled Hussain, and Hossam Abdelbaki

A. Introduction
B. Color Change Mechanism in Chameleon
C. Texture Learning Algorithm
D. Texture Generation Algorithm
E. Experimental Evaluation
F. Conclusions

A. Introduction There is no generally accepted definition of texture, yet texture analysis is considered one of the most important subjects in image processing and computer vision. Extracting texture features is a crucial task: if one could model and quantify the process by which humans recognize texture, one could construct a highly successful recognition system. Unfortunately, that process is not fully understood, and researchers are left to consider alternative techniques.

A. Introduction For texture synthesis, Markov random fields (MRFs) have been used extensively because they capture the local (spatial) contextual information in a texture and can generate a similar texture from the extracted parameters. However, the operations performed during texture generation are very time consuming. The idea of using neural networks to learn and regenerate textures was inspired by the color-change mechanisms of certain animals, which can alter their colors and color patterns to camouflage themselves effectively against a variety of backgrounds.

A. Introduction In this presentation, we introduce a novel method for texture modeling (learning) and synthesis using the random neural network (RNN) model. The model has been successfully applied to generating synthetic textures whose features, such as granularity and inclination, are similar to those produced by the MRF model, but with a tremendous reduction in generation time compared with the MRF.

B. Color Change Mechanism in Chameleon

C. Texture Learning Algorithm Here we describe the procedure for extracting features from a given texture image by training the RNN and encoding those features into the weight matrices of the network. The resulting weights can then be used to generate textures with characteristics similar to those of the original texture.

C. Texture Learning Algorithm
1. Initialize the weight matrices W+ and W- to random values between 0 and 1.
2. Set the exogenous excitatory inputs Λ_k and the desired outputs y_k to the normalized pixel values in the window, and set the exogenous inhibitory inputs λ_k to 0.0.
3. Solve the nonlinear system (the RNN signal-flow equations; see below) to obtain the actual neuron outputs q.
4. Adjust the network parameters to minimize the cost function E_k (see below).
5. For each successive desired input-output pair, indexed by k, the n x n weight matrices are adjusted after applying each input.
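The equations referenced in steps 3 and 4 appear only as images on the original slides and are not in the transcript. As a point of reference, the following is the standard form of the random neural network signal-flow equations and of the quadratic cost used in Gelenbe's learning algorithm; the slides' exact notation may differ slightly.

```latex
% Steady-state (signal-flow) equations of the random neural network:
% q_i is the probability that neuron i is excited, Lambda_i / lambda_i are the
% exogenous excitatory / inhibitory arrival rates, and r(i) is the firing rate.
q_i = \frac{\lambda^{+}(i)}{r(i) + \lambda^{-}(i)}, \qquad
\lambda^{+}(i) = \sum_{j} q_j\, w^{+}_{ji} + \Lambda_i, \qquad
\lambda^{-}(i) = \sum_{j} q_j\, w^{-}_{ji} + \lambda_i, \qquad
r(i) = \sum_{j} \bigl( w^{+}_{ij} + w^{-}_{ij} \bigr)

% Quadratic cost for the k-th input-output pair (a_i in {0,1} selects the
% output neurons), minimized in step 4 by adjusting the weights:
E_k = \frac{1}{2} \sum_{i=1}^{n} a_i \left( q_i - y_{ik} \right)^{2}
```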

D. Texture Synthesis (Generation) Procedure The random neural network we propose for generating artificial textures associates a neuron I(i, j) with each pixel (i, j) in the image plane. The state f(i, j) can be interpreted as the gray-level value of the pixel at (i, j). The topology of the proposed random network for texture generation is described below.

D. Texture Synthesis (Generation) Procedure Each neuron in the network is, in general, connected to at most eight neighbors. We use the labels X1 through X8 for the neighboring positions, where x denotes an arbitrary pixel (i, j). For instance, X1 denotes the pixel at (i-1, j+1).
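Only X1 = (i-1, j+1) is stated explicitly in the transcript; the full labelling below is an assumed convention chosen for illustration, not necessarily the one used on the missing topology figure.

```python
# Assumed labelling of the eight neighbours of pixel x = (i, j). Only X1 is
# given in the slides; the remaining labels are an illustrative convention.
def neighbours(i, j):
    return {
        "X1": (i - 1, j + 1), "X2": (i - 1, j), "X3": (i - 1, j - 1),
        "X4": (i, j - 1),     "X5": (i + 1, j - 1), "X6": (i + 1, j),
        "X7": (i + 1, j + 1), "X8": (i, j + 1),
    }
```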

D. Texture Generation Algorithm
1. Specify the values of the weights.
2. Generate at random a value between 0 and 1 for each pixel x = (i, j) and assign it to the variable q(x).
3. Start with an image generated by coloring each point with gray level l, where l is chosen with equal probability from the set {0, 1, 2, ..., G-1} and G is the number of gray levels.
4. For k = 0 up to k = K (the stopping condition), iterate on the RNN equations to compute the updated q(x) values (a sketch of this loop is given below).
It should be noted that the network weights can either be chosen according to some criteria to generate synthetic textures with predefined characteristics, or result from training the RNN on a particular texture image.
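A minimal sketch of the generation loop, assuming that each pixel's neuron interacts only with its eight neighbours, that the weights depend only on the neighbour offset, and that the final q(x) values are quantized to G gray levels; wrap-around boundaries are used purely for brevity. The function and parameter names (generate_texture, w_plus, w_minus, Lambda, lam) are illustrative, not the paper's notation.

```python
import numpy as np

# Neighbour offsets (di, dj) of a pixel; see the labelling sketch above.
OFFSETS = [(-1, 1), (-1, 0), (-1, -1), (0, -1), (1, -1), (1, 0), (1, 1), (0, 1)]

def generate_texture(w_plus, w_minus, Lambda, lam, shape=(128, 128), G=8, K=10, seed=0):
    """Iterate the RNN signal-flow equation over the pixel lattice.

    w_plus / w_minus map a neighbour offset to an excitatory / inhibitory
    weight; Lambda / lam are exogenous excitatory / inhibitory rates
    (illustrative parametrisation).
    """
    rng = np.random.default_rng(seed)
    q = rng.random(shape)                               # step 2: random q(x) in (0, 1)
    r = sum(w_plus[o] + w_minus[o] for o in OFFSETS)    # firing rate of each neuron
    for _ in range(K):                                  # step 4: iterate up to k = K
        exc = np.full(shape, Lambda, dtype=float)
        inh = np.full(shape, lam, dtype=float)
        for di, dj in OFFSETS:
            # q value of the neighbour at (i + di, j + dj), wrap-around borders
            neighbour_q = np.roll(q, shift=(-di, -dj), axis=(0, 1))
            exc += w_plus[(di, dj)] * neighbour_q
            inh += w_minus[(di, dj)] * neighbour_q
        q = np.clip(exc / (r + inh), 0.0, 1.0)
    # Quantize q(x) to G gray levels to obtain the output image.
    return np.minimum(np.floor(q * G), G - 1).astype(int)

# Example call with uniform, purely illustrative weight values.
w_plus = {o: 0.05 for o in OFFSETS}
w_minus = {o: 0.02 for o in OFFSETS}
texture = generate_texture(w_plus, w_minus, Lambda=0.1, lam=0.05)
```

In the method described above the weights come either from training or from a priori specification, which is what controls properties such as granularity and inclination; the uniform values in the example exist only to make the sketch executable.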

Modeling Synthetic Textures In these simulations, we begin by assuming specific weights for the RNN and generating a synthetic texture. The texture learning procedure is then applied to this synthetic texture image with a 3 x 3 training window, and the RNN weights are estimated. The estimated weights are used to generate another synthetic texture (again using the generation procedure), and the original and final textures are compared. Although all our experiments yield visually similar realizations, as shown in the figures, we also compute statistical features derived from the co-occurrence matrix, such as the energy, contrast, entropy, and homogeneity of the textures.

Synthetic binary textures generated with specified (left) and estimated (right) network parameters

Synthetic gray-level textures generated with specified (left) and estimated (right) network parameters

Natural (left) and synthetic (right) textures

Co-occurrence Matrix Given a 4 x 4 image that contains 3 different gray levels, the 3 x 3 gray-level co-occurrence matrix for a displacement vector d = (dx, dy) = (1, 0) counts how often each pair of gray levels occurs at pixel positions separated by d (a worked sketch follows below).
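The slide's example image and its co-occurrence matrix are figures that did not survive the transcript. The sketch below computes a gray-level co-occurrence matrix for d = (1, 0) on an arbitrary 4 x 4 image with gray levels {0, 1, 2}, so the particular numbers are illustrative; dx is treated as a column offset and dy as a row offset, which is one common convention.

```python
import numpy as np

def glcm(image, d=(1, 0), levels=3):
    """Gray-level co-occurrence matrix for displacement d = (dx, dy).

    C[a, b] counts pixel pairs where the reference pixel has level a and the
    pixel displaced by d has level b (no normalisation, no symmetrisation).
    """
    dx, dy = d
    C = np.zeros((levels, levels), dtype=int)
    rows, cols = image.shape
    for i in range(rows):
        for j in range(cols):
            i2, j2 = i + dy, j + dx
            if 0 <= i2 < rows and 0 <= j2 < cols:
                C[image[i, j], image[i2, j2]] += 1
    return C

# Illustrative 4 x 4 image with 3 gray levels (not the slide's example).
img = np.array([[0, 0, 1, 1],
                [0, 0, 1, 1],
                [0, 2, 2, 2],
                [2, 2, 1, 0]])
print(glcm(img, d=(1, 0)))   # 3 x 3 matrix of horizontal co-occurrence counts
```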

Co-occurrence Matrix Features Basic co-occurrence matrix statistical features (standard definitions are given below)
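The feature formulas on this slide are likewise images missing from the transcript. Shown below are the standard definitions of the four features named earlier, with P(i, j) the normalised co-occurrence matrix; homogeneity in particular has several common variants, so treat these as typical formulations rather than the slide's exact ones.

```latex
% Standard co-occurrence (GLCM) features, with P(i,j) normalised to sum to 1:
\text{Energy}      = \sum_{i}\sum_{j} P(i,j)^{2}, \qquad
\text{Contrast}    = \sum_{i}\sum_{j} (i-j)^{2}\, P(i,j),

\text{Entropy}     = -\sum_{i}\sum_{j} P(i,j)\,\log P(i,j), \qquad
\text{Homogeneity} = \sum_{i}\sum_{j} \frac{P(i,j)}{1 + \lvert i-j \rvert}
```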

Binary synthetic texture (iterations 1, 3, 6, and 9)
Binary synthetic texture after training the RNN (iterations 1, 3, 6, and 9)

Binary synthetic texture after training (iterations 1, 3, 6, and 9)

F. Conclusions