Chapter 4 Chaos in the High-dimensional Nonlinear System.


CONTENT

4.1 Chaos in the Neural Network
4.2 Symmetric Generalized Lorenz Strange Attractors
4.3 Symmetric Generalized Rössler Strange Attractors

4.1.1 The emergence and development of neural network theory

Artificial neural networks have become an important topic of research in nonlinear science. The study of artificial neural networks began in 1962, when Rosenblatt put forward the perceptron model.

The renaissance of neural network research came in the 1980s. Models of linked neurons with parallel and distributed processing seemed to be the only way to a brand-new kind of computer. With the emergence of supercomputers, the study of neural network theory made great achievements.

Neural networks have demonstrated potential in a number of areas. In 1986 it was discovered that there are chaotic phenomena in the brain. From the 1990s, people began to study neural networks combined with chaos, that is, to investigate chaotic phenomena in the brain.

4.1.2 Chaotic neural network research

1. What is the role of chaos in our brain? Babloyantz et al. argue that chaos can enhance the resonance capacity of the brain, allowing a very broad response to outside stimuli. Nicolis regards chaos as a generator of self-referential logic.

Amit recognizes that chaos does not hamper the learning of new patterns; on the contrary, without chaos a network may only reinforce previously stored patterns rather than learn new ones. Tsuda has shown that a chaotic neural network composed of low-dimensional chaotic dynamics has a strong ability to convey information from outside effectively, especially time-varying external input.

2. In studying chaotic neural networks, people have tried to use chaos to achieve particular functions. Ikeguchi et al. tried to apply chaotic neural networks to associative memory. Parodi et al. used pseudo-random chaotic behavior to create a noise pattern corresponding to a given input pattern, thereby achieving information coding.

Nara et al. used the complex dynamics of non-symmetric recurrent neural networks to search spaces of complex patterns, showing that chaotic search along complex orbits in the activation state space is more effective than random search, and that the dynamical structure of the chaotic orbits plays an important role in the effectiveness of the search.

Yoshizawa studied neural networks with non-monotonic activation and showed that chaos can be used to eliminate spurious memory states: when the network cannot recall a memory correctly, it exhibits chaotic behavior instead of returning a spurious memory.

3. Ways of generating chaos in a neural network system: up to now, chaos has been found to arise through period-doubling bifurcations and through homoclinic and heteroclinic orbits. Besides these, the Ruelle-Takens-Newhouse route is also a common way to generate chaos in neural networks.
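As an illustration of the period-doubling route mentioned above, the logistic map x → r·x·(1−x) (a stand-in for the network dynamics, not the model of this chapter) passes through attractors of period 1, 2, 4, … before becoming chaotic. A minimal sketch that detects the attractor period at a few parameter values:

```python
# Period-doubling route to chaos, illustrated with the logistic map
# x -> r*x*(1-x).  The helper names below are illustrative, not from the text.
def attractor_period(r, n_transient=2000, max_period=16, tol=1e-6):
    """Return the period of the logistic-map attractor, or 0 if no
    period <= max_period is found (suggesting chaos)."""
    x = 0.5
    for _ in range(n_transient):          # discard the transient
        x = r * x * (1.0 - x)
    orbit = [x]
    for _ in range(max_period):
        x = r * x * (1.0 - x)
        orbit.append(x)
    for p in range(1, max_period + 1):    # smallest p with x_{n+p} = x_n
        if abs(orbit[p] - orbit[0]) < tol:
            return p
    return 0

# r = 2.9 -> period 1; 3.2 -> 2; 3.5 -> 4; 3.9 -> 0 (chaotic)
for r in (2.9, 3.2, 3.5, 3.9):
    print(r, attractor_period(r))
```

Each doubling 1 → 2 → 4 → … accumulates toward the onset of chaos near r ≈ 3.57, after which no short period is found.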

4.1.3 Physiological structure of neural networks

Neurons are the basic units composing neural networks. Despite the diversity of neurons and their micro-structure, from the information-processing point of view a neuron can be seen as a basic information-processing unit composed of the cell body, the axon, and the dendrites, as shown in Fig. 4.1.

A neural network is characterized by a large number of processing units and their mutual connections. Although each unit is simple, the nonlinearity makes the collective behavior very complicated, and the network has the capacity for parallel and distributed processing.

From the dynamical point of view, the brain's neural network is a super-high-dimensional, highly complex nonlinear system formed by linking a vast number of neurons. The main functions of the network, forward learning and feedback associative memory, can both be achieved by its dynamical subsystems (the state-value dynamical subsystem and the weight dynamical subsystem).

These functions of a neural network are realized through the evolution of the network dynamics. Artificial neural networks are based on the physiology of the human brain, with the aim of achieving certain functions by simulating the mechanisms of the brain.

4.1.4 Three-layer feedback neural network model

This section studies a three-layer feedback neural network model with chaotic characteristics: using the Lyapunov-exponent criterion to construct strange attractors of the network, then analyzing the characteristics of their trajectories, calculating the correlation dimension dim_c of the strange attractors, and finally seeking applications based on this research.

The input layer contains D units, the hidden layer N units, and the output layer D units. The transformation function of the hidden layer is tanh, and each output of the network is a linear combination of the hidden-layer outputs. Denoting the inputs by x_j, the input-to-hidden weights by w_ij, and the hidden-to-output weights by v_ki, this reads

h_i = tanh( Σ_{j=1..D} w_ij x_j ),  i = 1, …, N  (4.1)

y_k = Σ_{i=1..N} v_ki h_i,  k = 1, …, D  (4.2)

In eq. (4.1), tanh is defined by

tanh(u) = (e^u − e^(−u)) / (e^u + e^(−u))  (4.3)

With every iteration, each output is fed back to the corresponding input, which can be expressed as

x_k(t+1) = y_k(t),  k = 1, …, D  (4.4)
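The iteration of eqs. (4.1)-(4.4) can be sketched in a few lines. The weight values below are hypothetical (drawn at random), since the text does not list the ones actually used; a given draw may settle on a fixed point, a limit cycle, or a chaotic orbit.

```python
import math
import random

# Sketch of the feedback network of eqs. (4.1)-(4.4): D inputs, N tanh
# hidden units, a linear output layer of D units, each output fed back
# to the corresponding input at the next step.  Weights are hypothetical.
random.seed(1)
D, N = 4, 4
W = [[random.uniform(-2, 2) for _ in range(D)] for _ in range(N)]  # input -> hidden
V = [[random.uniform(-2, 2) for _ in range(N)] for _ in range(D)]  # hidden -> output

def step(x):
    # eq (4.1): hidden activations; eq (4.2): linear outputs
    h = [math.tanh(sum(W[i][j] * x[j] for j in range(D))) for i in range(N)]
    return [sum(V[k][i] * h[i] for i in range(N)) for k in range(D)]

x = [0.1] * D
for _ in range(1000):      # eq (4.4): feed the outputs back as inputs
    x = step(x)
print(x)                   # the state stays bounded (tanh is bounded)
```

Because tanh is bounded by 1, every output is bounded by the sum of the absolute output weights, so the iteration can never diverge, which is what makes such networks convenient strange-attractor generators.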

The strange attractor of the neural network. The network outputs D numerical sequences of infinite length, and these output sequences are used as the time series of independent variables to construct the attractor. Parameters are selected as follows: N = 4; one output is taken as the transverse coordinate and another as the longitudinal coordinate; a third expresses the height (projected in the lower right corner of the map), and a fourth is mapped to a linear palette of 16 colors.

The author also attempted to replace the hidden-layer transformation function tanh with other functions, such as the asymmetric logistic map of eq. (4.5), and to construct strange attractors by the method described above. Figure 4.3 shows a set of representative results:

Fig. 4.3 Strange attractors of the neural network: in panels (a)-(d) the hidden-layer transformation function is tanh (eq. 4.1); in panels (e)-(h) it is the asymmetric logistic map (eq. 4.5).

4.1.5 Quantitative analysis of the strange attractors of the neural network

Following the method put forward by Wolf et al., the author wrote a FORTRAN program to compute the largest Lyapunov exponent λ1; to speed up the search for replacement points, the quicksort algorithm was used. To verify the correctness of the procedure, the author calculated λ1 for the Hénon map:

x_{n+1} = 1 − a·x_n² + y_n,  y_{n+1} = b·x_n  (4.6)
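A minimal two-trajectory variant of the Wolf et al. idea (repeatedly renormalizing a nearby companion orbit) can be checked, as in the text, against the Hénon map, whose largest Lyapunov exponent at the classical parameters a = 1.4, b = 0.3 is known to be about 0.42. This Python sketch stands in for the author's FORTRAN program:

```python
import math

# Two-trajectory estimate of the largest Lyapunov exponent lambda_1,
# verified on the Henon map (eq. 4.6) at a = 1.4, b = 0.3.
def henon(p, a=1.4, b=0.3):
    x, y = p
    return (1.0 - a * x * x + y, b * x)

def largest_lyapunov(f, p0, n=20000, d0=1e-8):
    p = p0
    for _ in range(1000):                 # settle onto the attractor
        p = f(p)
    q = (p[0] + d0, p[1])                 # nearby companion trajectory
    s = 0.0
    for _ in range(n):
        p, q = f(p), f(q)
        d = math.hypot(q[0] - p[0], q[1] - p[1])
        s += math.log(d / d0)
        # renormalize: pull the companion back to distance d0 along
        # the current separation direction
        q = (p[0] + d0 * (q[0] - p[0]) / d, p[1] + d0 * (q[1] - p[1]) / d)
    return s / n

print(largest_lyapunov(henon, (0.1, 0.1)))   # ~ 0.42
```

A positive λ1 is the quantitative signature of chaos; the same routine applied to the network's output series is what Tables 4.1 and following report.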

Table 4.1 shows the computed λ1 of the neural network's strange attractors.

Table 4.2 shows the correlation dimension D2 (dim_c) of the Hénon map.

Fig. 4.4 ln C(ε) ~ ln ε curves of the Hénon map: curve 1, m = 4; curve 2, m = 5.

The network was iterated 90,000 times to obtain the time series of the attractor's abscissa, from which 2,000 data points were selected. For the attractors of Figure 4.3, Figure 4.5 shows the corresponding ln C(ε) ~ ln ε curves of the strange attractors; the resulting values of D2 are listed in Table 4.3.
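The ln C(ε) ~ ln ε curves come from the correlation integral in the sense of Grassberger and Procaccia, whose log-log slope estimates D2. A sketch, tested on the Hénon attractor (whose D2 is about 1.2) rather than on the network data, which are not reproduced here:

```python
import math

# Grassberger-Procaccia correlation integral C(eps); the slope of
# ln C versus ln eps estimates the correlation dimension D2.
def henon_series(n, a=1.4, b=0.3):
    x, y, pts = 0.1, 0.1, []
    for i in range(n + 1000):
        x, y = 1.0 - a * x * x + y, b * x
        if i >= 1000:                     # drop the transient
            pts.append((x, y))
    return pts

def corr_integral(pts, eps):
    # fraction of point pairs closer than eps
    n, c = len(pts), 0
    for i in range(n):
        for j in range(i + 1, n):
            dx = pts[i][0] - pts[j][0]
            dy = pts[i][1] - pts[j][1]
            if dx * dx + dy * dy < eps * eps:
                c += 1
    return 2.0 * c / (n * (n - 1))

pts = henon_series(1500)
e1, e2 = 0.05, 0.2
d2 = (math.log(corr_integral(pts, e2)) -
      math.log(corr_integral(pts, e1))) / math.log(e2 / e1)
print(d2)   # roughly 1.2 for the Henon attractor
```

In practice one fits the slope over a range of ε in the scaling region and repeats for several embedding dimensions m, which is exactly what the curves of Figs. 4.4 and 4.5 display.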

Fig. 4.5 ln C(ε) ~ ln ε curves of the strange attractors of the neural network, panels (a)-(h).

Table 4.3 Correlation dimension D2 (dim_c) of the neural network's strange attractors.

The size of the fractal dimension reflects the extent of the space occupied by the fractal structure of the neural network's strange attractors: the larger the dimension of a strange attractor, the greater the space it occupies, the denser its structure, and the more complex the system; conversely, the sparser the structure, the simpler the system.

4.1.6 Conclusion

In this section the author studied the dynamical behavior of a chaotic artificial neural network. The results showed that, unlike a conventional neural network characterized only by gradient descent, the chaotic neural network has much richer dynamics and operates far from equilibrium. At the same time, various attractors exist: not only fixed points, limit cycles, and tori, but also strange attractors.

Because a neural network is a super-high-dimensional and strongly nonlinear dynamical system, despite the successes obtained so far, current theory and the basic understanding of its dynamical behavior are still insufficient. To develop neural network theory further and to apply it, it is necessary to study its dynamics in more depth.

4.2 Symmetric Generalized Lorenz Strange Attractors

The forced dissipative system discussed by Lorenz is:

dx/dt = σ(y − x)
dy/dt = rx − y − xz  (4.7)
dz/dt = xy − bz

Lorenz found that when the Rayleigh number r exceeds a critical value, the system behaves chaotically.
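A sketch of the system at the classical parameter values σ = 10, b = 8/3, r = 28 (r above the critical value, which is about 24.74 for these σ and b) shows the sensitive dependence on initial conditions that marks chaos; the integrator is a plain fourth-order Runge-Kutta step, and the classical Lorenz form above is assumed:

```python
# Two Lorenz runs whose initial x differ by 1e-9 separate to order one.
def lorenz(s, sigma=10.0, r=28.0, b=8.0 / 3.0):
    x, y, z = s
    return (sigma * (y - x), r * x - y - x * z, x * y - b * z)

def rk4(f, s, h):
    # one classical fourth-order Runge-Kutta step of size h
    k1 = f(s)
    k2 = f(tuple(s[i] + 0.5 * h * k1[i] for i in range(3)))
    k3 = f(tuple(s[i] + 0.5 * h * k2[i] for i in range(3)))
    k4 = f(tuple(s[i] + h * k3[i] for i in range(3)))
    return tuple(s[i] + h / 6.0 * (k1[i] + 2 * k2[i] + 2 * k3[i] + k4[i])
                 for i in range(3))

a = (1.0, 1.0, 1.0)
b2 = (1.0 + 1e-9, 1.0, 1.0)     # same start, perturbed by 1e-9 in x
maxsep = 0.0
for _ in range(4000):           # integrate to t = 40 with h = 0.01
    a, b2 = rk4(lorenz, a, 0.01), rk4(lorenz, b2, 0.01)
    maxsep = max(maxsep, abs(a[0] - b2[0]))
print(maxsep)                   # order one: the runs have fully diverged
```

The tiny perturbation is amplified roughly as e^(λ1·t) with λ1 ≈ 0.9 until it saturates at the size of the attractor.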

4.2.1 Results and analysis

Based on the Lorenz equations, the generalized form is given as follows:

The following formula converts rectangular coordinates into polar coordinates; the formula after it converts the polar coordinates back into rectangular coordinates:
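The two coordinate changes can be sketched as a simple round trip; these are the generic rectangular↔polar formulas, since the chapter's specific transformation (4.9)-(4.10) is not reproduced here:

```python
import math

# Rectangular -> polar, then polar -> rectangular (a round trip).
def to_polar(x, y):
    return math.hypot(x, y), math.atan2(y, x)

def to_rect(r, theta):
    return r * math.cos(theta), r * math.sin(theta)

x, y = 3.0, 4.0
r, th = to_polar(x, y)          # r = 5.0
print(to_rect(r, th))           # ~ (3.0, 4.0) up to rounding
```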

To construct the strange attractor A of equation (4.8), the method is to choose an appropriate time interval; from eq. (4.11) the nonlinear difference equation can be obtained.

Theorem 4.1 For the strange attractor A of equation (4.8), after completing the coordinate transformations (4.9) and (4.10), there is

Table 4.4 Construction of the strange attractors of the generalized Lorenz equations:

Fig. 4.6 Strange attractors of the generalized Lorenz equations, panels (a1)-(a4), (b1)-(b4), (c)-(h).

Fig. 4.7 ln C(ε) ~ ln ε curves of the generalized Lorenz strange attractors, panels (a)-(h).

Table 4.6 Correlation dimension D2 (dim_c) of the generalized Lorenz strange attractors. Columns: panel of Fig. 4.6 ((a)-(h)), embedding dimension m, correlation dimension D2, mean-square error σ (of order 10⁻²), and average D2.

4.2.2 Conclusion

Chaos is a complex form of motion of a dynamical system governed by a deterministic equation (in differential or discrete form). As May pointed out in 1976, a simple dynamical system need not have simple dynamical properties. Chaos is generated by repeated stretching and folding, and since stretching and folding is not a one-to-one (i.e. invertible) mapping, it is possible only in nonlinear systems. That is, chaos can appear only in nonlinear systems.

The existence of chaos is related not only to the nonlinear characteristics of the system (the form of the nonlinear equations) but also to the values of the parameters in the equations; for example, the logistic map is chaotic only for certain parameter values. Therefore the existence of chaos is often related to the bifurcations of the nonlinear system. Because of the stretching and folding, the motion of a chaotic system (such as the iterative path of a representative point) is very sensitive to initial conditions: small differences in initial conditions are likely to cause great differences in the course of the iteration. These conclusions also hold for the chaotic solutions of differential equations.
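The sensitivity described above is easy to exhibit with the logistic map at a parameter value where it is chaotic (r = 4): two orbits that start 10⁻¹⁰ apart decorrelate within a few dozen iterations.

```python
# Sensitive dependence on initial conditions in the logistic map
# x -> 4x(1-x), a parameter value at which the map is chaotic.
def orbit(x0, n, r=4.0):
    xs = [x0]
    for _ in range(n):
        xs.append(r * xs[-1] * (1.0 - xs[-1]))
    return xs

a = orbit(0.2, 60)
b = orbit(0.2 + 1e-10, 60)
print(max(abs(u - v) for u, v in zip(a, b)))   # grows to order one
```

The initial difference of 10⁻¹⁰ roughly doubles each step (the Lyapunov exponent of this map is ln 2), so after about 33 iterations it has reached the size of the whole interval.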

4.3 Symmetric generalized Rössler strange attractors

The Rössler equation is a very simple set of nonlinear ordinary differential equations given by Rössler in 1976, obtained through an appropriate scaling transformation while he was studying chemical reactions with intermediate products. Since then, the chaos generated by the Rössler equation has been studied in order to enrich chaos theory.
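For reference, the classical 1976 Rössler system (the chapter's generalized form (4.14) is not reproduced here) can be integrated with a plain Runge-Kutta step; at the standard parameters a = b = 0.2, c = 5.7 the orbit settles onto the familiar chaotic band:

```python
# Classical Rossler system at the standard chaotic parameters.
def rossler(s, a=0.2, b=0.2, c=5.7):
    x, y, z = s
    return (-y - z, x + a * y, b + z * (x - c))

def rk4(f, s, h):
    # one classical fourth-order Runge-Kutta step of size h
    k1 = f(s)
    k2 = f(tuple(s[i] + 0.5 * h * k1[i] for i in range(3)))
    k3 = f(tuple(s[i] + 0.5 * h * k2[i] for i in range(3)))
    k4 = f(tuple(s[i] + h * k3[i] for i in range(3)))
    return tuple(s[i] + h / 6.0 * (k1[i] + 2 * k2[i] + 2 * k3[i] + k4[i])
                 for i in range(3))

s = (1.0, 1.0, 1.0)
traj = []
for _ in range(20000):          # integrate to t = 200 with h = 0.01
    s = rk4(rossler, s, 0.01)
    traj.append(s)
xs = [p[0] for p in traj]
print(min(xs), max(xs))         # the orbit stays on a bounded band
```

The orbit remains bounded but never repeats, spiraling outward in the (x, y) plane and being folded back by the z variable.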

4.3.1 Results

The generalized form based on the Rössler equation is given as eq. (4.14). The generalized Rössler equation is converted into a nonlinear difference equation by a first-order difference scheme.  (4.14)

From eq. (4.14) the nonlinear difference equations (4.15) and (4.16) can be obtained.

Theorem 4.2 For the strange attractor A of equation (4.14), under the coordinate transformation above, the stated invariance holds. Theorem 4.2 explains that the strange attractors have a structure with the characteristics of rotational symmetry.

Construction of the strange attractors of the generalized Rössler equations: selection of the control parameters, the total number of sectors, and the nesting factor.

Fig. 4.8 Strange attractors of the generalized Rössler equations, panels (a1)-(a4), (b1)-(b4), (c)-(h).

The largest Lyapunov exponent λ1 of the strange attractors of the generalized Rössler equation:

Fig. 4.9 ln C(ε) ~ ln ε curves of the generalized Rössler strange attractors, panels (a)-(h).

Table 4.9 Correlation dimension D2 (dim_c) of the generalized Rössler strange attractors.

The size of the fractal dimension reflects the space occupied by the generalized Rössler strange attractors, which have fractal structure: the greater the dimension, the greater the space occupied and the more compact the structure; conversely, the sparser the structure.

4.3.2 Conclusion

In this section we constructed generalized Rössler strange attractors with a rotationally symmetric structure and analyzed the characteristics of that structure. As can be seen, discrete mappings and the continuous flows determined by differential equations obey some common laws, which shows that there is an intrinsic relationship between discrete mappings and differential equations.