Chapter 4 Chaos in High-dimensional Nonlinear Systems
CONTENTS
4.1 Chaos in Neural Networks
4.2 Symmetric Generalized Lorenz Strange Attractors
4.3 Symmetric Generalized Rössler Strange Attractors
4.1.1 The emergence and development of neural network theory
The artificial neural network has become an important topic of research in nonlinear science. The study of artificial neural networks began in 1962, when Rosenblatt put forward the perceptron model.
The renaissance of neural network research came in the 1980s. Models of interconnected neurons with parallel, distributed processing appeared to be the only path toward a fundamentally new kind of computer. With the emergence of supercomputers, the study of neural network theory made great advances.
Neural networks have demonstrated potential in a number of areas. In 1986 it was discovered that chaotic phenomena occur in the brain. Since then, people have begun to study neural networks combined with chaos, that is, to investigate chaotic phenomena in the brain.
4.1.2 Chaotic neural network research
1. What is the role of chaos in the brain? Babloyantz et al. argue that chaos can enhance the resonant capacity of the brain, allowing it to respond to a very wide range of outside stimuli. Nicolis regards chaos as the generator of self-referential logic.
Amit holds that chaos does not hamper the learning of new patterns; on the contrary, without chaos a network may only reinforce previously stored patterns rather than learn new ones. Tsuda has shown that a chaotic neural network composed of low-dimensional chaotic dynamics has a strong ability to convey information from the outside effectively, especially time-varying external input.
2. With the study of chaotic neural networks, people have tried to use chaos to achieve particular functions. Ikeguchi et al. applied chaotic neural networks to associative memory. Parodi et al. used the pseudo-random character of chaotic behavior to create a noise model corresponding to a given input pattern, thereby achieving information coding.
Nara et al. used the complex dynamics of non-symmetric recurrent neural networks to search for complex patterns, showing that a chaotic search along complex orbits in the activation state space is more effective than random search, and that the dynamical structure of the chaotic orbit plays an important role in the effectiveness of the search.
Yoshizawa studied neural networks with non-monotonic activation and showed that chaos can be used to eliminate spurious memory states of the network: when the network cannot recall a stored memory correctly, it exhibits chaotic behavior instead of returning a spurious memory.
3. Routes to chaos in neural network systems: up to now, chaos has been found to arise through period-doubling bifurcations and through homoclinic and heteroclinic orbits. Besides these, the Ruelle-Takens-Newhouse route is also a common way chaos arises in neural networks.
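The period-doubling route mentioned above is most easily illustrated not with a full network but with the classical logistic map, a minimal sketch: as the parameter mu grows, the attracting orbit doubles its period (1, 2, 4, ...) until chaos sets in near mu ≈ 3.5699.

```python
# Period-doubling route to chaos illustrated with the logistic map
# x_{n+1} = mu * x_n * (1 - x_n).

def logistic_orbit(mu, x0=0.5, transient=1000, keep=64):
    x = x0
    for _ in range(transient):      # discard the transient iterations
        x = mu * x * (1.0 - x)
    orbit = []
    for _ in range(keep):           # record the settled orbit
        x = mu * x * (1.0 - x)
        orbit.append(round(x, 6))   # rounding groups numerically close points
    return sorted(set(orbit))

for mu in (2.8, 3.2, 3.5, 3.9):
    points = logistic_orbit(mu)
    print(f"mu = {mu}: {len(points)} distinct orbit point(s)")
```

At mu = 2.8 the orbit is a fixed point, at 3.2 a 2-cycle, at 3.5 a 4-cycle, and at 3.9 it is chaotic, visiting many distinct points.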
4.1.3 Physiological structure of neural networks
Neurons are the basic units composing neural networks. Despite the diversity and intricate microstructure of neurons, from the information-processing point of view a neuron can be regarded as a basic processing unit composed of the cell body, the axon, and the dendrites, as shown in Fig. 4.1.
A neural network is characterized by a large number of processing units and their mutual connections. Although each unit is simple, the nonlinearity makes the collective behavior very complicated, and the network has the capacity for parallel, distributed processing.
From the dynamical point of view, the brain's neural network is a super-high-dimensional nonlinear system formed by a huge number of highly interconnected neurons. The main functions of the network, forward learning and feedback associative memory, can both be realized by its dynamical subsystems (the state-value dynamical subsystem and the weight dynamical subsystem).
These functions are achieved through the evolution of the network dynamics. Artificial neural networks are based on the physiology of the human brain, with the aim of achieving certain functions by simulating its mechanisms.
4.1.4 Three-feedback neural network model
This section studies a three-layer feedback neural network model with chaotic characteristics: the Lyapunov-exponent criterion is used to construct strange attractors of the network, the characteristics of their trajectories are analyzed, the correlation dimension D2 of the strange attractors is computed, and finally applications based on these results are sought.
The input layer contains D units, the hidden layer N units, and the output layer D units. The transfer function of the hidden layer is tanh, and each output of the network is a linear combination of the hidden-layer outputs:

y_j = tanh( Σ_{i=1}^{D} w_{ji} x_i + θ_j ),  j = 1, …, N        (4.1)

z_k = Σ_{j=1}^{N} v_{kj} y_j,  k = 1, …, D                      (4.2)

In eq. (4.1), tanh is defined by

tanh(u) = (e^u − e^{−u}) / (e^u + e^{−u})                       (4.3)

At every iteration each output is fed back to the corresponding input, which can be expressed as

x_k(t+1) = z_k(t),  k = 1, …, D                                 (4.4)
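A minimal numerical sketch of this feedback iteration follows. The weight values here are hypothetical (drawn at random); the author's actual parameter choices are not reproduced.

```python
import math, random

def step(x, w_in, w_out):
    """One feedback iteration: tanh hidden layer, linear output layer,
    each output fed back as the next input (eqs. 4.1-4.4)."""
    hidden = [math.tanh(sum(w * xi for w, xi in zip(row, x))) for row in w_in]
    return [sum(v * h for v, h in zip(row, hidden)) for row in w_out]

random.seed(1)
D, N = 2, 4                       # D inputs/outputs, N hidden units
w_in  = [[random.uniform(-2, 2) for _ in range(D)] for _ in range(N)]
w_out = [[random.uniform(-2, 2) for _ in range(N)] for _ in range(D)]

x = [0.1] * D
trajectory = []
for _ in range(1000):             # iterate the feedback loop
    x = step(x, w_in, w_out)
    trajectory.append(tuple(x))
```

Because the hidden activations are bounded by tanh, the iterates stay in a bounded region; plotting the trajectory components against each other reveals the attractor.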
The strange attractors of the neural network: the network outputs D numerical sequences of unlimited length, and time series of the network variables are used as independent coordinates to construct the attractor. The parameters are chosen as follows: N = 4; one output component serves as the horizontal coordinate and another as the vertical coordinate; a third component expresses the height (projected in the lower right corner of the map) and is mapped to a linear palette of 16 colors; the initial conditions are then chosen.
The tanh transfer function of the hidden layer can also be replaced by other functions, such as the asymmetric logistic map of eq. (4.5), and the strange attractor constructed by the same method. Figure 4.3 shows a set of representative results:
Fig. 4.3 Strange attractors of the neural network: in panels (a)-(d) the hidden-layer transfer function is tanh (eq. 4.3); in panels (e)-(h) it is the asymmetric logistic map (eq. 4.5).
4.1.5 Quantitative analysis of the strange attractors of the neural network
Following the method of Wolf et al., the author wrote a FORTRAN program to compute the largest Lyapunov exponent λ1; to speed up the search for replacement points, the quicksort algorithm is used. To verify the correctness of the program, the author first computed λ1 of the Hénon map:

x_{n+1} = 1 − a x_n² + y_n,  y_{n+1} = b x_n                    (4.6)
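The Hénon map makes a convenient check because its largest Lyapunov exponent at the standard parameters is well known (about 0.419). A sketch using the tangent-space method, a simpler alternative to Wolf's trajectory-tracing algorithm used in the text:

```python
import math

def henon_lyapunov(a=1.4, b=0.3, n=100_000):
    """Largest Lyapunov exponent of the Henon map (4.6)
    x' = 1 - a*x^2 + y, y' = b*x, via the tangent-space method."""
    x, y = 0.1, 0.1
    dx, dy = 1.0, 0.0                     # tangent vector
    s = 0.0
    for _ in range(n):
        # evolve the tangent vector with the Jacobian [[-2*a*x, 1], [b, 0]]
        dx, dy = -2.0 * a * x * dx + dy, b * dx
        x, y = 1.0 - a * x * x + y, b * x
        norm = math.hypot(dx, dy)
        s += math.log(norm)
        dx, dy = dx / norm, dy / norm     # renormalize to avoid overflow
    return s / n

print(henon_lyapunov())   # literature value is approximately 0.419
```

A positive λ1 confirms chaotic stretching; the same routine applied to a periodic orbit would return a non-positive value.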
Table 4.1 shows the computed λ1 of the neural network's strange attractors.
Table 4.1 The largest Lyapunov exponent λ1 of the strange attractors
Table 4.2 shows the correlation dimension D2 of the Hénon map.
Fig. 4.4 ln C(r)–ln r curves of the Hénon map: curve 1, embedding dimension m = 4; curve 2, m = 5.
Iterating the network 90,000 times yields the attractor time series of the abscissa; selecting 2,000 data points, the author obtained, for the attractors shown in Figure 4.3, the ln C(r)–ln r curves of Figure 4.5. The resulting D2 values are given in Table 4.3.
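The D2 estimates come from the slope of ln C(r) against ln r, where C(r) is the Grassberger-Procaccia correlation integral. An illustrative sketch (not the author's program), using a short Hénon-map series as stand-in data:

```python
import math

def correlation_integral(points, r):
    """C(r): fraction of point pairs closer than r (Grassberger-Procaccia)."""
    n = len(points)
    count = 0
    for i in range(n):
        for j in range(i + 1, n):
            if math.dist(points[i], points[j]) < r:
                count += 1
    return 2.0 * count / (n * (n - 1))

# Build a short Henon-map trajectory as test data
a, b = 1.4, 0.3
x, y = 0.1, 0.1
pts = []
for k in range(1200):
    x, y = 1.0 - a * x * x + y, b * x
    if k >= 200:                          # drop the transient
        pts.append((x, y))

# Estimate D2 as the slope of ln C(r) vs ln r between two radii
r1, r2 = 0.05, 0.2
d2 = (math.log(correlation_integral(pts, r2)) -
      math.log(correlation_integral(pts, r1))) / (math.log(r2) - math.log(r1))
print(f"estimated D2 ~ {d2:.2f}")
```

With only 1,000 points the estimate is rough; the literature value for the Hénon attractor is about 1.2, and in practice one fits the slope over several radii and embedding dimensions, as the tables in this section do.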
Fig. 4.5 ln C(r)–ln r curves of the strange attractors of the neural network, panels (a)-(h)
Table 4.3 Correlation dimension D2 of the neural network's strange attractors
The size of the fractal dimension reflects the extent of the space occupied by the fractal structure of the neural network's strange attractors: the larger the dimension of a strange attractor, the greater the space it occupies, the denser its structure, and the more complex the system; conversely, the sparser the structure, the simpler the system.
4.1.6 Conclusion
In this section the author studied the dynamical behavior of chaotic artificial neural networks. The results show that, unlike conventional neural networks characterized only by gradient descent, the chaotic neural network has much richer dynamics and operates far from the equilibrium point. At the same time, various kinds of attractors exist: not only fixed points, limit cycles, and tori, but also strange attractors.
Because a neural network is a super-high-dimensional, strongly nonlinear dynamical system, despite some success, current theory and the basic understanding of its dynamical behavior remain insufficient. To develop and apply neural network theory further, it is necessary to study its dynamics in more depth.
4.2 Symmetric Generalized Lorenz Strange Attractors
The forced dissipative system discussed by Lorenz is

dx/dt = σ(y − x),  dy/dt = rx − y − xz,  dz/dt = xy − bz        (4.7)

Lorenz found that when the Rayleigh number r exceeds a critical value, the system behaves chaotically.
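The chaotic regime is easy to reproduce numerically. A minimal sketch integrating system (4.7) with fourth-order Runge-Kutta at the classical parameters σ = 10, r = 28, b = 8/3:

```python
def lorenz_rhs(state, sigma=10.0, r=28.0, beta=8.0/3.0):
    """Right-hand side of the Lorenz system (4.7)."""
    x, y, z = state
    return (sigma * (y - x), r * x - y - x * z, x * y - beta * z)

def rk4_step(state, h=0.01):
    """One fourth-order Runge-Kutta step."""
    k1 = lorenz_rhs(state)
    k2 = lorenz_rhs(tuple(s + 0.5 * h * k for s, k in zip(state, k1)))
    k3 = lorenz_rhs(tuple(s + 0.5 * h * k for s, k in zip(state, k2)))
    k4 = lorenz_rhs(tuple(s + h * k for s, k in zip(state, k3)))
    return tuple(s + h * (p + 2*q + 2*u + w) / 6.0
                 for s, p, q, u, w in zip(state, k1, k2, k3, k4))

state = (1.0, 1.0, 1.0)
for _ in range(5000):             # integrate well past the transient
    state = rk4_step(state)
print(state)                      # a point on the bounded butterfly attractor
```

At r = 28 the trajectory never settles to a fixed point or limit cycle, yet remains in a bounded region: the hallmark of a strange attractor.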
4.2.1 Results and analysis
Based on the Lorenz equations, the generalized form (4.8) is given as follows:
The following formula (4.9) converts rectangular coordinates into polar coordinates, and formula (4.10) converts the polar coordinates back into rectangular coordinates:
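Although the specific formulas are not reproduced here, the standard rectangular-polar transformation pair that such constructions rely on can be sketched as:

```python
import math

def to_polar(x, y):
    """Rectangular -> polar: r = sqrt(x^2 + y^2), theta = atan2(y, x)."""
    return math.hypot(x, y), math.atan2(y, x)

def to_rect(r, theta):
    """Polar -> rectangular: x = r*cos(theta), y = r*sin(theta)."""
    return r * math.cos(theta), r * math.sin(theta)

x, y = to_rect(*to_polar(3.0, 4.0))   # round trip recovers the point (3, 4)
```

Applying an angle-dependent operation (such as replication into sectors) between the two conversions is what produces rotationally symmetric attractor structures.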
To construct the strange attractor A of equation (4.8), an appropriate time interval is chosen; from eq. (4.11) the nonlinear differential equation can be obtained.
Theorem 4.1 Construct the strange attractor A of equation (4.8) and complete the coordinate transformation through (4.9) and (4.10); then:
Table 4.4 Construction of the strange attractors of the generalized Lorenz equations:
Fig. 4.6 Strange attractors of the generalized Lorenz equations, panels (a1)-(a4), (b1)-(b4), and (c)-(h)
Fig. 4.7 ln C(r)–ln r curves of the generalized Lorenz strange attractors, panels (a)-(h)
Table 4.6 Correlation dimension D2 of the generalized Lorenz strange attractors (columns: figure label, embedding dimension m, correlation dimension D2, mean-square error, average D2; one column group for panels 4.6(a)-(d) and one for panels (e)-(h))
4.2.2 Conclusion
Chaos is a complex form of motion of a dynamical system governed by a deterministic equation (in differential or discrete form). As May observed in 1976, simple dynamical systems need not lead to simple dynamical behavior. Chaos is generated by repeated stretching and folding, and since folding is not a one-to-one mapping (it is irreversible), this is possible only in nonlinear systems. That is, chaos can appear only in nonlinear systems.
The existence of chaos depends not only on the nonlinear character of the system (the form of the nonlinear equations) but also on the values of the parameters in the equations; for example, the logistic map is chaotic only for certain parameter values. The existence of chaos is therefore often related to the bifurcations of nonlinear systems. Because of the repeated stretching and folding, the motion of a chaotic system (such as the iteration of a representative point) is very sensitive to initial conditions: small differences in the initial conditions are likely to cause great differences in the subsequent iterations. The above conclusions also hold for the chaotic solutions of differential equations.
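The sensitivity to initial conditions described above can be seen in a few lines with the logistic map at its fully chaotic parameter mu = 4:

```python
# Two orbits of the logistic map x -> mu*x*(1-x) starting 1e-10 apart.
mu = 4.0
x, y = 0.3, 0.3 + 1e-10
max_gap = 0.0
for n in range(100):
    x = mu * x * (1.0 - x)
    y = mu * y * (1.0 - y)
    max_gap = max(max_gap, abs(x - y))
print(max_gap)    # the initial 1e-10 gap grows to order 1 within ~35 steps
```

Since the separation roughly doubles at each step (the Lyapunov exponent at mu = 4 is ln 2), the microscopic initial difference is amplified to the size of the whole interval within a few dozen iterations.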
4.3 Symmetric Generalized Rössler Strange Attractors
The Rössler equation is a very simple nonlinear system of ordinary differential equations given by Rössler in 1976, obtained through an appropriate scaling transformation while he was studying chemical reactions with an intermediate product. Since then, people have studied the chaos generated by the Rössler equation in order to enrich chaos theory.
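For reference, the standard 1976 Rössler form is x' = −y − z, y' = x + ay, z' = b + z(x − c); a sketch integrating it at the commonly used chaotic parameters (a, b, c) = (0.2, 0.2, 5.7):

```python
def rossler_rhs(state, a=0.2, b=0.2, c=5.7):
    """Standard Rossler system: x' = -y - z, y' = x + a*y, z' = b + z*(x - c)."""
    x, y, z = state
    return (-y - z, x + a * y, b + z * (x - c))

def rk4(f, state, h):
    """One fourth-order Runge-Kutta step for a 3-component system."""
    k1 = f(state)
    k2 = f(tuple(s + 0.5 * h * k for s, k in zip(state, k1)))
    k3 = f(tuple(s + 0.5 * h * k for s, k in zip(state, k2)))
    k4 = f(tuple(s + h * k for s, k in zip(state, k3)))
    return tuple(s + h * (p + 2*q + 2*u + w) / 6.0
                 for s, p, q, u, w in zip(state, k1, k2, k3, k4))

state = (1.0, 1.0, 1.0)
for _ in range(20000):            # step h = 0.01, i.e. 200 time units
    state = rk4(rossler_rhs, state, 0.01)
print(state)                      # a point on the bounded Rossler attractor
```

Despite having only one nonlinear term (the product zx), the flow spirals outward in the xy-plane and is folded back by the z-dynamics, producing a strange attractor.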
4.3.1 Results
The generalized form based on the Rössler equation is given as follows:                          (4.14)
The generalized Rössler equation is converted into a nonlinear differential equation by the first-order differential equation algorithm; from eq. (4.15) the nonlinear differential equation (4.16) is obtained.
Theorem 4.2 Construct the strange attractor A of equation (4.14) and apply the coordinate transformation above; then the stated symmetry relation holds. Theorem 4.2 shows that the strange attractors have a structure with rotational symmetry.
Construction of the strange attractors of the generalized Rössler equations: the control parameters, the total number of sectors, and the nesting factor are selected.
Fig. 4.8 Strange attractors of the generalized Rössler equations, panels (a1)-(a4), (b1)-(b4), and (c)-(h)
The largest Lyapunov exponent λ1 of the strange attractors of the generalized Rössler equation:
Fig. 4.9 ln C(r)–ln r curves of the generalized Rössler strange attractors, panels (a)-(h)
Table 4.9 Correlation dimension D2 of the generalized Rössler strange attractors
The size of the fractal dimension reflects the space occupied by the generalized Rössler strange attractors, which have a fractal structure: the greater the dimension, the greater the space occupied and the more compact the structure; conversely, the sparser the structure.
4.3.2 Conclusion
In this section we constructed generalized Rössler strange attractors with a rotationally symmetric structure and analyzed the characteristics of that structure. As can be seen, discrete mappings and the continuous flows determined by differential equations obey some common laws, which shows that there is an intrinsic relationship between discrete mappings and differential equations.