1 Neuron-Adaptive Higher Order Neural Network Models for Automated Financial Data Modeling Dr. Ming Zhang, Associate Professor Department of Physics, Computer Science & Engineering Christopher Newport University 1 University Place, Newport News, VA 23606, USA
2 Published in IEEE Transactions on Neural Networks, Vol. 13, No. 1, January 2002
3 Problems Real-world financial data often contains non-linear, high-frequency, and multi-polynomial components, and is discontinuous (piecewise continuous). Classical neural network models cannot automatically determine the optimum model and appropriate order for financial data approximation.
4 Simulators
PHONN Simulator (1994–1996): Polynomial Higher Order Neural Network financial data simulator. A$105,000, supported by Fujitsu, Japan.
THONN Simulator (1996–1998): Trigonometric polynomial Higher Order Neural Network financial data simulator. A$10,000, supported by the Australian Research Council.
PT-HONN Simulator (1999–2000): Polynomial and Trigonometric polynomial Higher Order Neural Network financial data simulator. US$46,000, supported by the USA National Research Council.
5 PT-HONN Data Simulator
6 Simulating by PT-HONN Simulator
7 Structure of PT-HONN
8 PT-HONN MODEL The network architecture of PT-HONN combines the characteristics of both PHONN and THONN. It is a multi-layer network consisting of an input layer with input units, an output layer with output units, and two hidden layers of intermediate processing units.
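As a rough illustration of the idea, PHONN contributes polynomial terms and THONN trigonometric terms, so a PT-HONN-style model can be sketched as a linear (summing) output unit over a combined polynomial-plus-trigonometric basis. The basis construction and least-squares fit below are illustrative assumptions, not the paper's exact architecture.

```python
import numpy as np

def pt_honn_features(x, order=3):
    """Hypothetical PT-HONN hidden basis: polynomial terms x^k combined
    with trigonometric terms sin(kx), cos(kx) up to `order`.
    (Sketch only; the paper's actual network wiring differs.)"""
    x = np.asarray(x, dtype=float)
    poly = [x**k for k in range(1, order + 1)]
    trig = [f(k * x) for k in range(1, order + 1) for f in (np.sin, np.cos)]
    return np.stack(poly + trig, axis=-1)

# Fit a toy series that mixes polynomial and trigonometric behaviour,
# the kind of financial-data structure the slides describe.
t = np.linspace(0.0, 2.0, 50)
y = 0.5 * t**2 + np.sin(3 * t)
Phi = pt_honn_features(t)                 # shape (50, 9) for order=3
w, *_ = np.linalg.lstsq(Phi, y, rcond=None)
print(np.max(np.abs(Phi @ w - y)) < 1e-6)  # basis contains both terms, so fit is exact
```

Because the target lies in the span of the combined basis, the linear output unit recovers it exactly; a purely polynomial or purely trigonometric basis of the same order would not.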
9 Definition of PT-HONN
10 NAHONN
* The network architecture of NAHONN is a multi-layer feed-forward network consisting of an input layer with input units, an output layer with output units, and one hidden layer of intermediate processing units.
* There is no activation function in the input layer, and the output neurons are summing units (linear activation).
* The activation function for the hidden-layer processing units is a Neuron-Adaptive Activation Function (NAAF).
11 NAAF The activation function for the hidden-layer processing units is a Neuron-Adaptive Activation Function (NAAF), defined by an equation (not reproduced on this slide) in which a1, b1, a2, b2, a3, and b3 are real variables that are adjusted, along with the weights, during training.
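The NAAF equation itself did not survive transcription, so the sketch below assumes a plausible three-term form (sigmoid, sine, and Gaussian components) purely for illustration; the paper's exact definition may differ. What the sketch does show faithfully is the key idea from the slide: each hidden unit carries its own parameters a1, b1, a2, b2, a3, b3, which are trainable alongside the weights.

```python
import numpy as np

def naaf(x, a1, b1, a2, b2, a3, b3):
    """Assumed NAAF form (illustrative only, not the paper's equation):
    a sigmoid, a sine, and a Gaussian term with per-neuron parameters."""
    sigmoid = a1 / (1.0 + np.exp(-b1 * x))
    sine = a2 * np.sin(b2 * x)
    gauss = a3 * np.exp(-b3 * x**2)
    return sigmoid + sine + gauss

class NAHONNLayer:
    """One hidden layer whose units each carry their own six NAAF
    parameters, adjusted (like the weights) during training."""
    def __init__(self, n_in, n_hidden, seed=0):
        rng = np.random.default_rng(seed)
        self.W = rng.normal(0.0, 0.5, (n_in, n_hidden))
        self.p = rng.normal(0.0, 0.5, (6, n_hidden))  # a1,b1,a2,b2,a3,b3 per unit

    def forward(self, X):
        z = X @ self.W
        a1, b1, a2, b2, a3, b3 = self.p   # each row broadcasts over the batch
        return naaf(z, a1, b1, a2, b2, a3, b3)

layer = NAHONNLayer(n_in=2, n_hidden=4)
h = layer.forward(np.ones((3, 2)))
print(h.shape)  # (3, 4)
```

In training, a gradient step would update both `layer.W` and `layer.p`; letting the activation shape adapt per neuron is what distinguishes the NAHONN from a fixed-activation network.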
12 Structure of NAHONN
13 NAHONN Group A Neuron-Adaptive Feed-forward Higher Order Neural Network Group (NAFNG) is one kind of neural network group in which each element is a neuron-adaptive feed-forward higher order neural network F_i. We have: NAFNG = {F_1, F_2, F_3, ..., F_i, ..., F_n}
14 Hornik, K. (1991) “Whenever the activation function is continuous, bounded and non-constant, then for an arbitrary compact subset X ⊆ R^n, standard multi-layer feed-forward networks can approximate any continuous function on X arbitrarily well with respect to uniform distance, provided that sufficiently many hidden units are available”
15 Leshno, M. (1993) “A standard multi-layer feed-forward network with a locally bounded activation function can approximate any continuous function to any degree of accuracy if and only if the network’s activation function is not a polynomial”
16 Zhang, Ming (1995) “Consider a neural network piecewise function group, in which each member is a standard multi-layer feed-forward neural network, and which has a locally bounded, piecewise continuous (rather than polynomial) activation function and threshold. Each such group can approximate any kind of piecewise continuous function, and to any degree of accuracy”
17 Feature of NAHONN A neuron-adaptive feed-forward neural network group with adaptive neurons can approximate any kind of piecewise continuous function.
18 Conclusion We proved that a single NAHONN can approximate any piecewise continuous function to any desired accuracy. The experimental results show that NAHONNs can handle high-frequency data, model multi-polynomial data, simulate discontinuous data, and approximate any kind of piecewise continuous function to any degree of accuracy.