Temporal Back-Propagation Algorithm
Andrew J. Low
Problem Statement
- Program a back-propagation algorithm to train a multilayer perceptron
- Programming language: C++
- Network type: distributed time-lagged feedforward network (TLFN)
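As background for the problem statement, a minimal sketch of ordinary (non-temporal) back-propagation for a single-hidden-layer perceptron in C++ is shown below. The class name, network sizes, and deterministic weight initialization are illustrative assumptions, not the author's implementation.

```cpp
#include <cassert>
#include <cmath>
#include <cstddef>
#include <vector>

// Minimal sketch: one hidden layer, sigmoid activations, on-line
// back-propagation. All names and sizes are illustrative only.
struct MLP {
    int nIn, nHid;
    std::vector<double> wHid, wOut;   // weights, bias stored last per unit
    double lr;

    MLP(int in, int hid, double rate)
        : nIn(in), nHid(hid),
          wHid((in + 1) * hid), wOut(hid + 1), lr(rate) {
        // Deterministic, asymmetric init (stands in for random init).
        for (std::size_t i = 0; i < wHid.size(); ++i) wHid[i] = 0.5 - double(i % 7) / 7.0;
        for (std::size_t i = 0; i < wOut.size(); ++i) wOut[i] = 0.5 - double(i % 5) / 5.0;
    }

    static double sigmoid(double v) { return 1.0 / (1.0 + std::exp(-v)); }

    // Forward pass; also returns hidden activations for the backward pass.
    double forward(const std::vector<double>& x, std::vector<double>& hid) const {
        hid.assign(nHid, 0.0);
        for (int j = 0; j < nHid; ++j) {
            double v = wHid[j * (nIn + 1) + nIn];               // bias
            for (int i = 0; i < nIn; ++i) v += wHid[j * (nIn + 1) + i] * x[i];
            hid[j] = sigmoid(v);
        }
        double v = wOut[nHid];                                   // bias
        for (int j = 0; j < nHid; ++j) v += wOut[j] * hid[j];
        return sigmoid(v);
    }

    // One on-line back-propagation step for a single pattern.
    void train(const std::vector<double>& x, double target) {
        std::vector<double> hid;
        double y = forward(x, hid);
        double dOut = (target - y) * y * (1.0 - y);              // output-layer delta
        for (int j = 0; j < nHid; ++j) {
            double dHid = dOut * wOut[j] * hid[j] * (1.0 - hid[j]); // hidden delta
            for (int i = 0; i < nIn; ++i) wHid[j * (nIn + 1) + i] += lr * dHid * x[i];
            wHid[j * (nIn + 1) + nIn] += lr * dHid;              // hidden bias
            wOut[j] += lr * dOut * hid[j];
        }
        wOut[nHid] += lr * dOut;                                 // output bias
    }
};
```

The temporal variant discussed in the following slides replaces each scalar weight with a short FIR filter over delayed inputs.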
Distributed vs. Focused Networks
Focused Network
- Additional neurons added at the input for memory
- Suitable for shift-invariant series
Distributed Network
- Memory added to each neuron in the form of additional inputs
- Allows for time-variant series
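The per-neuron memory of the distributed network can be sketched as a tapped delay line on each synapse, so a neuron sees the current input plus its last few values. This shows the memory structure only (no learning); the class and names are illustrative assumptions.

```cpp
#include <cassert>
#include <cstddef>
#include <deque>
#include <vector>

// Sketch of distributed memory: each synapse keeps a short tapped
// delay line of depth `depth`, i.e. the current sample plus `depth`
// past samples. Names are illustrative only.
class DelayLine {
    std::deque<double> taps;   // taps[0] = newest sample
public:
    explicit DelayLine(std::size_t depth) : taps(depth + 1, 0.0) {}

    // Shift in a new sample, discarding the oldest.
    void push(double x) { taps.push_front(x); taps.pop_back(); }

    // FIR response: weighted sum over the current and delayed inputs.
    // w[k] multiplies the sample delayed by k steps.
    double respond(const std::vector<double>& w) const {
        double s = 0.0;
        for (std::size_t k = 0; k < taps.size(); ++k) s += w[k] * taps[k];
        return s;
    }
};
```

A focused network would instead attach such delay lines only at the input layer; the distributed network gives one to every synapse, which is what allows it to track time-variant series.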
Algorithm Detail
Output Layer:
Hidden Layer:
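The equations on this slide did not survive extraction. A hedged reconstruction, following the standard temporal back-propagation formulation for distributed TLFNs (Wan's algorithm, as presented in Haykin's treatment), is:

```latex
% Local gradient at neuron j, time step n.
% Output layer: delta is the instantaneous error times the slope
% of the activation at the induced local field v_j(n):
\delta_j(n) = e_j(n)\,\varphi'\!\bigl(v_j(n)\bigr)

% Hidden layer: next-layer deltas are propagated back through the
% synaptic FIR filters, gathered over the p delay taps:
\delta_j(n) = \varphi'\!\bigl(v_j(n)\bigr)
              \sum_{r \in A} \boldsymbol{\Delta}_r^{\mathsf T}(n)\,\mathbf{w}_{rj},
\qquad
\boldsymbol{\Delta}_r(n) =
  \bigl[\delta_r(n),\,\delta_r(n+1),\,\dots,\,\delta_r(n+p)\bigr]^{\mathsf T}

% Filter-tap update with learning rate \eta, where \mathbf{x}_i(n)
% is the state vector of current and delayed inputs on synapse i:
\mathbf{w}_{ji}(n+1) = \mathbf{w}_{ji}(n) + \eta\,\delta_j(n)\,\mathbf{x}_i(n)
```

Note that the hidden-layer delta looks ahead in time (indices n through n+p), which is why temporal back-propagation is typically run over a buffered window of the series rather than strictly on-line.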
Results
- Program is incomplete at this time
Advantages
- Broader range of problems can be solved
Disadvantages
- Requires more processing power and memory
Functionality
- Read & write config files; config options include layers, memory depth, learning rate
- CLI interface (Windows and possibly Unix)
Testing Series:
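The config-file functionality listed under Functionality above could be sketched as simple key/value parsing. The "key = value" format and the key names (layers, memory_depth, learning_rate) are assumptions for illustration; the slides do not specify the file format.

```cpp
#include <cassert>
#include <istream>
#include <map>
#include <ostream>
#include <sstream>
#include <string>

// Sketch of config read/write. Format and key names are assumed.
using Config = std::map<std::string, std::string>;

Config readConfig(std::istream& in) {
    Config cfg;
    std::string line;
    while (std::getline(in, line)) {
        std::size_t eq = line.find('=');
        if (eq == std::string::npos) continue;        // skip malformed lines
        std::string key = line.substr(0, eq);
        std::string val = line.substr(eq + 1);
        auto trim = [](std::string& s) {              // strip surrounding blanks
            s.erase(0, s.find_first_not_of(" \t"));
            s.erase(s.find_last_not_of(" \t") + 1);
        };
        trim(key);
        trim(val);
        cfg[key] = val;
    }
    return cfg;
}

void writeConfig(std::ostream& out, const Config& cfg) {
    for (const auto& kv : cfg) out << kv.first << " = " << kv.second << "\n";
}
```

Round-tripping through string streams makes this easy to unit-test without touching the filesystem, which also keeps it portable across the Windows and Unix targets mentioned above.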