Approximation of state-space trajectories by locally recurrent globally feed-forward neural networks.

Research paper by Krzysztof K Patan

Indexed on: 26 Dec '07
Published on: 26 Dec '07
Published in: Neural Networks



Abstract

The paper investigates the approximation abilities of a special class of discrete-time dynamic neural networks. The networks considered are called locally recurrent globally feed-forward because they are built from dynamic neuron models that contain internal feedback, while the interconnections between neurons are strictly feed-forward, as in the well-known multi-layer perceptron. The paper presents analytical results showing that a locally recurrent network with two hidden layers can approximate, with arbitrary accuracy, a state-space trajectory produced by any Lipschitz continuous function. Based on these results, the network can be simplified and transformed into a more practical structure suited to real-world applications.
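For intuition about the architecture described above, here is a minimal sketch, not taken from the paper: it assumes each dynamic neuron passes its weighted input through an internal IIR filter (the "inner feedback") before a static tanh activation, and that layers of such neurons are then wired strictly feed-forward. The class name, coefficient layout, and random initialization are illustrative assumptions.

```python
import numpy as np

class DynamicNeuron:
    """Sketch of a locally recurrent neuron: all feedback is internal
    to the neuron, realized as an IIR filter on the weighted input."""

    def __init__(self, n_inputs, filter_order, rng=None):
        rng = np.random.default_rng() if rng is None else rng
        self.w = rng.normal(scale=0.5, size=n_inputs)           # static input weights
        self.a = rng.normal(scale=0.1, size=filter_order)       # feedback (AR) coefficients
        self.b = rng.normal(scale=0.5, size=filter_order + 1)   # feed-forward (MA) coefficients
        self.x = np.zeros(filter_order)                         # internal filter state

    def step(self, u):
        phi = self.w @ u                          # weighted sum of external inputs
        x_new = phi - self.a @ self.x             # IIR filter: feedback of past states
        z = self.b[0] * x_new + self.b[1:] @ self.x   # filter output
        self.x = np.concatenate(([x_new], self.x[:-1]))  # shift the state buffer
        return np.tanh(z)                         # static activation

# Neurons are connected strictly feed-forward across layers,
# as in a multi-layer perceptron; only the filters recur.
rng = np.random.default_rng(0)
layer = [DynamicNeuron(n_inputs=3, filter_order=2) for _ in range(4)]
for u in rng.normal(size=(10, 3)):                # toy input sequence
    y = np.array([n.step(u) for n in layer])      # layer output at one time step
```

Because the feedback never crosses neuron boundaries, the network keeps the layered topology of a static perceptron while each unit carries its own dynamic state, which is what makes the state-space approximation result in the abstract plausible.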