Optimizing Feedforward Neural Networks for Nonlinear Dynamic System Identification with an Adaptive Learning Rate and Pruning Algorithm


Abstract


Structure optimization is crucial for effective training and for enhancing the generalization capacity of any artificial neural network (ANN). In this work, an effective pruning strategy is developed to identify nonlinear dynamic systems by optimizing the structure of a feedforward neural network (FFNN). The FFNN weights are updated with the gradient-based backpropagation technique. A threshold is set, and the weights between the input, hidden, and output layers whose magnitudes fall below this threshold are eliminated. A novel adaptive learning rate (ALR) is created by computing a score index Si after each epoch, thereby increasing the speed of the algorithm. A nonlinear benchmark problem is used to demonstrate the effectiveness of the proposed algorithm, and a comparative study is carried out between the simple FFNN with ALR and the pruned FFNN (PFFNN) with ALR.
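The threshold-based weight elimination described above can be sketched as simple magnitude pruning. The following is a minimal illustration only: the threshold value, layer sizes, and the `prune_weights` helper are assumptions for demonstration, not details taken from the paper.

```python
import numpy as np

def prune_weights(W, threshold):
    """Zero out (eliminate) connections whose magnitude falls below the threshold."""
    mask = np.abs(W) >= threshold
    return W * mask, mask

# Hypothetical FFNN weight matrices: 3 inputs, 5 hidden units, 1 output.
rng = np.random.default_rng(0)
W_ih = rng.normal(0.0, 0.5, (3, 5))   # input-to-hidden weights
W_ho = rng.normal(0.0, 0.5, (5, 1))   # hidden-to-output weights

# Apply the same assumed threshold to both layers.
W_ih_pruned, mask_ih = prune_weights(W_ih, threshold=0.1)
W_ho_pruned, mask_ho = prune_weights(W_ho, threshold=0.1)
print("eliminated input-hidden connections:", int((~mask_ih).sum()))
print("eliminated hidden-output connections:", int((~mask_ho).sum()))
```

In practice, pruning of this kind is typically interleaved with backpropagation updates, so that the remaining weights can readjust after each round of elimination.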




Keywords


FFNN, pruning algorithm, adaptive learning rate, gradient-based backpropagation ANN