Background
Type: Article

Improving backpropagation via an efficient combination of a saturation suppression method and momentum term

Journal: Neural Network World (ISSN 2336-4335) | Year: 2010 | Volume: 20 | Issue: | Pages: 207 - 222
Authors: Moallem P., Ayoughi S.A.
Language: English

Abstract

The gradient descent backpropagation (BP) algorithm that is widely used for training MLP neural networks can retard convergence due to certain features of the error surface like the local minimum and the flat spot. Common promoting methods, such as applying momentum term and using dynamic adaptation of learning rates, can enhance the performance of BP. However, saturation state of hidden layer neurons, which is the cause of some flat spots on the error surface, persists through such accelerating methods. In this paper, we propose a grading technique to gradually level off the potential flat spots into a sloping surface in a look-ahead mode; and thereby progressively renew saturated hidden neurons. We introduce symptoms indicating saturation state of hidden nodes. In order to suppress the saturation, we added a modifying term to the error function only when saturation is detected. In normal conditions, the improvement made to the learning process is adding a momentum term to the weight correction formula. We have recorded remarkable improvements in a selection of experiments. ©ICS AS CR 2010.