Rescaling Of Variables In Back Propagation Learning
Use of the logistic derivative in backward error propagation suggests that one source of ill-conditioning is the decreasing multiplier applied in the computation of the elements of the gradient at each layer. A compensatory rescaling is suggested, based heuristically on the expected value of the multiplier. Experimental results demonstrate an order-of-magnitude improvement in convergence.
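The shrinkage the abstract refers to can be illustrated with a minimal sketch. For a logistic unit y = 1/(1 + exp(-x)), backpropagation multiplies each layer's error term by y(1 - y), which is at most 0.25; if y is treated as roughly uniform on (0, 1), the expected value of y(1 - y) is 1/6, so deltas shrink by about that factor per layer. The sketch below compensates by dividing each layer's delta by this expected value. The function names and the exact 1/6 factor are illustrative assumptions, not taken from the paper itself.

```python
import numpy as np

# Heuristic expected value of the logistic-derivative multiplier y*(1 - y)
# for y roughly uniform on (0, 1): E[y(1-y)] = 1/2 - 1/3 = 1/6 (assumption).
EXPECTED_DERIV = 1.0 / 6.0

def logistic(x):
    return 1.0 / (1.0 + np.exp(-x))

def backprop_deltas(activations, weights, error, rescale=True):
    """Propagate an output error backward through logistic layers.

    activations: list of layer outputs, one per layer (output layer last).
    weights: list of weight matrices; weights[k] maps layer k to layer k+1.
    error: gradient of the loss with respect to the network output.
    """
    deltas = []
    # Output-layer delta: error times the logistic derivative.
    delta = error * activations[-1] * (1.0 - activations[-1])
    if rescale:
        delta /= EXPECTED_DERIV  # undo the expected shrinkage at this layer
    deltas.append(delta)
    # Walk backward through the hidden layers.
    for k in range(len(weights) - 1, 0, -1):
        y = activations[k - 1]
        delta = (delta @ weights[k].T) * y * (1.0 - y)
        if rescale:
            delta /= EXPECTED_DERIV
        deltas.append(delta)
    return deltas[::-1]

# Usage: on a two-layer net, the rescaled deltas are larger than the plain
# ones by a factor of 6 at the output layer and 36 at the hidden layer,
# countering the layer-by-layer decay of the gradient.
rng = np.random.default_rng(0)
x = rng.standard_normal((1, 3))
W0 = rng.standard_normal((3, 4))
W1 = rng.standard_normal((4, 2))
a1 = logistic(x @ W0)
a2 = logistic(a1 @ W1)
err = a2 - 1.0
d_plain = backprop_deltas([a1, a2], [W0, W1], err, rescale=False)
d_scaled = backprop_deltas([a1, a2], [W0, W1], err, rescale=True)
```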
A. K. Rigler et al., "Rescaling Of Variables In Back Propagation Learning," Neural Networks, vol. 4, no. 2, pp. 225-229, Elsevier, Jan 1991.
The definitive version is available at https://doi.org/10.1016/0893-6080(91)90006-Q
Mathematics and Statistics
Keywords and Phrases
Backward error propagation; Layered networks; Preconditioning; Rescaling
International Standard Serial Number (ISSN)
0893-6080
Article - Journal
© 2023 Elsevier, All rights reserved.
01 Jan 1991