Rescaling of Variables in Back Propagation Learning
Abstract
Use of the logistic derivative in backward error propagation suggests that one source of ill-conditioning is the multiplier, smaller than one, that scales the elements of the gradient at each layer. A compensatory rescaling is suggested, based heuristically upon the expected value of the multiplier. Experimental results demonstrate an order-of-magnitude improvement in convergence.
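The idea behind the abstract can be made concrete. The logistic function f(x) = 1/(1 + e^-x) has derivative f'(x) = y(1 - y) with y = f(x); if y is treated as uniform on (0, 1), the expected value of the derivative is the integral of y(1 - y) from 0 to 1, which equals 1/6. Each layer crossed in the backward pass therefore multiplies the error terms by roughly 1/6, and the heuristic compensation is to rescale each layer's gradient by the reciprocal of that expected multiplier. The NumPy sketch below illustrates this reading; the factor 6**depth, the network shape, the learning rate, and the XOR usage example are illustrative assumptions, not details taken from the paper.

    import numpy as np

    def sigmoid(x):
        return 1.0 / (1.0 + np.exp(-x))

    # The logistic derivative is f'(net) = y(1 - y).  For outputs y
    # uniform on (0, 1), its expected value is E[y(1 - y)] = 1/6, so
    # each layer crossed during back propagation shrinks the
    # propagated error by roughly that factor.
    EXPECTED_MULTIPLIER = 1.0 / 6.0

    def train_step(weights, x, t, lr=0.25, rescale=True):
        """One gradient step on a fully connected logistic network.

        weights -- list of (fan_in + 1, fan_out) arrays, input layer
                   first; the extra row holds the bias weights.
        The compensatory factor 6**depth (depth = distance from the
        output layer) is a heuristic reading of the abstract and may
        differ from the published scheme.
        """
        # Forward pass, recording every activation.
        acts = [x]
        for W in weights:
            acts.append(sigmoid(np.append(acts[-1], 1.0) @ W))

        # Backward pass: collect per-layer gradients first.
        grads = []
        delta = (acts[-1] - t) * acts[-1] * (1.0 - acts[-1])
        for k in range(len(weights) - 1, -1, -1):
            grads.append((k, np.outer(np.append(acts[k], 1.0), delta)))
            if k > 0:
                # Drop the bias row when propagating the error back.
                delta = (weights[k][:-1] @ delta) * acts[k] * (1.0 - acts[k])

        # Apply updates, rescaling earlier layers by 6**depth.
        for depth, (k, g) in enumerate(grads):   # depth 0 = output layer
            scale = (1.0 / EXPECTED_MULTIPLIER) ** depth if rescale else 1.0
            weights[k] -= lr * scale * g

        return 0.5 * np.sum((acts[-1] - t) ** 2)  # squared error

    # Illustrative usage: a 2-4-1 network on XOR (not from the paper).
    rng = np.random.default_rng(0)
    weights = [rng.uniform(-0.5, 0.5, (3, 4)), rng.uniform(-0.5, 0.5, (5, 1))]
    X = np.array([[0., 0.], [0., 1.], [1., 0.], [1., 1.]])
    T = np.array([[0.], [1.], [1.], [0.]])
    for _ in range(5000):
        for x, t in zip(X, T):
            train_step(weights, x, t)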
Recommended Citation
A. K. Rigler et al., "Rescaling of Variables in Back Propagation Learning," Neural Networks, vol. 4, no. 2, pp. 225–229, Elsevier, Jan. 1991.
The definitive version is available at https://doi.org/10.1016/0893-6080(91)90006-Q
Department(s)
Mathematics and Statistics
Keywords and Phrases
Backward error propagation; Layered networks; Preconditioning; Rescaling
International Standard Serial Number (ISSN)
0893-6080
Document Type
Article - Journal
Document Version
Citation
File Type
text
Language(s)
English
Rights
© 2023 Elsevier. All rights reserved.
Publication Date
01 Jan 1991
Comments
Office of Naval Research, Grant N00014-88-K-0659