Design Techniques For The Control Of Errors In Backpropagation Neural Networks
A significant problem in the design and construction of an artificial neural network for function approximation is limiting the magnitude and variance of errors when the network is used in the field. Network errors can occur when the training data does not faithfully represent the required function due to noise or low sampling rates, when the network's flexibility does not match the variability of the data, or when the input data to the resultant network is noisy. This paper reports on several experiments whose purpose was to rank the relative significance of these error sources and thereby find neural network design principles for limiting the magnitude and variance of network errors.
D. C. St Clair et al., "Design Techniques For The Control Of Errors In Backpropagation Neural Networks," Proceedings of SPIE - The International Society for Optical Engineering, vol. 1966, pp. 372-383, Society of Photo-Optical Instrumentation Engineers, Aug. 1993.
The definitive version is available at https://doi.org/10.1117/12.152636
© 2023 Society of Photo-Optical Instrumentation Engineers, All rights reserved.