Abstract

Backpropagation neural networks are trained by adjusting initially random interconnecting weights according to the steepest local error surface gradient. The authors examine the practical implications of this arbitrary starting point on the error landscape for the resulting trained network. The effects on network convergence and performance are tested empirically by varying parameters such as network size, training rate, transfer function, and data representation. The data used are live process control data from an injection molding plant.
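As a minimal illustrative sketch (not the authors' code or settings), the following trains a one-hidden-layer backpropagation network from two different random starting weights, showing how the arbitrary initial point on the error surface can change the converged solution. The network size, learning rate ("training rate"), sigmoid transfer function, and toy data are assumptions standing in for the paper's process control data.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def train(X, y, n_hidden=4, lr=0.5, epochs=2000, seed=0):
    rng = np.random.default_rng(seed)
    # Initially random interconnecting weights (the arbitrary starting point)
    W1 = rng.uniform(-0.5, 0.5, size=(X.shape[1], n_hidden))
    W2 = rng.uniform(-0.5, 0.5, size=(n_hidden, 1))
    for _ in range(epochs):
        # Forward pass through the sigmoid transfer function
        h = sigmoid(X @ W1)
        out = sigmoid(h @ W2)
        # Backpropagate the error: steepest descent on the squared-error surface
        err = y - out
        d_out = err * out * (1 - out)
        d_h = (d_out @ W2.T) * h * (1 - h)
        W2 += lr * h.T @ d_out
        W1 += lr * X.T @ d_h
    return W1, W2, float(np.mean(err ** 2))

# XOR-style toy data stands in for the live injection-molding data
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([[0], [1], [1], [0]], dtype=float)

for seed in (0, 1):
    _, _, mse = train(X, y, seed=seed)
    # Different random starting weights can yield different final errors
    print(f"seed={seed}: final MSE = {mse:.4f}")
```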

Meeting Name

1991 IEEE International Conference on Systems, Man, and Cybernetics, 'Decision Aiding for Complex Systems'

Department(s)

Engineering Management and Systems Engineering

Keywords and Phrases

Backpropagation Error Surface; Data Representation; Error Landscape; Injection Molding Process Control; Learning Systems; Network Convergence; Neural Nets; Neural Networks; Plastics Industry; Process Computer Control; Random Interconnecting Weights; Training Rate; Transfer Function

Document Type

Article - Conference proceedings

Document Version

Final Version

File Type

text

Language(s)

English

Rights

© 1991 Institute of Electrical and Electronics Engineers (IEEE), All rights reserved.

Publication Date

01 Jan 1991
