Ant Colony Optimization Applied to the Training of a High Order Neural Network with Adaptable Exponential Weights
Abstract
High order neural networks (HONNs) are neural networks whose neurons combine their inputs non-linearly. The HONEST (High Order Network with Exponential SynapTic links) network is a HONN that uses neurons with product units and adaptable exponents. The output of a trained HONEST network can be expressed in terms of the network inputs by a polynomial-like equation, which makes the structure of the network more transparent and easier to interpret. This study adapts ACOR, an Ant Colony Optimization algorithm, to the training of an HONEST network. Using a collection of 10 widely-used benchmark datasets, we compare ACOR to the well-known gradient-based Resilient Propagation (R-Prop) algorithm in the training of HONEST networks. We find that our adaptation of ACOR generalizes better on the test sets than R-Prop, although the difference is not statistically significant.
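For readers unfamiliar with product-unit networks, the short Python sketch below illustrates the kind of forward pass the abstract describes: hidden units that raise each input to an adaptable exponent and multiply the results, followed by a linear output. The function name, layer shapes, and the single linear output unit are illustrative assumptions, not the chapter's exact HONEST formulation.

    # Minimal sketch of a HONEST-style forward pass (illustrative assumptions:
    # one linear output unit, strictly positive inputs).
    import numpy as np

    def honest_forward(x, exponents, output_weights, output_bias=0.0):
        # x              : input vector, shape (n_inputs,), assumed positive
        # exponents      : adaptable exponents, shape (n_hidden, n_inputs)
        # output_weights : linear weights on the product units, shape (n_hidden,)

        # Each product unit raises every input to its own adaptable exponent
        # and multiplies the results together.
        hidden = np.prod(x[np.newaxis, :] ** exponents, axis=1)

        # The output is a weighted sum of the product units, which is why a
        # trained network can be read off as a polynomial-like expression in x.
        return output_weights @ hidden + output_bias

    # Example: 2 inputs, 3 product units with randomly initialized parameters
    x = np.array([1.5, 2.0])
    exponents = np.random.uniform(-1.0, 1.0, size=(3, 2))
    output_weights = np.random.uniform(-1.0, 1.0, size=3)
    print(honest_forward(x, exponents, output_weights))

A training algorithm such as ACOR or R-Prop would adjust both the exponents and the output weights to minimize error on the training data.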
Recommended Citation
A. M. Abdelbar et al., "Ant Colony Optimization Applied to the Training of a High Order Neural Network with Adaptable Exponential Weights," Applied Artificial Higher Order Neural Networks for Control and Recognition, pp. 362-374, IGI Global, May 2016.
The definitive version is available at https://doi.org/10.4018/978-1-5225-0063-6.ch014
Department(s)
Electrical and Computer Engineering
Research Center/Lab(s)
Center for High Performance Computing Research
Second Research Center/Lab
Intelligent Systems Center
International Standard Book Number (ISBN)
978-1522500636
Document Type
Book - Chapter
Document Version
Citation
File Type
text
Language(s)
English
Rights
© 2016 IGI Global. All rights reserved.
Publication Date
01 May 2016
Comments
Chapter 14