Abstract
The performance of a neural network classifier depends significantly on its architecture and generalization ability. The proper architecture is usually found by trial and error, which is time consuming and may not yield the optimal network. For this reason, we apply genetic algorithms to the automatic generation of neural networks. Many researchers have shown that combining multiple classifiers improves generalization. One of the most effective combining methods is bagging, in which training sets are selected by resampling from the original training set and the classifiers trained on these sets are combined by voting. We incorporate bagging into the training of evolving neural network classifiers to improve generalization.
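The abstract describes the bagging procedure in three steps: bootstrap-resampled training sets, independently trained classifiers, and a majority vote. Below is a minimal Python sketch of that procedure, assuming integer class labels and using scikit-learn's MLPClassifier as a stand-in for the genetically evolved networks the paper describes; the evolutionary architecture search itself is not shown here.

    import numpy as np
    from sklearn.neural_network import MLPClassifier

    def bagged_predict(X_train, y_train, X_test, n_classifiers=10, seed=0):
        """Bagging sketch: resample, train, and combine by majority vote."""
        rng = np.random.default_rng(seed)
        n = len(X_train)
        votes = []
        for _ in range(n_classifiers):
            # Draw a bootstrap sample of the training set (with replacement).
            idx = rng.integers(0, n, size=n)
            # Stand-in classifier; the paper instead evolves each network's
            # architecture with a genetic algorithm before training.
            clf = MLPClassifier(hidden_layer_sizes=(8,), max_iter=500)
            clf.fit(X_train[idx], y_train[idx])
            votes.append(clf.predict(X_test))
        votes = np.stack(votes).astype(int)  # (n_classifiers, n_test)
        # Majority vote across the ensemble for each test point;
        # assumes non-negative integer class labels.
        return np.apply_along_axis(
            lambda col: np.bincount(col).argmax(), 0, votes
        )

    # Example usage on toy data (hypothetical, for illustration only):
    # from sklearn.datasets import make_classification
    # X, y = make_classification(n_samples=200, random_state=0)
    # preds = bagged_predict(X[:150], y[:150], X[150:])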
Recommended Citation
S. Sohn and C. H. Dagli, "Combining Evolving Neural Network Classifiers Using Bagging," Proceedings of the International Joint Conference on Neural Networks, 2003, Institute of Electrical and Electronics Engineers (IEEE), Jan 2003.
The definitive version is available at https://doi.org/10.1109/IJCNN.2003.1224088
Meeting Name
International Joint Conference on Neural Networks, 2003
Department(s)
Engineering Management and Systems Engineering
Keywords and Phrases
Bagging; Evolving Neural Network Classifiers; Generalisation (Artificial Intelligence); Generalization; Genetic Algorithms; Learning (Artificial Intelligence); Neural Net Architecture; Neural Net Training; Pattern Classification
International Standard Serial Number (ISSN)
1098-7576
Document Type
Article - Conference proceedings
Document Version
Final Version
File Type
text
Language(s)
English
Rights
© 2003 Institute of Electrical and Electronics Engineers (IEEE), All rights reserved.
Publication Date
01 Jan 2003