Extending Extreme Learning Machine with Combination Layer
Abstract
We consider the Extreme Learning Machine model for accurate regression estimation and the related problem of selecting the appropriate number of neurons for the model. Selection strategies that choose "the best" model from a set of candidate network structures neglect the issue of model selection uncertainty. To alleviate the problem, we propose to remove this selection phase with a combination layer that takes into account all considered models. The method proposed in this paper is the Extreme Learning Machine (Jackknife Model Averaging), where Jackknife Model Averaging is a combination method based on leave-one-out residuals of linear models. The combination approach is shown to have better predictive performance on several real-world data sets. © 2013 Springer-Verlag Berlin Heidelberg.
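The following is a minimal sketch, not the authors' reference implementation, of the idea summarized in the abstract: several ELM candidates with different hidden-layer sizes are built, the leave-one-out residuals of each linear output layer are obtained from the hat matrix, and Jackknife Model Averaging weights on the simplex are found by minimizing the squared norm of the weighted LOO residuals. Function names such as `elm_hidden`, `loo_residuals`, and `jma_weights`, and the tanh activation and small ridge term, are illustrative assumptions.

```python
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(0)

def elm_hidden(X, n_neurons, rng):
    """Random-feature hidden layer of an ELM (tanh activation, assumed here)."""
    W = rng.normal(size=(X.shape[1], n_neurons))
    b = rng.normal(size=n_neurons)
    return np.tanh(X @ W + b)

def loo_residuals(H, y, ridge=1e-8):
    """Leave-one-out residuals of the linear output layer via the hat matrix."""
    A = H @ np.linalg.solve(H.T @ H + ridge * np.eye(H.shape[1]), H.T)
    e = y - A @ y                      # ordinary residuals
    return e / (1.0 - np.diag(A))      # LOO residuals: e_i / (1 - h_ii)

def jma_weights(E):
    """Simplex weights minimising ||E w||^2, E being the n x M matrix of LOO residuals."""
    M = E.shape[1]
    obj = lambda w: np.sum((E @ w) ** 2)
    cons = ({'type': 'eq', 'fun': lambda w: np.sum(w) - 1.0},)
    res = minimize(obj, np.full(M, 1.0 / M), bounds=[(0.0, 1.0)] * M,
                   constraints=cons, method='SLSQP')
    return res.x

# Toy regression data (for illustration only)
X = rng.normal(size=(200, 5))
y = np.sin(X[:, 0]) + 0.1 * rng.normal(size=200)

# Candidate ELMs differing in the number of hidden neurons
sizes = [5, 10, 20, 40]
hidden = [elm_hidden(X, m, rng) for m in sizes]
E = np.column_stack([loo_residuals(H, y) for H in hidden])
w = jma_weights(E)

# Combine all candidates instead of selecting a single "best" network
preds = np.column_stack([H @ np.linalg.lstsq(H, y, rcond=None)[0] for H in hidden])
y_hat = preds @ w
print("JMA weights:", np.round(w, 3))
```

In this sketch the combination layer is just the weight vector `w`; at prediction time each candidate ELM produces its output and the results are averaged with these fixed weights.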
Recommended Citation
D. Sovilj et al., "Extending Extreme Learning Machine with Combination Layer," Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics), vol. 7902 LNCS, no. PART 1, pp. 417–426, Springer, Jul 2013.
The definitive version is available at https://doi.org/10.1007/978-3-642-38679-4_41
Department(s)
Engineering Management and Systems Engineering
International Standard Book Number (ISBN)
978-3-642-38678-7
International Standard Serial Number (ISSN)
1611-3349; 0302-9743
Document Type
Article - Conference proceedings
Document Version
Citation
File Type
text
Language(s)
English
Rights
© 2024 Springer. All rights reserved.
Publication Date
17 Jul 2013