Extending Extreme Learning Machine with Combination Layer

Abstract

We consider the Extreme Learning Machine (ELM) model for accurate regression estimation and the related problem of selecting the appropriate number of neurons for the model. Selection strategies that choose "the best" model from a set of candidate network structures neglect the issue of model selection uncertainty. To alleviate this problem, we propose to replace the selection phase with a combination layer that takes all considered models into account. The method proposed in this paper is the Extreme Learning Machine with Jackknife Model Averaging, where Jackknife Model Averaging is a combination method based on the leave-one-out residuals of linear models. The combination approach is shown to have better predictive performance on several real-world data sets. © 2013 Springer-Verlag Berlin Heidelberg.
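As a rough illustration of the idea described in the abstract, the sketch below combines several candidate ELMs (differing only in hidden-layer size) with Jackknife Model Averaging instead of picking a single "best" network. It assumes sigmoid hidden units, an ordinary least-squares output layer, the standard hat-matrix shortcut for leave-one-out residuals of linear models, and a simplex-constrained quadratic program for the combination weights. Function names such as elm_hidden and jma_weights are illustrative and not taken from the paper.

import numpy as np
from scipy.optimize import minimize

def elm_hidden(X, W, b):
    # Sigmoid activations of the random hidden layer (weights W, biases b).
    return 1.0 / (1.0 + np.exp(-(X @ W + b)))

def loo_residuals(H, y):
    # The ELM output layer is linear in the hidden activations, so its
    # leave-one-out residuals have the closed form e_i / (1 - h_ii),
    # where h_ii is the i-th diagonal entry of the hat matrix H * pinv(H).
    pinv = np.linalg.pinv(H)
    beta = pinv @ y
    hat_diag = np.einsum('ij,ji->i', H, pinv)
    e = y - H @ beta
    return e / (1.0 - hat_diag), beta

def jma_weights(E):
    # E has one column of leave-one-out residuals per candidate model.
    # Jackknife Model Averaging minimizes ||E w||^2 over the probability simplex.
    M = E.shape[1]
    S = E.T @ E
    cons = [{'type': 'eq', 'fun': lambda w: np.sum(w) - 1.0}]
    res = minimize(lambda w: w @ S @ w, np.full(M, 1.0 / M),
                   method='SLSQP', bounds=[(0.0, 1.0)] * M, constraints=cons)
    return res.x

# Toy usage: candidate ELMs with different numbers of hidden neurons.
rng = np.random.default_rng(0)
X = rng.uniform(-1, 1, size=(200, 3))
y = np.sin(X[:, 0]) + 0.5 * X[:, 1] ** 2 + 0.1 * rng.standard_normal(200)

candidates, loo = [], []
for L in (5, 10, 20, 40):
    W = rng.standard_normal((3, L))
    b = rng.standard_normal(L)
    H = elm_hidden(X, W, b)
    e_loo, beta = loo_residuals(H, y)
    candidates.append((W, b, beta))
    loo.append(e_loo)

w = jma_weights(np.column_stack(loo))

def predict(X_new):
    # Combination layer: weighted average of all candidate ELM predictions.
    preds = [elm_hidden(X_new, W, b) @ beta for (W, b, beta) in candidates]
    return np.column_stack(preds) @ w

print("combination weights:", np.round(w, 3))

Under these assumptions the combination layer never discards a candidate outright; models with poor leave-one-out residuals simply receive weights near zero.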

Department(s)

Engineering Management and Systems Engineering

International Standard Book Number (ISBN)

978-364238678-7

International Standard Serial Number (ISSN)

1611-3349; 0302-9743

Document Type

Article - Conference proceedings

Document Version

Citation

File Type

text

Language(s)

English

Rights

© 2024 Springer, All rights reserved.

Publication Date

17 Jul 2013
