Feature Selection for Nonlinear Models with Extreme Learning Machines
Abstract
In the context of feature selection, there is a trade-off between the number of selected features and the generalisation error. Two plots may help to summarise feature selection: the feature selection path and the sparsity-error trade-off curve. The feature selection path shows the best feature subset for each subset size, whereas the sparsity-error trade-off curve shows the corresponding generalisation errors. These graphical tools may help experts to choose suitable feature subsets and extract useful domain knowledge. In order to obtain these tools, extreme learning machines are used here, since they are fast to train and an estimate of their generalisation error can easily be obtained using the PRESS statistic. An algorithm is introduced, which adds an additional layer to standard extreme learning machines in order to optimise the subset of selected features. Experimental results illustrate the quality of the presented method. © 2012 Elsevier B.V.
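As a rough illustration of the ingredients named in the abstract, the sketch below trains a basic extreme learning machine, estimates its leave-one-out error with the PRESS statistic, and uses a simple greedy forward search as a stand-in for the paper's feature-subset optimisation layer to trace a feature selection path and the matching sparsity-error trade-off curve. The function names (elm_press_error, greedy_selection_path), the hidden-layer size, and the greedy search strategy are assumptions for illustration; they are not the algorithm proposed in the article.

```python
import numpy as np

def elm_press_error(X, y, n_hidden=50, seed=0):
    """Fit a basic ELM (random hidden layer, least-squares output weights)
    and return the PRESS leave-one-out mean squared error.
    Generic sketch, not the paper's exact formulation."""
    rng = np.random.default_rng(seed)
    n, d = X.shape
    W = rng.standard_normal((d, n_hidden))   # random input weights (not trained)
    b = rng.standard_normal(n_hidden)        # random hidden biases
    H = np.tanh(X @ W + b)                   # hidden-layer activations
    H_pinv = np.linalg.pinv(H)               # Moore-Penrose pseudo-inverse
    beta = H_pinv @ y                        # output weights via least squares
    residuals = y - H @ beta
    hat_diag = np.einsum('ij,ji->i', H, H_pinv)   # diagonal of the hat matrix
    loo_residuals = residuals / (1.0 - hat_diag)  # PRESS residuals
    return float(np.mean(loo_residuals ** 2))

def greedy_selection_path(X, y):
    """Greedy forward selection (a simplified stand-in for the subset
    optimisation described in the abstract). Returns the feature selection
    path (best subset found for each size) and the sparsity-error trade-off
    curve (the corresponding PRESS errors)."""
    remaining = list(range(X.shape[1]))
    selected, path, errors = [], [], []
    while remaining:
        best_feat, best_err = None, np.inf
        for f in remaining:
            err = elm_press_error(X[:, selected + [f]], y)
            if err < best_err:
                best_feat, best_err = f, err
        selected.append(best_feat)
        remaining.remove(best_feat)
        path.append(list(selected))
        errors.append(best_err)
    return path, errors
```

Plotting subset size against the returned errors gives the sparsity-error trade-off curve, while the list of subsets gives the feature selection path; in the article these are obtained with the proposed ELM-based optimisation rather than a greedy search.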
Recommended Citation
F. Benoît et al., "Feature Selection for Nonlinear Models with Extreme Learning Machines," Neurocomputing, vol. 102, pp. 111-124, Elsevier, Feb 2013.
The definitive version is available at https://doi.org/10.1016/j.neucom.2011.12.055
Department(s)
Engineering Management and Systems Engineering
Keywords and Phrases
Extreme learning machines; Feature selection; Regression; Regularisation
International Standard Serial Number (ISSN)
1872-8286; 0925-2312
Document Type
Article - Journal
Document Version
Citation
File Type
text
Language(s)
English
Rights
© 2024 Elsevier, All rights reserved.
Publication Date
15 Feb 2013