Abstract
Feature selection is essential in many machine learning problems, but it is often not clear on which grounds variables should be included or excluded. This paper shows that the mean squared leave-one-out error of the first-nearest-neighbour estimator is effective as a cost function when selecting input variables for regression tasks. A theoretical analysis of the estimator's properties is presented to support its use for feature selection. An experimental comparison to alternative selection criteria (including mutual information, least angle regression, and the RReliefF algorithm) demonstrates reliable performance on several regression tasks.
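The criterion described in the abstract, often called the Delta test, can be illustrated with a short sketch. The following is a minimal illustration, not the authors' implementation: `delta_test` computes half the mean squared difference between each target and the target of its first nearest neighbour in input space (a leave-one-out estimate, since each point is excluded from its own neighbour search), and `select_features` is a hypothetical exhaustive search that picks the variable subset minimising that score.

```python
import numpy as np
from itertools import combinations

def delta_test(X, y):
    """Delta test: half the mean squared difference between each target
    and the target of its first nearest neighbour in input space."""
    X = np.asarray(X, dtype=float)
    y = np.asarray(y, dtype=float)
    # Pairwise squared Euclidean distances between all input points.
    d2 = ((X[:, None, :] - X[None, :, :]) ** 2).sum(axis=-1)
    # Exclude each point from its own neighbour search (leave-one-out).
    np.fill_diagonal(d2, np.inf)
    nn = d2.argmin(axis=1)  # index of the first nearest neighbour
    return 0.5 * np.mean((y - y[nn]) ** 2)

def select_features(X, y):
    """Exhaustive search for the feature subset with the lowest Delta test.
    Feasible only for a small number of candidate features."""
    n_features = X.shape[1]
    best_score, best_subset = np.inf, None
    for k in range(1, n_features + 1):
        for subset in combinations(range(n_features), k):
            score = delta_test(X[:, list(subset)], y)
            if score < best_score:
                best_score, best_subset = score, subset
    return best_score, best_subset
```

On data where the target depends on only one input, the Delta test computed on that input alone is typically much smaller than on an irrelevant noise input, which is what makes it usable as a selection criterion. Exhaustive search is exponential in the number of features; the paper's setting would in practice pair the criterion with a search heuristic for larger problems.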
Recommended Citation
E. Eirola et al., "The Delta Test: The 1-NN Estimator as a Feature Selection Criterion," Proceedings of the International Joint Conference on Neural Networks, pp. 4214-4222, article no. 6889560, Institute of Electrical and Electronics Engineers, Sep 2014.
The definitive version is available at https://doi.org/10.1109/IJCNN.2014.6889560
Department(s)
Engineering Management and Systems Engineering
International Standard Book Number (ISBN)
978-1-4799-1484-5
Document Type
Article - Conference proceedings
Document Version
Citation
File Type
text
Language(s)
English
Rights
© 2024 Institute of Electrical and Electronics Engineers. All rights reserved.
Publication Date
03 Sep 2014