Abstract

Feature selection is essential in many machine learning problems, but it is often not clear on which grounds variables should be included or excluded. This paper shows that the mean squared leave-one-out error of the first nearest-neighbour estimator is effective as a cost function when selecting input variables for regression tasks. A theoretical analysis of the estimator's properties is presented to support its use for feature selection. An experimental comparison to alternative selection criteria (including mutual information, least angle regression, and the RReliefF algorithm) demonstrates reliable performance on several regression tasks.
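The criterion the abstract describes can be sketched in a few lines: score a candidate feature subset by the mean squared leave-one-out error of the 1-nearest-neighbour estimator, then search over subsets. The sketch below is illustrative only and assumes a simple greedy forward search (the paper's own search strategy and implementation details are not given here); the function names `loo_1nn_mse` and `forward_select` are hypothetical.

```python
import numpy as np

def loo_1nn_mse(X, y):
    # Mean squared leave-one-out error of the first nearest-neighbour
    # estimator: each point is predicted by the target of its nearest
    # *other* point in the candidate feature subspace.
    D = np.sum((X[:, None, :] - X[None, :, :]) ** 2, axis=-1)
    np.fill_diagonal(D, np.inf)  # a point may not be its own neighbour
    nn = np.argmin(D, axis=1)
    return np.mean((y - y[nn]) ** 2)

def forward_select(X, y, k):
    # Hypothetical greedy forward search: at each step, add the feature
    # that most reduces the LOO 1-NN cost of the current subset.
    selected, remaining = [], list(range(X.shape[1]))
    for _ in range(k):
        best = min(remaining,
                   key=lambda j: loo_1nn_mse(X[:, selected + [j]], y))
        selected.append(best)
        remaining.remove(best)
    return selected
```

On synthetic data where the target depends on only one input, the criterion ranks that input first, since irrelevant features inflate nearest-neighbour distances without improving the prediction.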

Department(s)

Engineering Management and Systems Engineering

International Standard Book Number (ISBN)

978-147991484-5

Document Type

Article - Conference proceedings

Document Version

Citation

File Type

text

Language(s)

English

Rights

© 2024 Institute of Electrical and Electronics Engineers, All rights reserved.

Publication Date

03 Sep 2014
