A Faster Model Selection Criterion for OP-ELM and OP-KNN: Hannan-Quinn Criterion

Abstract

The Optimally Pruned Extreme Learning Machine (OP-ELM) and Optimally Pruned K-Nearest Neighbors (OP-KNN) algorithms use a similar methodology based on random initialization (OP-ELM) or KNN initialization (OP-KNN) of a feedforward neural network, followed by a ranking of the neurons; the ranking is used to determine the best combination of neurons to retain. This is achieved by leave-one-out (LOO) cross-validation. This article proposes to use the Hannan-Quinn (HQ) criterion as the model selection criterion instead of LOO. It proved to be efficient and as good as the LOO criterion for both OP-ELM and OP-KNN, while decreasing computation times by factors of four to five for OP-ELM and up to 24 for OP-KNN.
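As a minimal sketch of the idea described in the abstract (not the authors' implementation), the snippet below assumes the common form of the Hannan-Quinn criterion, HQ = n ln(RSS/n) + 2k ln(ln n), and uses it to decide how many of the already-ranked hidden neurons to keep; the function and variable names are hypothetical.

```python
import numpy as np

def hannan_quinn(residuals, n_params):
    """Hannan-Quinn criterion: n*ln(RSS/n) + 2*k*ln(ln(n)).
    Lower values indicate a better fit/complexity trade-off."""
    n = len(residuals)
    rss = np.sum(residuals ** 2)
    return n * np.log(rss / n) + 2 * n_params * np.log(np.log(n))

def select_n_neurons(H_ranked, y):
    """Choose how many ranked neurons to retain by minimizing HQ.

    H_ranked: (n_samples, n_neurons) hidden-layer outputs whose columns
    are already ordered from most to least useful by the ranking step.
    """
    best_k, best_hq = None, np.inf
    for k in range(1, H_ranked.shape[1] + 1):
        Hk = H_ranked[:, :k]
        # Least-squares output weights for the first k ranked neurons.
        beta, *_ = np.linalg.lstsq(Hk, y, rcond=None)
        hq = hannan_quinn(y - Hk @ beta, k)
        if hq < best_hq:
            best_k, best_hq = k, hq
    return best_k
```

Unlike LOO cross-validation, this selection requires only one least-squares fit per candidate size, which is the source of the speed-up reported in the abstract.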

Department(s)

Engineering Management and Systems Engineering

International Standard Book Number (ISBN)

978-293030709-1

Document Type

Article - Conference proceedings

Document Version

Citation

File Type

text

Language(s)

English

Rights

© 2024 European Symposium on Artificial Neural Networks, All rights reserved.

Publication Date

01 Dec 2009
