Compressive ELM: Improved Models through Exploiting Time-Accuracy Trade-Offs

Abstract

In the training of neural networks, there often exists a trade-off between the time spent optimizing the model under investigation and its final performance. Ideally, an optimization algorithm finds, as fast as possible, the model from the hypothesis space with the best test accuracy, and this model is also efficient to evaluate at test time. In practice, however, there exists a trade-off between training time, testing time and testing accuracy, and the optimal trade-off depends on the user's requirements. This paper proposes the Compressive Extreme Learning Machine, which allows for a time-accuracy trade-off by training the model in a reduced space. Experiments indicate that this trade-off is efficient in the sense that, on average, more time can be saved than accuracy lost. It therefore provides a mechanism that can yield better models in less time. © Springer International Publishing Switzerland 2014.
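The abstract describes training the ELM in a reduced space; the sketch below illustrates that general idea only, assuming a Gaussian (Johnson-Lindenstrauss-style) random projection of the hidden-layer matrix before the output weights are solved by least squares. All names, dimensions, and data here are illustrative assumptions and are not taken from the paper.

    # Minimal sketch (assumed, not the paper's implementation): an ELM whose
    # hidden-layer matrix is compressed with a random projection before the
    # output weights are fit.
    import numpy as np

    rng = np.random.default_rng(0)

    # Toy regression data, purely for illustration.
    X = rng.standard_normal((1000, 20))
    y = np.sin(X[:, 0]) + 0.1 * rng.standard_normal(1000)

    n_hidden, n_reduced = 500, 100        # reduced space is much smaller

    # Standard ELM step: random input weights, fixed nonlinearity.
    W_in = rng.standard_normal((20, n_hidden))
    b = rng.standard_normal(n_hidden)
    H = np.tanh(X @ W_in + b)             # hidden-layer output matrix

    # "Compressive" step (assumed): project H to a lower-dimensional space
    # with a Gaussian random matrix, then solve least squares there.
    R = rng.standard_normal((n_hidden, n_reduced)) / np.sqrt(n_reduced)
    H_red = H @ R
    beta_red, *_ = np.linalg.lstsq(H_red, y, rcond=None)

    # Prediction reuses the same random projection at test time.
    y_hat = (np.tanh(X @ W_in + b) @ R) @ beta_red
    print("train MSE:", np.mean((y_hat - y) ** 2))

Solving the least-squares problem over n_reduced columns instead of n_hidden is what trades a (typically small) amount of accuracy for shorter training and testing time, which is the trade-off the abstract refers to.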

Department(s)

Engineering Management and Systems Engineering

Keywords and Phrases

approximate matrix decompositions; compressive sensing; ELM; Extreme Learning Machine; Johnson-Lindenstrauss; random projection

International Standard Book Number (ISBN)

978-3-319-11070-7

International Standard Serial Number (ISSN)

1865-0929

Document Type

Article - Conference proceedings

Document Version

Citation

File Type

text

Language(s)

English

Rights

© 2024 Springer. All rights reserved.

Publication Date

01 Jan 2014
