Interpreting Extreme Learning Machine as an Approximation to an Infinite Neural Network

Abstract

Extreme Learning Machine (ELM) is a neural network architecture in which hidden layer weights are randomly chosen and output layer weights are determined analytically. We interpret ELM as an approximation to a network with an infinite number of hidden units. The operation of the infinite network is captured by the neural network kernel (NNK). We compare ELM and NNK both as part of a kernel method and in neural network contexts. Insights gained from this analysis lead us to strongly recommend model selection also on the variance of the ELM hidden layer weights, and not only on the number of hidden units, as is usually done with ELM. We also discuss some properties of ELM which may have been too strongly interpreted in previous works.
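
The record contains only the abstract, so the following is an illustrative sketch rather than the authors' implementation. It shows the two ingredients the abstract describes: an ELM whose hidden layer weights are drawn at random with a tunable variance (the hyperparameter the abstract recommends including in model selection) and whose output weights are solved analytically, plus one standard closed-form infinite-network kernel (the erf-activation kernel of Williams, 1998) as a concrete stand-in for the NNK. The function names (elm_fit, elm_predict, nnk_erf), the tanh/erf activations, and the parameters n_hidden, weight_std, and reg are assumptions made here for illustration, not values or definitions taken from the paper.

```python
# Illustrative sketch only; not the authors' code.
import numpy as np

def elm_fit(X, y, n_hidden=200, weight_std=1.0, reg=1e-6, rng=None):
    """Train a single-hidden-layer ELM with tanh hidden units."""
    rng = np.random.default_rng(rng)
    d = X.shape[1]
    # Randomly chosen hidden-layer weights and biases; weight_std controls
    # the weight variance the abstract suggests tuning alongside n_hidden.
    W = rng.normal(0.0, weight_std, size=(d, n_hidden))
    b = rng.normal(0.0, weight_std, size=n_hidden)
    H = np.tanh(X @ W + b)                       # hidden-layer activations
    # Output weights determined analytically via a regularized least-squares solve.
    beta = np.linalg.solve(H.T @ H + reg * np.eye(n_hidden), H.T @ y)
    return W, b, beta

def elm_predict(X, W, b, beta):
    return np.tanh(X @ W + b) @ beta

def nnk_erf(X1, X2, weight_std=1.0):
    """Kernel of an infinite network of erf hidden units with N(0, weight_std**2)
    weights (Williams, 1998); assumed here as one concrete instance of an NNK."""
    aug = lambda X: np.hstack([np.ones((X.shape[0], 1)), X])  # prepend bias input
    A, B = aug(X1), aug(X2)
    s2 = weight_std ** 2
    num = 2.0 * s2 * (A @ B.T)
    den = np.sqrt((1.0 + 2.0 * s2 * np.sum(A * A, axis=1))[:, None]
                  * (1.0 + 2.0 * s2 * np.sum(B * B, axis=1))[None, :])
    return (2.0 / np.pi) * np.arcsin(num / den)
```

In this reading, increasing n_hidden moves the random-feature model toward the kernel limit, while weight_std plays the same role in both elm_fit and nnk_erf, which is why treating it as a model-selection parameter (as the abstract recommends) is natural.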

Department(s)

Engineering Management and Systems Engineering

Keywords and Phrases

Extreme learning machine (ELM); Neural network kernel

International Standard Book Number (ISBN)

978-989842528-7

Document Type

Article - Conference proceedings

Document Version

Citation

File Type

text

Language(s)

English

Rights

© 2024 SciTePress. All rights reserved.

Publication Date

01 Dec 2010
