Abstract

Word embeddings are low-dimensional vector representations of words that incorporate context. Two popular methods are word2vec and Global Vectors (GloVe). Word2vec is a single-hidden-layer feedforward neural network (SLFN) with an auto-encoder influence that computes a word-context matrix, using backpropagation for training. GloVe computes the word-context matrix first and then performs matrix factorization on it to arrive at word embeddings. Backpropagation, the typical training method for SLFNs, is time consuming and requires iterative tuning. Extreme learning machines (ELM) retain the universal approximation capability of SLFNs while using a randomly generated hidden-layer weight matrix in lieu of backpropagation. In this research, we propose an efficient method for generating word embeddings that uses an ELM-based auto-encoder architecture operating on a word-context matrix. Word similarity is evaluated using the cosine similarity measure on a dozen test words, and the results are reported.
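
The following is a minimal sketch of the pipeline the abstract describes, assuming a dense word-context (co-occurrence) matrix, a tanh activation, and the common ELM-AE choice of projecting the data through the transposed output weights; the function names, window size, hidden-layer size, and regularization parameter C are illustrative assumptions, not the paper's exact settings.

    import numpy as np

    def build_word_context_matrix(tokens, vocab, window=2):
        # Symmetric co-occurrence counts within a sliding window.
        idx = {w: i for i, w in enumerate(vocab)}
        X = np.zeros((len(vocab), len(vocab)))
        for i, w in enumerate(tokens):
            if w not in idx:
                continue
            lo, hi = max(0, i - window), min(len(tokens), i + window + 1)
            for j in range(lo, hi):
                if j != i and tokens[j] in idx:
                    X[idx[w], idx[tokens[j]]] += 1.0
        return X

    def elm_autoencoder_embeddings(X, n_hidden=50, C=1.0, seed=0):
        # ELM auto-encoder: the hidden-layer weights are drawn at random
        # and never trained; only the output weights beta are fit, in
        # closed form, by ridge-regularized least squares (no
        # backpropagation or iterative tuning).
        rng = np.random.default_rng(seed)
        W = rng.standard_normal((X.shape[1], n_hidden))  # random input weights
        b = rng.standard_normal(n_hidden)                # random biases
        H = np.tanh(X @ W + b)                           # hidden activations
        beta = np.linalg.solve(H.T @ H + np.eye(n_hidden) / C, H.T @ X)
        return X @ beta.T                                # one row per word

    def cosine_similarity(u, v):
        return float(u @ v / (np.linalg.norm(u) * np.linalg.norm(v)))

    # Toy usage: embeddings from a tiny corpus, then pairwise similarity.
    corpus = "the quick brown fox jumps over the lazy dog".split()
    vocab = sorted(set(corpus))
    E = elm_autoencoder_embeddings(build_word_context_matrix(corpus, vocab))
    i, j = vocab.index("quick"), vocab.index("lazy")
    print(cosine_similarity(E[i], E[j]))

Because beta has a closed-form solution, training cost is dominated by one matrix solve rather than many gradient epochs, which is the efficiency argument made in the abstract.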

Department(s)

Engineering Management and Systems Engineering

Comments

National Science Foundation, Grant NSF-1614024

International Standard Book Number (ISBN)

978-1-5090-6181-5

Document Type

Article - Conference proceedings

Document Version

Citation

File Type

text

Language(s)

English

Rights

© 2017 Institute of Electrical and Electronics Engineers, All rights reserved.

Publication Date

30 Jun 2017
