Abstract
Word embeddings are a low-dimensional vector representation of words that incorporates context. Two popular methods are word2vec and Global Vectors (GloVe). Word2vec is a single-hidden-layer feedforward neural network (SLFN) with an auto-encoder influence that computes a word-context matrix, trained with backpropagation. GloVe computes the word-context matrix first and then performs matrix factorization on it to arrive at word embeddings. Backpropagation, the typical training method for SLFNs, is time consuming and requires iterative tuning. Extreme learning machines (ELMs) retain the universal approximation capability of SLFNs while replacing backpropagation with a randomly generated hidden-layer weight matrix. In this research, we propose an efficient method for generating word embeddings that uses an ELM-based auto-encoder architecture operating on a word-context matrix. Word similarity is evaluated with the cosine similarity measure on a dozen assorted words, and the results are reported.
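To illustrate the approach the abstract describes, below is a minimal Python sketch of an ELM auto-encoder applied to a precomputed word-context matrix. It assumes a common ELM-AE formulation (random hidden-layer weights and biases, a tanh activation, least-squares output weights, and the X beta^T data representation); the function names, hidden dimensionality, and activation are illustrative assumptions, not the paper's exact configuration.

import numpy as np

def elm_autoencoder_embeddings(X, n_hidden=100, seed=0):
    # X: word-context matrix, rows = words, columns = context features.
    # (How X is built is not specified here; this is an assumption.)
    rng = np.random.default_rng(seed)
    n_features = X.shape[1]
    # Randomly generated hidden-layer weights and biases -- no backpropagation.
    W = rng.standard_normal((n_features, n_hidden))
    b = rng.standard_normal(n_hidden)
    H = np.tanh(X @ W + b)  # hidden-layer activations
    # Output weights by least squares so that H @ beta approximates X
    # (the auto-encoding step).
    beta, *_ = np.linalg.lstsq(H, X, rcond=None)
    # Common ELM-AE representation: project inputs through beta^T,
    # giving one n_hidden-dimensional embedding per word.
    return X @ beta.T

def cosine_similarity(u, v):
    # Cosine similarity between two embedding vectors.
    return u @ v / (np.linalg.norm(u) * np.linalg.norm(v))

# Usage: E = elm_autoencoder_embeddings(X); cosine_similarity(E[i], E[j])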
Recommended Citation
P. Lauren et al., "A Low-Dimensional Vector Representation for Words using an Extreme Learning Machine," Proceedings of the International Joint Conference on Neural Networks, pp. 1817-1822, article no. 7966071, Institute of Electrical and Electronics Engineers, Jun 2017.
The definitive version is available at https://doi.org/10.1109/IJCNN.2017.7966071
Department(s)
Engineering Management and Systems Engineering
International Standard Book Number (ISBN)
978-1-5090-6181-5
Document Type
Article - Conference proceedings
Document Version
Citation
File Type
text
Language(s)
English
Rights
© 2017 Institute of Electrical and Electronics Engineers, All rights reserved.
Publication Date
30 Jun 2017
Comments
National Science Foundation, Grant NSF-1614024