Abstract

Representation learning is a technique used to capture the underlying latent features of complex data. Representation learning on networks has been widely applied to learn network structure and embed it in a low-dimensional vector space. In recent years, network embedding using representation learning has attracted increasing attention, and many deep architectures have been proposed. However, existing network embedding techniques ignore the multi-class spatial and temporal relationships that crucially reflect the complex nature of vertices and links in spatiotemporal heterogeneous information networks (SHINs).

To address this problem, in this paper we present two types of collective representation learning models for spatiotemporal heterogeneous information network embedding (SHNE). 1) We propose a model called Multilingual SHNE (M-SHNE); the proposed model leverages random walks together with the multilingual word embedding technique used in natural language processing (NLP) to collectively learn the spatiotemporal proximity measures between vertices in SHINs and preserve them in a low-dimensional vector space. 2) We propose a second method called Meta path Constrained Random walk SHNE (MCR-SHNE), which combines the advantages of a meta path counting algorithm, path constrained random walks, and word embedding techniques to generate low-dimensional embeddings that preserve the spatiotemporal proximity measures in SHINs. Experimental results demonstrate the effectiveness of our two proposed models over state-of-the-art algorithms on real-world datasets.

Meeting Name

27th ACM SIGSPATIAL International Conference on Advances in Geographic Information Systems, ACM SIGSPATIAL GIS 2019 (2019: Nov. 5-8, Chicago, IL)

Department(s)

Computer Science

Keywords and Phrases

Distributional semantics; Meta paths; Proximity measures; Random walks; Semantic representation

International Standard Book Number (ISBN)

978-1-4503-6909-1

Document Type

Article - Conference proceedings

Document Version

Final Version

File Type

text

Language(s)

English

Rights

© 2019 The Authors. All rights reserved.

Publication Date

01 Nov 2019
