Doctoral Dissertations

Keywords and Phrases

Natural language processing; Representation learning; Spatiotemporal heterogeneous networks


“The problem of learning latent representations of heterogeneous networks with spatial and temporal attributes has been gaining traction in recent years, given its myriad real-world applications. Most systems in fields such as transportation, urban economics, medical information, and online e-commerce handle big data that can be structured into Spatiotemporal Heterogeneous Networks (SHNs), making efficient analysis of these networks extremely vital. In recent years, representation learning models have proven quite effective at capturing lower-dimensional representations of data. However, capturing efficient representations of SHNs continues to pose a challenge for the following reasons: (i) spatiotemporal data structured as an SHN encapsulates the complex spatial and temporal relationships that exist among real-world objects, rendering traditional feature engineering approaches inefficient and compute-intensive; (ii) due to the unique nature of SHNs, existing representation learning techniques cannot be directly adopted to capture their representations.

To address the problem of learning representations of SHNs, four novel frameworks that focus on their unique spatial and temporal characteristics are introduced: (i) collective representation learning, which quantifies the importance of each latent feature using Laplacian scores; (ii) modality-aware representation learning, which learns from complex user mobility patterns; (iii) distributed representation learning, which learns human mobility patterns by leveraging Natural Language Processing algorithms; and (iv) representation learning with node sense disambiguation, which learns contrastive senses of nodes in SHNs. The developed frameworks can help capture higher-order spatial and temporal interactions in real-world SHNs. Through data-driven simulations, machine learning and deep learning models trained on the representations learned by the developed frameworks are shown to be much more efficient and effective”--Abstract, page iii.


Advisor(s)

Leopold, Jennifer

Committee Member(s)

McMillin, Bruce M.
Sabharwal, Chaman
Morales, Ricardo
Canfield, Casey I.


Department

Computer Science

Degree Name

Ph.D. in Computer Science


Publisher

Missouri University of Science and Technology

Publication Date

Spring 2022


Pagination

xii, 104 pages

Note about bibliography

Includes bibliographic references (pages 97-103).


© 2022 Dakshak Keerthi Chandra, All rights reserved.

Document Type

Dissertation - Open Access

File Type




Thesis Number

T 12116