Maximizing Joint Data Rate and Resource Efficiency in D2D-IoT Enabled Multi-Tier Networks
Next-generation wireless networks are expected to be highly dense, with a large number of Device-to-Device (D2D) communication-enabled IoT devices operating in fog-computing-based cellular networks. This dense, heterogeneous network architecture is expected to fulfill smart devices' growing data demands while lowering power consumption and meeting latency constraints. 5G is expected to adopt such a multi-tier architecture, comprising IoT devices with varying computational capabilities and radio interfaces. However, the coexistence of such heterogeneous network models spawns research challenges such as interference management, non-uniform computational capacity, non-uniform device connectivity, and service deadlines. Thus, in this paper, we propose the coexistence of D2D-IoT (D-IoT) and fog computing in cellular networks, formulate resource allocation in this multi-tier architecture as an optimization problem that jointly considers different kinds of interference, data rate, and latency, and further propose a distributed many-to-many stable-matching-based solution. Through extensive theoretical and simulation analysis, we show the effect of different parameters on the resource allocation objectives and achieve more than 94% of the optimal network performance.
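The abstract does not detail the paper's distributed many-to-many stable matching; as a minimal illustration of the general technique, the sketch below runs a quota-constrained deferred-acceptance matching between devices and resources. All names, preference structures, and quotas here are hypothetical and are not taken from the paper.

```python
def many_to_many_match(dev_prefs, res_prefs, dev_quota, res_quota):
    """Quota-constrained deferred acceptance (illustrative sketch).

    dev_prefs[d]: list of resources, most preferred first.
    res_prefs[r]: dict mapping device -> rank (lower = preferred).
    dev_quota[d] / res_quota[r]: max matches per device / resource.
    """
    matched = {d: set() for d in dev_prefs}   # current matches per device
    held = {r: set() for r in res_prefs}      # proposals a resource holds
    next_idx = {d: 0 for d in dev_prefs}      # next proposal index per device
    free = list(dev_prefs)                    # devices with quota unfilled

    while free:
        d = free.pop()
        # Device d proposes down its list until its quota fills or it runs out.
        while len(matched[d]) < dev_quota[d] and next_idx[d] < len(dev_prefs[d]):
            r = dev_prefs[d][next_idx[d]]
            next_idx[d] += 1
            held[r].add(d)
            matched[d].add(r)
            if len(held[r]) > res_quota[r]:
                # Resource r over quota: reject its least-preferred holder.
                worst = max(held[r], key=lambda x: res_prefs[r][x])
                held[r].remove(worst)
                matched[worst].discard(r)
                if worst != d and worst not in free:
                    free.append(worst)  # rejected device proposes again later
    return matched
```

For example, with two devices that both rank `r1` first but where `r1` prefers `d1` and `r2` prefers `d2` (unit resource quotas), deferred acceptance settles on `d1 -> r1` and `d2 -> r2`; in a distributed setting each device runs its proposal loop locally, exchanging only accept/reject messages with resources.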
A. Pratap et al., "Maximizing Joint Data Rate and Resource Efficiency in D2D-IoT Enabled Multi-Tier Networks," Proceedings of the IEEE Conference on Local Computer Networks (LCN), pp. 177-184, Oct. 2019.
The definitive version is available at https://doi.org/10.1109/LCN44214.2019.8990781
Center for High Performance Computing Research
Keywords and Phrases
5G; D-IoT; FAP; HetNet; Stable matching
Article - Conference proceedings
© 2019. All rights reserved.
01 Oct 2019
National Science Foundation, Grant CCF-1725755