Mobility-Aware Caching in D2D Networks


Bibliographic Details
Published in: IEEE Transactions on Wireless Communications, Vol. 16, No. 8, pp. 5001-5015
Main Authors: Rui Wang, Jun Zhang, S. H. Song, Khaled B. Letaief
Format: Journal Article
Language: English
Published: New York: The Institute of Electrical and Electronics Engineers, Inc. (IEEE), 01.08.2017

Summary: Caching at mobile devices can facilitate device-to-device (D2D) communications, which may significantly improve spectrum efficiency and alleviate the heavy burden on backhaul links. However, most previous works ignored user mobility, limiting their practical applicability. In this paper, we exploit the user mobility pattern, characterized by the inter-contact times between different users, and propose a mobility-aware caching placement strategy to maximize the data offloading ratio, which is defined as the percentage of the requested data that can be delivered via D2D links rather than through base stations. Given that the caching placement problem is NP-hard, we first propose an optimal dynamic programming algorithm to obtain a performance benchmark with much lower complexity than exhaustive search. We then prove that the problem falls in the category of monotone submodular maximization over a matroid constraint, and propose a time-efficient greedy algorithm, which achieves an approximation ratio of 1/2. Simulation results with real-life data sets validate the effectiveness of our proposed mobility-aware caching placement strategy. We observe that users moving at either a very low or very high speed should cache the most popular files, while users moving at a medium speed should cache less popular files to avoid duplication.
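The greedy approach described in the summary can be sketched in code. The snippet below is a minimal illustration, not the paper's exact algorithm: it assumes exponential inter-contact times (so the probability of no contact within a deadline T is exp(-λT)), a hypothetical `offloading_ratio` objective, and a per-device cache capacity, which forms a partition matroid. Greedily adding the (device, file) pair with the largest marginal gain is the standard 1/2-approximation for monotone submodular maximization under a matroid constraint.

```python
import math
from itertools import product

def offloading_ratio(placement, popularity, contact_rate, deadline):
    """Expected fraction of requests served via D2D links, under a
    hypothetical model: inter-contact times are exponential with rate
    contact_rate[u][v], and self-cached files are always hits."""
    n_users = len(contact_rate)
    total = 0.0
    for u in range(n_users):
        for f, p_f in enumerate(popularity):
            # Probability that no helper caching file f meets user u in time.
            miss = 1.0
            for v in range(n_users):
                if f in placement[v]:
                    if v == u:
                        miss = 0.0  # user caches the file itself
                        break
                    miss *= math.exp(-contact_rate[u][v] * deadline)
            total += p_f * (1.0 - miss)
    return total / n_users

def greedy_placement(n_users, n_files, capacity, popularity,
                     contact_rate, deadline):
    """Greedy maximization over a partition matroid: each device stores at
    most `capacity` files. Repeatedly add the (device, file) pair with the
    largest marginal gain in the offloading ratio."""
    placement = [set() for _ in range(n_users)]
    base = offloading_ratio(placement, popularity, contact_rate, deadline)
    while True:
        best_gain, best_move = 0.0, None
        for v, f in product(range(n_users), range(n_files)):
            if len(placement[v]) >= capacity or f in placement[v]:
                continue
            placement[v].add(f)
            gain = offloading_ratio(placement, popularity,
                                    contact_rate, deadline) - base
            placement[v].remove(f)
            if gain > best_gain:
                best_gain, best_move = gain, (v, f)
        if best_move is None:
            break  # no feasible addition improves the objective
        v, f = best_move
        placement[v].add(f)
        base += best_gain
    return placement, base
```

With a Zipf-like popularity vector and pairwise contact rates, the sketch reproduces the qualitative behavior the paper reports: devices with weak contact rates tend to self-cache the most popular files, while well-connected devices diversify to avoid duplication.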
ISSN: 1536-1276
eISSN: 1558-2248
DOI: 10.1109/TWC.2017.2705038