Partially Collaborative Edge Caching based on Federated Deep Reinforcement Learning

Bibliographic Details
Published in: IEEE Transactions on Vehicular Technology, Vol. 72, No. 1, pp. 1-6
Main Authors: Lei, Meng; Li, Qiang; Ge, Xiaohu; Pandharipande, Ashish
Format: Journal Article
Language: English
Published: New York: The Institute of Electrical and Electronics Engineers, Inc. (IEEE), 01.01.2023

More Information
Summary: In this paper, edge caching is investigated in fog radio access networks subject to dynamic content popularity. To improve the long-term cache-hit ratio (CHR), a federated deep reinforcement learning (FDRL) framework is proposed, in which multiple fog access points (FAPs) adjust their caching strategies under the coordination of a central server (CS). Owing to the non-i.i.d. data acquired over different FAPs, the traditional federated learning (FL) method may suffer performance loss from maintaining a single global model at the CS. To address this shortcoming, a partially collaborative caching (PCC) algorithm is proposed that switches training between two models maintained at each FAP, balancing users' specific local characteristics against holistic global characteristics. Experiments on a real-world dataset demonstrate that the proposed FDRL achieves significant CHR gains. Furthermore, with an appropriate switching factor, the proposed PCC algorithm outperforms FL with full collaboration among FAPs.
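The abstract's core idea, each FAP switching training between a personalized local model and a server-aggregated global model according to a switching factor, can be sketched as follows. This is a minimal illustrative sketch, not the paper's algorithm: the DRL caching agents are replaced by stand-in scalar parameters and gradients, the names `pcc_round` and `alpha` are assumptions, and aggregation is a plain FedAvg-style mean.

```python
import random

def pcc_round(fap_local, fap_global, grads, alpha, rng, lr=0.1):
    """One hypothetical federated round over all FAPs.

    fap_local / fap_global: per-FAP scalar model parameters (stand-ins
    for the two models each FAP maintains in the PCC scheme).
    grads: per-FAP scalar gradients for this round (stand-ins).
    alpha: switching factor -- probability of training the global copy.
    """
    for i, g in enumerate(grads):
        if rng.random() < alpha:      # collaborate: train the global copy
            fap_global[i] -= lr * g
        else:                         # personalize: train the local model
            fap_local[i] -= lr * g
    # The central server aggregates only the global copies (FedAvg-style
    # mean), so local models keep their FAP-specific characteristics.
    avg = sum(fap_global) / len(fap_global)
    fap_global[:] = [avg] * len(fap_global)
    return fap_local, fap_global

rng = random.Random(0)
local = [0.0, 0.0, 0.0]      # one personalized model per FAP
glob = [0.0, 0.0, 0.0]       # one global-model copy per FAP
for _ in range(5):
    grads = [rng.uniform(-1.0, 1.0) for _ in local]
    local, glob = pcc_round(local, glob, grads, alpha=0.5, rng=rng)
# After aggregation all global copies agree, while local models diverge.
assert len(set(glob)) == 1
```

With `alpha = 1` every update flows into the aggregated model (full collaboration, as in standard FL); with `alpha = 0` each FAP trains only its local model. The paper's reported result is that an intermediate switching factor outperforms both extremes under non-i.i.d. data.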
ISSN: 0018-9545, 1939-9359
DOI: 10.1109/TVT.2022.3206876