Collaborative Inference in RIS-Assisted MEC Networks under Computing Backlog Constraints
Published in | IEEE Transactions on Communications, p. 1 |
---|---|
Main Authors | |
Format | Journal Article |
Language | English |
Published | IEEE, 2025 |
Subjects | |
ISSN | 0090-6778 1558-0857 |
DOI | 10.1109/TCOMM.2025.3564727 |
Summary: | In this paper, we analyze collaborative inference in a mobile edge computing (MEC) network aided by a reconfigurable intelligent surface (RIS). In particular, we consider multiple user equipments (UEs) with collaborative inference tasks that require the execution of deep neural networks (DNNs). The goal is to minimize the long-term average energy consumption subject to a long-term average computing queue backlog constraint. We first transform the considered problem into a Lyapunov optimization problem and then propose a deep reinforcement learning (DRL)-based algorithm to solve it. An optimization subroutine is embedded in the proposed algorithm to directly obtain the optimal RIS coefficients, while the UEs' DNN partition decisions and the computational resource allocation at the MEC server are obtained from the DRL-based algorithm. Numerical results show that the proposed algorithm solves the problem efficiently and that the introduced RIS significantly reduces the long-term average energy consumption. Furthermore, it is demonstrated that system parameters (such as the communication bandwidth and the maximum CPU frequency at the MEC server) can have a significant impact on the energy consumption and computing backlog levels. |
---|---|
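The summary above states that the long-term energy-minimization problem with a backlog constraint is first transformed into a Lyapunov optimization problem before a DRL agent makes the per-slot decisions. As a rough illustration of that general technique only (not the paper's exact formulation or notation), the sketch below shows a standard drift-plus-penalty step: a computing-backlog queue is updated each slot, and the per-slot objective trades energy against queue drift through a weight V. All names (q, arrivals, service, energy, V) are illustrative placeholders.

```python
# Minimal sketch of a generic Lyapunov drift-plus-penalty step, assuming one
# computing-backlog queue per UE. Symbols are placeholders, not the paper's notation.

def update_backlog(q: float, arrivals: float, service: float) -> float:
    """Standard queue update: backlog shrinks with served computation, grows with new demand."""
    return max(q - service, 0.0) + arrivals

def drift_plus_penalty(q: list[float], energy: float, arrivals: list[float],
                       service: list[float], V: float) -> float:
    """Per-slot objective: V * energy + sum_i q_i * (arrivals_i - service_i).
    Minimizing this each slot (over partition, RIS, and CPU-allocation decisions)
    trades off average energy against keeping the average backlog bounded."""
    drift = sum(qi * (a - s) for qi, a, s in zip(q, arrivals, service))
    return V * energy + drift

# Example: two UEs, one decision slot.
q = [5.0, 2.0]          # current computing backlogs (e.g., in CPU cycles)
arrivals = [1.0, 0.5]   # new computation demand this slot
service = [1.5, 0.5]    # computation completed this slot
energy = 0.8            # total energy of the chosen actions (J)
V = 10.0                # energy/backlog trade-off weight

objective = drift_plus_penalty(q, energy, arrivals, service, V)
q = [update_backlog(qi, a, s) for qi, a, s in zip(q, arrivals, service)]
```

In the paper's setup, the DRL agent would evaluate candidate partition and resource-allocation actions against an objective of this drift-plus-penalty form, while the RIS coefficients are obtained by the embedded optimization subroutine; the snippet only illustrates the bookkeeping, not that decision logic.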