Federated Split Learning for Distributed Intelligence with Resource-Constrained Devices

Bibliographic Details
Published in: 2024 IEEE International Conference on Communications Workshops (ICC Workshops), pp. 798-803
Main Authors: Ao, Huiqing; Tian, Hui; Ni, Wanli
Format: Conference Proceeding
Language: English
Published: IEEE, 09.06.2024

Summary: As a distributed machine learning paradigm, federated learning usually requires all edge devices to collaboratively train a large-scale artificial intelligence model locally. However, this imposes challenges for resource-constrained Internet of Things (IoT) devices. Moreover, the communication overhead between IoT devices and the base station is substantial for emerging big-model-based tasks. In this paper, we propose a novel framework called federated split learning (FedSL), which accounts for the heterogeneity and resource scarcity of IoT devices. To reduce the training delay and energy consumption in resource-constrained wireless networks, we formulate a mixed-integer non-linear programming problem that jointly optimizes power allocation, device scheduling, and split-layer selection. We then design an alternating optimization algorithm that solves the formulated problem with low computational complexity. Simulation results demonstrate that the FedSL framework outperforms state-of-the-art benchmarks, highlighting the importance and superiority of device scheduling in resource-constrained IoT networks.
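The abstract's alternating optimization idea can be illustrated with a toy sketch: fix one block of variables, minimize over the other, and repeat. Everything below (the cost weights, the activation-size profile `ACT_SIZE`, the power grid `POWERS`) is a hypothetical stand-in for a single device, not the paper's actual system model or algorithm; it only shows the alternating structure over a discrete split-layer choice and a discrete power level.

```python
# Toy alternating optimization over (split layer, transmit power).
# All constants and the cost function are illustrative assumptions,
# not taken from the paper.

ACT_SIZE = [4.0, 2.0, 1.0, 2.0, 4.0]  # hypothetical activation size at each candidate split
POWERS = [0.1, 0.2, 0.5, 1.0]         # hypothetical discrete transmit-power levels

def cost(split, power):
    """Toy delay-plus-energy objective for one device."""
    compute = 0.4 * split                    # on-device compute delay grows with split depth
    comm = ACT_SIZE[split] / (1.0 + power)   # uplink delay shrinks with transmit power
    energy = 2.0 * power                     # energy penalty for higher power
    return compute + comm + energy

def alternating_opt(iters=10):
    split, power = 0, POWERS[0]              # arbitrary starting point
    for _ in range(iters):
        # Subproblem 1: fix power, pick the best split layer (discrete search).
        split = min(range(len(ACT_SIZE)), key=lambda s: cost(s, power))
        # Subproblem 2: fix the split, pick the best power level (grid search).
        power = min(POWERS, key=lambda p: cost(split, p))
    return split, power, cost(split, power)
```

Each step is a coordinate-wise minimization, so the objective never increases across iterations; that monotonicity is the usual sense in which such low-complexity alternating schemes converge on mixed-integer problems, with the discrete subproblems handled by enumeration.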
ISSN: 2694-2941
DOI: 10.1109/ICCWorkshops59551.2024.10615309