Federated multi-task learning with cross-device heterogeneous task subsets
| Published in | Journal of Parallel and Distributed Computing, Vol. 205, p. 105155 |
|---|---|
| Main Authors | , , , , |
| Format | Journal Article |
| Language | English |
| Published | Elsevier Inc., 01.11.2025 |
Summary: Traditional Federated Learning (FL) predominantly focuses on task-consistent scenarios, assuming clients possess identical tasks or task sets. However, in multi-task scenarios, client task sets can vary greatly due to their operating environments, available resources, and hardware configurations. Conventional task-consistent FL cannot address such heterogeneity effectively. We define this statistical heterogeneity of task sets, where each client performs a unique subset of server tasks, as cross-device task heterogeneity. In this work, we propose a novel Federated Partial Multi-task (FedPMT) method, allowing clients with diverse task sets to collaborate and train comprehensive models suitable for any task subset. Specifically, clients deploy partial multi-task models tailored to their localized task sets, while the server utilizes single-task models as an intermediate stage to address the model heterogeneity arising from differing task sets. Collaborative training is facilitated through bidirectional transformations between them. To alleviate the negative transfer caused by task set disparities, we introduce task attenuation factors to modulate the influence of different tasks. This adjustment enhances the performance and task generalization ability of cloud models, promoting models to converge towards a shared optimum across all task subsets. Extensive experiments conducted on the NYUD-v2, PASCAL Context, and Cityscapes datasets validate the effectiveness and superiority of FedPMT.
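The record does not include the paper's equations or pseudocode, but the mechanism the summary describes, per-task attenuation factors modulating how much each client's partial task set contributes when the server forms single-task cloud models, can be illustrated with a minimal sketch. Everything below (the function name, the toy task list, the attenuation values, the weighted-average rule) is an illustrative assumption, not the authors' implementation.

```python
# Minimal, hypothetical sketch of task-attenuated aggregation in the spirit of FedPMT.
# All names and the weighting rule are illustrative assumptions, not the paper's code.
from collections import defaultdict

import numpy as np


def aggregate_task_heads(client_updates, client_task_sets, attenuation):
    """Average per-task head parameters across clients.

    client_updates   : dict client_id -> {task_name: np.ndarray of head params}
    client_task_sets : dict client_id -> set of tasks the client trains locally
    attenuation      : dict task_name -> factor in (0, 1] that modulates a
                       task's influence to limit negative transfer
    Returns one aggregated parameter vector per task (a "single-task" stage).
    """
    sums, weights = defaultdict(float), defaultdict(float)
    for cid, update in client_updates.items():
        for task, params in update.items():
            if task not in client_task_sets[cid]:
                continue  # this client never trained that task head
            w = attenuation[task]
            sums[task] = sums[task] + w * params
            weights[task] += w
    return {task: sums[task] / weights[task] for task in sums if weights[task] > 0}


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    # Two clients holding different partial subsets of the server's task set.
    client_task_sets = {0: {"segmentation", "depth"}, 1: {"depth", "normals"}}
    client_updates = {
        cid: {t: rng.normal(size=4) for t in tasks}
        for cid, tasks in client_task_sets.items()
    }
    attenuation = {"segmentation": 1.0, "depth": 0.8, "normals": 1.0}  # toy factors
    cloud_heads = aggregate_task_heads(client_updates, client_task_sets, attenuation)
    print({t: np.round(v, 2) for t, v in cloud_heads.items()})
```

In the toy run, only the depth head is shared by both clients; the attenuation factors are placeholders where the paper's own per-task values would be used.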
Highlights:
• We define cross-device task heterogeneity, a collaborative scenario for federated clients with heterogeneous task sets.
• We propose FedPMT, which enables clients with diverse task sets to collaboratively train cloud models, with proven convergence.
• Local and cloud models use different architectures suited to their respective task sets, collaborating through model alignment.
• We introduce task attenuation factors to promote task collaboration, enabling cloud models to converge to shared optima.
• Extensive experiments verify the effectiveness of our method across various heterogeneous task set scenarios.
ISSN: 0743-7315
DOI: 10.1016/j.jpdc.2025.105155