Abstract:
Federated learning (FL) is a distributed machine learning framework that enables the training of shared models without sharing local data. However, FL faces challenges in heterogeneous Internet of Things (IoT) environments, including communication bottlenecks, staleness, and non-independent and identically distributed (non-IID) data. To tackle these challenges, we present ASAFL, an Asynchronous Federated Learning framework with an Adaptive Scheduling Strategy. First, we quantify the potential contribution of client models relative to the server model. Second, we design a client upload strategy that avoids uploading redundant, low-contribution models. Finally, we propose a contribution-based server model update method to counter the model divergence caused by staleness and non-IID data. Furthermore, our theoretical analysis guarantees ASAFL's convergence, and experiments show that it reduces communication overhead by over 70% compared to traditional asynchronous FL.
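The three steps in the abstract (contribution scoring, gated client uploads, and staleness-aware server updates) can be sketched as follows. This is a minimal illustration, not the paper's method: the contribution metric (update norm relative to the server model), the upload threshold, and the staleness-discounted mixing weight are all assumed stand-ins for ASAFL's actual formulas, which the abstract does not specify.

```python
import numpy as np

def contribution(client_model: np.ndarray, server_model: np.ndarray) -> float:
    # Hypothetical contribution score: how far the client's model has
    # moved from the current server model. ASAFL's exact metric is not
    # given in the abstract.
    return float(np.linalg.norm(client_model - server_model))

def should_upload(client_model: np.ndarray, server_model: np.ndarray,
                  threshold: float = 0.1) -> bool:
    # Client-side gate: skip uploading redundant, low-contribution models
    # to save communication.
    return contribution(client_model, server_model) > threshold

def server_update(server_model: np.ndarray, client_model: np.ndarray,
                  staleness: int, base_lr: float = 0.5) -> np.ndarray:
    # Assumed staleness-discounted mixing: older (more stale) client
    # models receive a smaller aggregation weight.
    weight = base_lr / (1.0 + staleness)
    return (1.0 - weight) * server_model + weight * client_model
```

For example, a client model that differs from the server model by a unit step in each coordinate and arrives with staleness 1 would be mixed in with weight 0.25 under these assumed settings.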
Source: INFORMATION SCIENCES
ISSN: 0020-0255
Year: 2024
Volume: 689
Impact Factor: 8.100 (JCR@2022)
Cited Count:
WoS CC Cited Count: 1
SCOPUS Cited Count: 2
ESI Highly Cited Papers on the List: 0