Abstract:
Federated learning enables participating devices to learn a shared model. In real-world scenarios, however, participating devices exhibit significant heterogeneity in data distribution, computational resources, and communication resources. This often leads to ineffective use of the devices' local updates in each iteration, and ultimately to a substantial decline in the convergence rate and speed of federated learning. To address the variability in data distribution and communication efficiency among participating devices, a Clustering-based Adaptive Weight Selection-Aggregation framework (C-AW) is introduced. The framework dynamically clusters the participating devices, adjusts the relative weights between the clusters, and selects the client devices that participate in subsequent rounds of federated learning based on those cluster weights. The effectiveness of C-AW has been validated experimentally: when training deep neural networks on the CIFAR-10 dataset, C-AW outperforms the Federated Averaging algorithm (FedAvg) as well as other selection and weighting methods in both training time and accuracy. The experimental results confirm that the proposed C-AW framework can effectively improve the efficiency and accuracy of federated learning in heterogeneous environments. © 2024 IEEE.
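The abstract does not specify C-AW's selection rule, only that clients are chosen for the next round in proportion to their cluster's weight. A minimal Python sketch of such cluster-weighted client sampling follows; the function and argument names (`select_clients`, `cluster_of`, `cluster_weight`) are hypothetical, and the actual clustering and weight-update steps of the paper are not reproduced here.

```python
import random


def select_clients(cluster_of, cluster_weight, k, rng=None):
    """Sample k distinct clients, each with probability proportional
    to the weight of the cluster it belongs to.

    cluster_of:     dict mapping client id -> cluster id
    cluster_weight: dict mapping cluster id -> nonnegative weight
    k:              number of clients to select for the next round
    """
    rng = rng or random.Random()
    clients = list(cluster_of)
    # Each client inherits its cluster's weight.
    pool = [(c, cluster_weight[cluster_of[c]]) for c in clients]
    chosen = []
    # Weighted sampling without replacement: draw one client at a
    # time, then remove it from the pool.
    for _ in range(min(k, len(pool))):
        total = sum(w for _, w in pool)
        r = rng.random() * total
        acc = 0.0
        for i, (c, w) in enumerate(pool):
            acc += w
            if r <= acc:
                chosen.append(c)
                pool.pop(i)
                break
    return chosen
```

With cluster weights {'A': 3.0, 'B': 1.0}, clients in cluster A are three times as likely to be drawn as clients in cluster B, so higher-weight clusters contribute more participants per round.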
Year: 2024
Page: 406-409
Language: English