Abstract:
To handle the large-scale, diverse, and time-evolving data and the machine learning tasks that arise in industrial production processes, this paper proposes a Federated Incremental Learning (FIL) and optimization method based on information entropy. Within the federated framework, each local computing node trains the model on its local data and computes the average entropy, which is transmitted to the server to help identify class-incremental tasks. Based on the reported average entropies, the global server selects the local nodes that participate in the current training round and decides whether a class-incremental task has occurred, followed by global model deployment and aggregation updates. The proposed method combines average entropy with thresholds for node selection in various situations, achieving stable model learning when the average entropy is low and incremental model expansion when it is high. In addition, convex optimization is employed to adaptively adjust the aggregation frequency and resource allocation in resource-constrained scenarios, ultimately achieving effective model convergence. Simulation results demonstrate that the proposed method accelerates model convergence and improves training accuracy in different scenarios. © 2024 Science Press. All rights reserved.
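The abstract describes the entropy-driven node selection and incremental-task decision only at a high level; the sketch below illustrates one plausible reading of that logic. The threshold tau, the majority rule, and all function and parameter names are assumptions for illustration, not details taken from the paper.

```python
import numpy as np

def average_prediction_entropy(probs: np.ndarray) -> float:
    """Mean Shannon entropy of a node's softmax outputs over its local data.

    probs: array of shape (num_samples, num_classes), rows summing to 1.
    """
    eps = 1e-12  # avoid log(0)
    sample_entropy = -np.sum(probs * np.log(probs + eps), axis=1)
    return float(sample_entropy.mean())

def server_round(node_entropies: dict, tau: float, num_selected: int):
    """Illustrative server-side decision (threshold and rule are assumed).

    - If many nodes report average entropy above tau, treat the round as a
      class-incremental task and favor the high-entropy nodes, which likely
      hold data from new classes.
    - Otherwise, select low-entropy nodes for stable aggregation.
    """
    high = [n for n, h in node_entropies.items() if h > tau]
    incremental = len(high) > len(node_entropies) // 2  # simple majority rule (assumed)
    if incremental:
        selected = high[:num_selected]
    else:
        selected = sorted(node_entropies, key=node_entropies.get)[:num_selected]
    return incremental, selected

# Example: three nodes report their average entropies to the server.
is_incremental, chosen = server_round({"A": 0.2, "B": 1.5, "C": 0.3}, tau=1.0, num_selected=2)
```

In this reading, low average entropy triggers ordinary federated aggregation over confident nodes, while high average entropy signals unfamiliar classes and triggers model expansion, matching the two regimes described in the abstract.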
Source:
Journal of Electronics and Information Technology
ISSN: 1009-5896
Year: 2024
Issue: 8
Volume: 46
Page: 3146-3154
SCOPUS Cited Count: 1