Abstract
This paper proposes a novel selective federated learning (FL) algorithm called fittest aggregation and slotted training (FedFaSt). It relies on a 'free-for-all' client training process to score clients' efficiency and applies the 'natural selection' principle to elect the fittest clients for the FL training and aggregation processes. Using a combined data-quality and training-performance metric to score clients, FedFaSt implements a slotted training model in which teams of the fittest clients participate in the training and aggregation processes for a fixed number of successive rounds, called slots. Performance validation on X-ray datasets reveals that FedFaSt outperforms baseline and selective federated learning algorithms such as FedAVG, FedRand, and FedPow in terms of accuracy, convergence to the global optimum, time complexity, and robustness against attacks.
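To make the client-election and slotted-training mechanics described above concrete, the following is a minimal Python sketch of the flow: score all clients after a 'free-for-all' round, elect a team of the fittest, and let only that team train and aggregate for a fixed number of rounds (a slot). The fitness formula (a weighted sum with weight ALPHA), the simulated local training step, and the FedAVG-style team aggregation are illustrative assumptions for this sketch, not the paper's exact definitions.

import numpy as np

rng = np.random.default_rng(0)

NUM_CLIENTS, DIM = 10, 5
TEAM_SIZE, SLOT_LEN, NUM_SLOTS = 4, 5, 3
ALPHA = 0.5  # hypothetical weight between data quality and training performance

# Each client is simulated by a data-quality proxy, a local optimum of a
# quadratic loss, and a local dataset size.
data_quality = rng.uniform(0.3, 1.0, NUM_CLIENTS)
local_optima = rng.normal(0.0, 1.0, (NUM_CLIENTS, DIM))
client_sizes = rng.integers(50, 500, NUM_CLIENTS)

def local_train(global_w, k, lr=0.3):
    """One simulated local training pass for client k: a gradient step toward
    its local optimum; returns the updated weights and an accuracy proxy."""
    new_w = global_w - lr * (global_w - local_optima[k])
    accuracy = 1.0 / (1.0 + np.linalg.norm(new_w - local_optima[k]))
    return new_w, accuracy

def aggregate(weights, sizes):
    """FedAVG-style aggregation weighted by local dataset size."""
    return np.average(weights, axis=0, weights=np.asarray(sizes, dtype=float))

global_w = np.zeros(DIM)
for slot in range(NUM_SLOTS):
    # 'Free-for-all' round: every client trains once and is scored.
    results = [local_train(global_w, k) for k in range(NUM_CLIENTS)]
    accuracies = np.array([acc for _, acc in results])
    fitness = ALPHA * data_quality + (1 - ALPHA) * accuracies
    team = np.argsort(fitness)[::-1][:TEAM_SIZE]  # elect the fittest clients

    # Slot: only the elected team trains and aggregates for SLOT_LEN rounds.
    for _ in range(SLOT_LEN):
        updates = [local_train(global_w, k)[0] for k in team]
        global_w = aggregate(updates, client_sizes[team])
    print(f"slot {slot}: team={team.tolist()}, ||w||={np.linalg.norm(global_w):.3f}")

Restricting aggregation to the elected team for a whole slot, rather than re-selecting every round, is what distinguishes the slotted model in this sketch from per-round selection schemes such as FedRand or FedPow.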
Recommended Citation
F. Kahenga et al., "FedFaSt: Selective Federated Learning Using Fittest Parameters Aggregation and Slotted Clients Training," Proceedings - IEEE Global Communications Conference, GLOBECOM, pp. 3879-3884, Institute of Electrical and Electronics Engineers, Jan 2023.
The definitive version is available at https://doi.org/10.1109/GLOBECOM54140.2023.10437003
Department(s)
Computer Science
Keywords and Phrases
AI for Health; FedAVG; Federated Learning; FedFaSt; FedPow; Selective Federated Learning; X-Ray Dataset
International Standard Serial Number (ISSN)
2576-6813; 2334-0983
Document Type
Article - Conference proceedings
Document Version
Citation
File Type
text
Language(s)
English
Rights
© 2025 Institute of Electrical and Electronics Engineers, All rights reserved.
Publication Date
01 Jan 2023
Comments
National Science Foundation, Grant CNS-2008878