Abstract

This paper proposes FedFaSt (Fittest Aggregation and Slotted Training), a novel selective federated learning (FL) algorithm. It relies on a 'free-for-all' client training phase to score each client's efficiency and applies the 'natural selection' principle to elect the fittest clients for the FL training and aggregation processes. Using a combined data-quality and training-performance metric to score clients, FedFaSt implements a slotted training model in which teams of the fittest clients participate in training and aggregation for a fixed number of successive rounds, called slots. Performance validation on X-ray datasets shows that FedFaSt outperforms federated learning algorithms such as FedAVG, FedRand, and FedPow in terms of accuracy, convergence to the global optimum, time complexity, and robustness against attacks.
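The abstract's selection-and-slotting loop can be sketched roughly as follows. This is an illustrative reconstruction, not the paper's actual implementation: the scoring formula (an assumed equal-weight sum of data quality and training performance), the slot length, the team size `k`, and all function names are assumptions introduced here for clarity.

```python
def score(client):
    # Assumed combined metric: equal-weight sum of data quality and
    # training performance (the paper's actual formula may differ).
    return 0.5 * client["data_quality"] + 0.5 * client["train_perf"]

def select_fittest(clients, k):
    # "Natural selection": elect the k highest-scoring clients as the team.
    return sorted(clients, key=score, reverse=True)[:k]

def fedavg(updates):
    # Coordinate-wise averaging of the team's model updates.
    n = len(updates)
    return [sum(w) / n for w in zip(*updates)]

def fedfast_sketch(clients, rounds, slot_len, k, local_train):
    """Slotted training: a team of fittest clients is re-elected every
    slot_len rounds and trains/aggregates for the whole slot."""
    global_model = [0.0, 0.0]  # toy 2-parameter model
    team = []
    for r in range(rounds):
        if r % slot_len == 0:
            # New slot: re-score clients and elect a fresh fittest team.
            team = select_fittest(clients, k)
        updates = [local_train(global_model, c) for c in team]
        global_model = fedavg(updates)
    return global_model
```

A minimal usage example: with four clients whose (hypothetical) scores increase with their index, only the top-`k` clients ever contribute updates during a slot, which is the behavior that distinguishes this scheme from plain FedAVG over all clients.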

Department(s)

Computer Science

Comments

National Science Foundation, Grant CNS-2008878

Keywords and Phrases

AI for Health; FedAVG; Federated Learning; FedFaSt; FedPow; Selective Federated Learning; X-Ray Dataset

International Standard Serial Number (ISSN)

2576-6813; 2334-0983

Document Type

Article - Conference proceedings

Document Version

Citation

File Type

text

Language(s)

English

Rights

© 2025 Institute of Electrical and Electronics Engineers, All rights reserved.

Publication Date

01 Jan 2023
