Abstract

Federated learning is a training framework that enables multiple participants to collaboratively train a shared model while preserving data privacy. However, heterogeneity in the participants' devices and networking resources delays training and aggregation. This paper introduces a novel approach to federated learning that incorporates resource-aware clustering to address the challenges posed by the diversity of devices and networking resources among participants. Unlike static clustering approaches, the paper proposes a dynamic method for determining the optimal number of clusters using the Dunn index, which adapts to varying levels of heterogeneity among participants and ensures a responsive, customized clustering. The paper then goes beyond empirical observation by mathematically deriving the number of communication rounds required for convergence within each cluster. Further, a participant assignment mechanism ensures that devices and networking resources are allocated optimally. The approach also incorporates a master-slave technique, realized through knowledge distillation, which improves the performance of lightweight models within clusters. Finally, experiments validate the approach and compare it against the state of the art. The results demonstrate an accuracy improvement of over 3% compared to the closest competitor and a reduction in communication rounds of around 10%.
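
This record carries no implementation detail beyond the abstract, so the following is a minimal sketch of how Dunn-index-driven selection of the cluster count could work, assuming participants are summarized by numeric resource profiles (e.g., normalized compute speed and bandwidth) and grouped with k-means. Every name, parameter, and value here is a hypothetical illustration, not the authors' code.

import numpy as np
from scipy.spatial.distance import cdist
from sklearn.cluster import KMeans

def dunn_index(X, labels):
    """Dunn index: min inter-cluster distance over max intra-cluster diameter."""
    clusters = [X[labels == c] for c in np.unique(labels)]
    max_diameter = max(cdist(c, c).max() for c in clusters)
    if max_diameter == 0:
        return float("inf")  # degenerate case: every cluster is a single point
    min_separation = min(
        cdist(a, b).min()
        for i, a in enumerate(clusters)
        for b in clusters[i + 1:]
    )
    return min_separation / max_diameter

def select_num_clusters(X, k_min=2, k_max=10):
    """Pick the cluster count whose k-means partition maximizes the Dunn index."""
    best_k, best_score = k_min, -np.inf
    for k in range(k_min, min(k_max, len(X) - 1) + 1):
        labels = KMeans(n_clusters=k, n_init=10, random_state=0).fit_predict(X)
        score = dunn_index(X, labels)
        if np.isfinite(score) and score > best_score:
            best_k, best_score = k, score
    return best_k

# Hypothetical participant profiles: (normalized CPU speed, bandwidth) pairs.
profiles = np.array([[0.9, 0.8], [0.85, 0.75], [0.2, 0.3], [0.25, 0.2], [0.5, 0.55]])
print(select_num_clusters(profiles, k_min=2, k_max=4))

A higher Dunn index indicates compact, well-separated clusters, so scanning candidate values of k and keeping the best-scoring partition lets the cluster count track however much heterogeneity the current participant pool actually exhibits.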
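Likewise, the master-slave step can be pictured as standard knowledge distillation (in the style of Hinton et al.): a larger master model's temperature-softened predictions guide a lightweight slave model in the same cluster. The loss below is the generic formulation, not necessarily the paper's exact one; the temperature T and weight alpha are placeholder values.

import torch
import torch.nn.functional as F

def distillation_loss(student_logits, teacher_logits, targets, T=2.0, alpha=0.5):
    # Soft targets: KL divergence between temperature-softened distributions,
    # scaled by T^2 to keep gradient magnitudes comparable across temperatures.
    soft = F.kl_div(
        F.log_softmax(student_logits / T, dim=1),
        F.softmax(teacher_logits / T, dim=1),
        reduction="batchmean",
    ) * (T * T)
    # Hard targets: ordinary cross-entropy against the true labels.
    hard = F.cross_entropy(student_logits, targets)
    return alpha * soft + (1 - alpha) * hard

# Example usage inside a cluster's local training step (shapes: batch x classes).
student_logits = torch.randn(8, 10, requires_grad=True)
teacher_logits = torch.randn(8, 10)
targets = torch.randint(0, 10, (8,))
loss = distillation_loss(student_logits, teacher_logits, targets)
loss.backward()

In a clustered federated setting, a loss of this form would be applied during the slave model's local updates, letting resource-constrained devices benefit from the master's capacity without having to train or transmit the full model.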

Department(s)

Computer Science

Publication Status

Early Access

Keywords and Phrases

Adaptation models; Computational modeling; Federated learning; Heterogeneity; Master-slave technique; Mathematical models; Performance evaluation; Resource-aware clustering; Servers; Training

International Standard Serial Number (ISSN)

1558-2183; 1045-9219

Document Type

Article - Journal

Document Version

Citation

File Type

text

Language(s)

English

Rights

© 2024 Institute of Electrical and Electronics Engineers, All rights reserved.

Publication Date

01 Jan 2024
