FedDelta: Incremental federated learning for heterogeneous data using dynamic leader election.
Publisher
Universidad de Málaga
Abstract
Federated learning enables collaborative model training across multiple devices or organizations without sharing raw data, thereby addressing privacy and data ownership concerns. However, most existing federated approaches rely on centralized coordination and often struggle to maintain robustness and convergence stability in scenarios with heterogeneous or unbalanced data. In this work, we propose FedDelta, a federated multivariable linear regression method that achieves strong performance across varying data quantities and distributions, including scenarios with unbalanced data across participants. FedDelta employs a closed-form ridge regression solution and a decentralized communication scheme, in which participating peers dynamically assume coordination roles to aggregate model updates efficiently, eliminating the need for a central server. We evaluate FedDelta on real-world datasets and show that it achieves competitive accuracy and robustness compared to centralized and traditional federated methods. Our results highlight FedDelta’s potential for privacy-preserving, scalable learning in resource-constrained and distributed environments.
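The abstract's core idea, peers exchanging only model aggregates and an elected leader solving a closed-form ridge regression, can be sketched as follows. This is an illustrative reconstruction under stated assumptions, not the paper's implementation: the function names, the regularization parameter `lam`, and the use of summed sufficient statistics (XᵀX, Xᵀy) as the shared "model updates" are all assumptions.

```python
import numpy as np

def local_stats(X, y):
    # Assumed peer-side step: each participant computes sufficient
    # statistics on its private data; only these aggregates (never
    # raw samples) would be communicated.
    return X.T @ X, X.T @ y

def aggregate_and_solve(stats, lam=1.0):
    # Assumed leader-side step: the dynamically elected coordinator
    # sums the peers' statistics and solves the closed-form ridge
    # regression  w = (sum XᵀX + lam·I)⁻¹ sum Xᵀy.
    XtX = sum(s[0] for s in stats)
    Xty = sum(s[1] for s in stats)
    d = XtX.shape[0]
    return np.linalg.solve(XtX + lam * np.eye(d), Xty)

# Toy example with unbalanced data across two peers (100 vs. 10 rows).
rng = np.random.default_rng(0)
w_true = np.array([2.0, -1.0, 0.5])
X1 = rng.normal(size=(100, 3)); y1 = X1 @ w_true
X2 = rng.normal(size=(10, 3));  y2 = X2 @ w_true
w = aggregate_and_solve([local_stats(X1, y1), local_stats(X2, y2)], lam=1e-6)
```

Because the summed statistics are identical to those of the pooled dataset, the recovered weights match a centralized ridge fit regardless of how unevenly the data is split, which is consistent with the robustness to unbalanced data claimed in the abstract.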
Bibliographic citation
García-Luque, R., Pimentel, E., Durán, F., Iroslavov, I., & Carreira, E. R. (2026). FedDelta: Incremental federated learning for heterogeneous data using dynamic leader election. Manuscript submitted for publication.
Creative Commons license
Except where otherwise noted, this item's license is described as Attribution-NonCommercial-NoDerivatives 4.0 International