Federated Learning with Mutually Cooperating Devices: A Consensus Approach Towards Server-Less Model Optimization

Federated learning (FL) is emerging as a new paradigm for training machine learning models over cooperative networks. The model parameters are optimized collectively by large populations of interconnected devices that act as cooperative learners, exchanging local model updates with the server rather than user data. The FL framework is, however, centralized: it relies on the server to fuse the model updates and is therefore limited by a single point of failure. In this paper we propose a distributed FL approach that performs a decentralized fusion of the local model parameters, leveraging mutual cooperation between the devices and local (in-network) data operations via consensus-based methods. Communication with the server can be partially or fully replaced by in-network operations, reducing the traffic load on the server and paving the way towards a fully server-less FL approach. This proposal also lays the groundwork for the integration of FL methods into future (beyond-5G) wireless networks characterized by distributed and decentralized connectivity. The proposed algorithms are designed and validated on experimental data, and their implementation is published as open source.
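The following is a minimal, self-contained sketch of the consensus-based fusion idea summarized above: each device runs a few local gradient steps on its own data and then mixes its model parameters with those of its neighbours, with no server involved. The synthetic linear-regression task, the ring topology, and the constants (MIXING_EPS, LR, LOCAL_STEPS) are illustrative assumptions, not the paper's actual algorithm or settings.

```python
# Minimal sketch of consensus-based model fusion (illustrative only).
# Assumes a synthetic linear model, plain SGD, and a fixed mixing step;
# these choices are placeholders, not the paper's exact algorithm.
import numpy as np

rng = np.random.default_rng(0)

NUM_DEVICES = 8      # cooperating learners (assumed)
DIM = 5              # model size (assumed)
LOCAL_STEPS = 5      # SGD steps per round (assumed)
MIXING_EPS = 0.2     # consensus step size, small enough for stable mixing
LR = 0.05            # SGD learning rate (assumed)

# Synthetic local datasets: each device observes noisy samples of the same
# ground-truth linear model (stand-in for private, non-shared user data).
w_true = rng.normal(size=DIM)
data = []
for _ in range(NUM_DEVICES):
    X = rng.normal(size=(50, DIM))
    y = X @ w_true + 0.1 * rng.normal(size=50)
    data.append((X, y))

# Ring topology: device k exchanges parameters only with its two neighbours.
neighbors = {k: [(k - 1) % NUM_DEVICES, (k + 1) % NUM_DEVICES]
             for k in range(NUM_DEVICES)}

# One local model copy per device.
models = [np.zeros(DIM) for _ in range(NUM_DEVICES)]

def local_sgd(w, X, y):
    """Run a few plain gradient steps on the device's own data."""
    for _ in range(LOCAL_STEPS):
        grad = 2.0 * X.T @ (X @ w - y) / len(y)
        w = w - LR * grad
    return w

for _ in range(30):
    # 1) Local update: each device trains on its private data only.
    models = [local_sgd(models[k], *data[k]) for k in range(NUM_DEVICES)]
    # 2) Consensus fusion: mix parameters with neighbours instead of a server.
    mixed = []
    for k in range(NUM_DEVICES):
        w = models[k].copy()
        for j in neighbors[k]:
            w += MIXING_EPS * (models[j] - models[k])
        mixed.append(w)
    models = mixed

disagreement = max(np.linalg.norm(m - models[0]) for m in models)
print(f"error vs. ground truth: {np.linalg.norm(models[0] - w_true):.3f}, "
      f"network disagreement: {disagreement:.3f}")
```

As training proceeds, the local models both fit the data and converge towards one another, which is the behaviour the consensus step is meant to enforce in place of server-side aggregation.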
