Clustered Graph Federated Personalized Learning

This paper proposes a graph federated learning approach in which multiple servers collaborate to enhance personalized learning over clustered clients that perform correlated learning tasks. In contrast to earlier approaches, which rely on a dedicated server per cluster, the proposed graph federated multitask learning (GFedMt) framework adopts a more general setting in which clients of the same cluster are distributed across servers. To address imbalanced client distributions among servers and clusters, as well as the data scarcity of isolated clients, servers perform intra-cluster and inter-cluster learning collaboratively through local interaction with neighboring servers. Clients learn their local models using the alternating direction method of multipliers (ADMM). Numerical simulations demonstrate that the proposed method achieves fast and accurate convergence when data is scarce.
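As an illustrative sketch only (not the paper's exact GFedMt algorithm), the role of ADMM in the clients' local updates can be seen in a standard single-cluster consensus-ADMM scheme for least-squares: each client solves a regularized local problem, the server averages, and dual variables enforce agreement. All variable names and the least-squares objective below are assumptions for illustration.

```python
import numpy as np

def consensus_admm(client_data, rho=1.0, iters=200):
    """Consensus ADMM for least-squares across clients (illustrative sketch).

    Each client i holds (A_i, b_i) and maintains a local model x_i;
    the server variable z plays the role of the shared (cluster) model.
    """
    d = client_data[0][0].shape[1]
    n = len(client_data)
    x = [np.zeros(d) for _ in range(n)]   # local primal variables
    u = [np.zeros(d) for _ in range(n)]   # scaled dual variables
    z = np.zeros(d)                       # server/consensus variable
    for _ in range(iters):
        # Client-side primal updates: argmin ||A_i x - b_i||^2 + (rho/2)||x - z + u_i||^2
        for i, (A, b) in enumerate(client_data):
            x[i] = np.linalg.solve(A.T @ A + rho * np.eye(d),
                                   A.T @ b + rho * (z - u[i]))
        # Server-side averaging (consensus step).
        z = np.mean([x[i] + u[i] for i in range(n)], axis=0)
        # Client-side dual updates.
        for i in range(n):
            u[i] += x[i] - z
    return z
```

In the paper's setting this consensus step would be replaced by the intra-cluster and inter-cluster exchanges among neighboring servers, but the client-side ADMM structure (primal update, aggregation, dual update) is the same pattern.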
