MLMG: Multi-Local and Multi-Global Model Aggregation for Federated Learning

Federated learning has attracted considerable attention as a way to collaboratively learn a prediction model without sharing users' training data. Existing federated learning approaches usually train a single independent local model for each client on that client's privacy-sensitive data, and then use a single centralized global model to aggregate the parameters trained by the participating clients. However, given the diverse characteristics of local data and the heterogeneity across participating clients, this conventional paradigm may not achieve uniformly good performance over all users. In this work, we propose a novel federated learning mechanism, Multi-Local and Multi-Global (MLMG) model aggregation, which trains on non-IID user data with clustering methods. A matching algorithm is then introduced to derive the appropriate exchanges between local models and global models. The new mechanism separates data and users with different characteristics, which makes it easier to capture the heterogeneity of data distributions across users. We evaluate the proposal with a recent on-device neural network for anomaly detection, and experimental results on several benchmark datasets show better detection accuracy (up to a 2.83% improvement) than a conventional federated learning approach.
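The abstract does not specify the exact clustering or matching procedures, so the following is only a minimal sketch of the idea, assuming k-means clustering over flattened client parameters as the "matching" step and FedAvg-style averaging inside each cluster to form the multiple global models; the function name `mlmg_round` and its arguments are hypothetical, not the paper's actual API.

```python
import numpy as np
from sklearn.cluster import KMeans

def mlmg_round(client_weights, num_globals):
    """One illustrative aggregation round: cluster client models and
    average the weights within each cluster to form multiple global models."""
    # Flatten each client's parameter tensor into a single feature vector.
    X = np.stack([w.ravel() for w in client_weights])
    # "Matching": assign each client (local model) to one of the global models.
    labels = KMeans(n_clusters=num_globals, n_init=10).fit_predict(X)
    # FedAvg-style mean of the member clients, computed per cluster.
    global_models = []
    for g in range(num_globals):
        members = X[labels == g]
        global_models.append(members.mean(axis=0).reshape(client_weights[0].shape))
    return global_models, labels

# Toy example: 8 clients whose parameters come from two distinct distributions (non-IID).
clients = [np.random.normal(loc=0.0, scale=0.1, size=(4, 4)) for _ in range(4)] + \
          [np.random.normal(loc=1.0, scale=0.1, size=(4, 4)) for _ in range(4)]
globals_, assignment = mlmg_round(clients, num_globals=2)
print(assignment)  # clients from the same distribution should share a global model
```

In this toy run, clients drawn from the two different parameter distributions should be matched to two different global models, mimicking how MLMG keeps heterogeneous users separated instead of forcing them into a single centralized model.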
