A survey of federated learning for edge computing: Research problems and solutions

[1]  Spyridon Bakas,et al.  Federated learning in medicine: facilitating multi-institutional collaborations without sharing patient data , 2020, Scientific Reports.

[2]  Zheng Zhang,et al.  MXNet: A Flexible and Efficient Machine Learning Library for Heterogeneous Distributed Systems , 2015, ArXiv.

[3]  Walid Saad,et al.  Federated Learning for Edge Networks: Resource Optimization and Incentive Mechanism , 2019, IEEE Communications Magazine.

[4]  Kannan Ramchandran,et al.  Byzantine-Robust Distributed Learning: Towards Optimal Statistical Rates , 2018, ICML.

[5]  Cynthia Dwork,et al.  Differential Privacy: A Survey of Results , 2008, TAMC.

[6]  Jinyuan Jia,et al.  Local Model Poisoning Attacks to Byzantine-Robust Federated Learning , 2019, USENIX Security Symposium.

[7]  Trishul M. Chilimbi,et al.  Project Adam: Building an Efficient and Scalable Deep Learning Training System , 2014, OSDI.

[8]  Kin K. Leung,et al.  Energy-Efficient Radio Resource Allocation for Federated Edge Learning , 2019, 2020 IEEE International Conference on Communications Workshops (ICC Workshops).

[9]  H. Vincent Poor,et al.  Federated Learning With Differential Privacy: Algorithms and Performance Analysis , 2019, IEEE Transactions on Information Forensics and Security.

[10]  Martin J. Wainwright,et al.  Information-theoretic lower bounds for distributed statistical estimation with communication constraints , 2013, NIPS.

[11]  Rahul Garg,et al.  Gradient descent with sparsification: an iterative algorithm for sparse recovery with restricted isometry property , 2009, ICML '09.

[12]  Xiao Jin,et al.  VAFL: a Method of Vertical Asynchronous Federated Learning , 2020, ArXiv.

[13]  Tian Li,et al.  Fair Resource Allocation in Federated Learning , 2019, ICLR.

[14]  Mehdi Bennis,et al.  Harnessing Wireless Channels for Scalable and Privacy-Preserving Federated Learning , 2020, IEEE Transactions on Communications.

[15]  Kenneth Heafield,et al.  Sparse Communication for Distributed Gradient Descent , 2017, EMNLP.

[16]  Rachid Guerraoui,et al.  Machine Learning with Adversaries: Byzantine Tolerant Gradient Descent , 2017, NIPS.

[17]  Vitaly Shmatikov,et al.  How To Backdoor Federated Learning , 2018, AISTATS.

[18]  Katsuhiro Temma,et al.  Cloudlets Activation Scheme for Scalable Mobile Edge Computing with Transmission Power Control and Virtual Machine Migration , 2018, IEEE Transactions on Computers.

[19]  Ali Farhadi,et al.  XNOR-Net: ImageNet Classification Using Binary Convolutional Neural Networks , 2016, ECCV.

[20]  Song Han,et al.  Learning both Weights and Connections for Efficient Neural Network , 2015, NIPS.

[21]  Yan Zhang,et al.  Differentially Private Asynchronous Federated Learning for Mobile Edge Computing in Urban Informatics , 2020, IEEE Transactions on Industrial Informatics.

[22]  Qun Li,et al.  A Survey of Virtual Machine Management in Edge Computing , 2019, Proceedings of the IEEE.

[23]  Marc'Aurelio Ranzato,et al.  Large Scale Distributed Deep Networks , 2012, NIPS.

[24]  Spyridon Bakas,et al.  Multi-Institutional Deep Learning Modeling Without Sharing Patient Data: A Feasibility Study on Brain Tumor Segmentation , 2018, BrainLes@MICCAI.

[25]  Leslie Lamport,et al.  The Byzantine Generals Problem , 1982, TOPL.

[26]  Ran El-Yaniv,et al.  Quantized Neural Networks: Training Neural Networks with Low Precision Weights and Activations , 2016, J. Mach. Learn. Res.

[27]  H. Vincent Poor,et al.  Scheduling Policies for Federated Learning in Wireless Networks , 2019, IEEE Transactions on Communications.

[28]  Qun Li,et al.  Security and Privacy Issues of Fog Computing: A Survey , 2015, WASA.

[29]  Amir Houmansadr,et al.  Comprehensive Privacy Analysis of Deep Learning: Passive and Active White-box Inference Attacks against Centralized and Federated Learning , 2018, 2019 IEEE Symposium on Security and Privacy (SP).

[30]  Yoshua Bengio,et al.  Generative Adversarial Nets , 2014, NIPS.

[31]  Qun Li,et al.  Defenses Against Byzantine Attacks in Distributed Deep Neural Networks , 2021, IEEE Transactions on Network Science and Engineering.

[32]  Georgios B. Giannakis,et al.  Communication-Efficient Distributed Learning via Lazily Aggregated Quantized Gradients , 2019, NeurIPS.

[33]  Dusit Niyato,et al.  Resource Allocation in Mobility-Aware Federated Learning Networks: A Deep Reinforcement Learning Approach , 2019, 2020 IEEE 6th World Forum on Internet of Things (WF-IoT).

[34]  Yiran Chen,et al.  Learning Structured Sparsity in Deep Neural Networks , 2016, NIPS.

[35]  Miao Pan,et al.  Federated Learning in Vehicular Edge Computing: A Selective Model Aggregation Approach , 2020, IEEE Access.

[36]  Tassilo Klein,et al.  Differentially Private Federated Learning: A Client Level Perspective , 2017, ArXiv.

[37]  Aryan Mokhtari,et al.  FedPAQ: A Communication-Efficient Federated Learning Method with Periodic Averaging and Quantization , 2019, AISTATS.

[38]  Blaine Nelson,et al.  Poisoning Attacks against Support Vector Machines , 2012, ICML.

[39]  J. Doug Tygar,et al.  Adversarial machine learning , 2011, AISec '11.

[40]  Victor C. M. Leung,et al.  Developing IoT applications in the Fog: A Distributed Dataflow approach , 2015, 2015 5th International Conference on the Internet of Things (IOT).

[41]  Giuseppe Ateniese,et al.  Deep Models Under the GAN: Information Leakage from Collaborative Deep Learning , 2017, CCS.

[42]  Blaise Agüera y Arcas,et al.  Communication-Efficient Learning of Deep Networks from Decentralized Data , 2016, AISTATS.

[43]  Dan Alistarh,et al.  QSGD: Communication-Optimal Stochastic Gradient Descent, with Applications to Training Neural Networks , 2016, ArXiv.

[44]  Shai Shalev-Shwartz,et al.  SGD Learns Over-parameterized Networks that Provably Generalize on Linearly Separable Data , 2017, ICLR.

[45]  Song Guo,et al.  Experience-Driven Computational Resource Allocation of Federated Learning by Deep Reinforcement Learning , 2020, 2020 IEEE International Parallel and Distributed Processing Symposium (IPDPS).

[46]  David Lillethun,et al.  Mobile fog: a programming model for large-scale applications on the internet of things , 2013, MCC '13.

[47]  Canh Dinh,et al.  Federated Learning Over Wireless Networks: Convergence Analysis and Resource Allocation , 2019, IEEE/ACM Transactions on Networking.

[48]  Qun Li,et al.  FABA: An Algorithm for Fast Aggregation against Byzantine Attacks in Distributed Neural Networks , 2019, IJCAI.

[49]  Qun Li,et al.  A Survey of Fog Computing: Concepts, Applications and Issues , 2015, Mobidata@MobiHoc.

[50]  Ramesh Raskar,et al.  Split learning for health: Distributed deep learning without sharing raw patient data , 2018, ArXiv.

[51]  Kin K. Leung,et al.  When Edge Meets Learning: Adaptive Control for Resource-Constrained Distributed Machine Learning , 2018, IEEE INFOCOM 2018 - IEEE Conference on Computer Communications.

[52]  Stephen J. Wright,et al.  Hogwild!: A Lock-Free Approach to Parallelizing Stochastic Gradient Descent , 2011, NIPS.

[53]  Geoffrey E. Hinton,et al.  ImageNet classification with deep convolutional neural networks , 2012, Commun. ACM.

[54]  Cong Xu,et al.  TernGrad: Ternary Gradients to Reduce Communication in Distributed Deep Learning , 2017, NIPS.

[55]  Kannan Ramchandran,et al.  Robust Federated Learning in a Heterogeneous Environment , 2019, ArXiv.

[56]  Ying-Chang Liang,et al.  Incentive Design for Efficient Federated Learning in Mobile Networks: A Contract Theory Approach , 2019, 2019 IEEE VTS Asia Pacific Wireless Communications Symposium (APWCS).

[57]  Wenqi Wei,et al.  Demystifying Membership Inference Attacks in Machine Learning as a Service , 2019, IEEE Transactions on Services Computing.

[58]  Kenneth T. Co,et al.  Byzantine-Robust Federated Machine Learning through Adaptive Model Averaging , 2019, ArXiv.

[59]  Song Han,et al.  Deep Leakage from Gradients , 2019, NeurIPS.

[60]  Richard Nock,et al.  Advances and Open Problems in Federated Learning , 2021, Found. Trends Mach. Learn.

[61]  Yoshua Bengio,et al.  BinaryConnect: Training Deep Neural Networks with binary weights during propagations , 2015, NIPS.

[62]  E. Modiano,et al.  Fairness and Optimal Stochastic Control for Heterogeneous Networks , 2005, IEEE/ACM Transactions on Networking.

[63]  Wazir Zada Khan,et al.  Edge computing: A survey , 2019, Future Gener. Comput. Syst.

[64]  Xin Qin,et al.  FedHealth: A Federated Transfer Learning Framework for Wearable Healthcare , 2019, IEEE Intelligent Systems.

[65]  Rich Caruana,et al.  Do Deep Nets Really Need to be Deep? , 2013, NIPS.

[66]  Haomiao Yang,et al.  Efficient and Privacy-Enhanced Federated Learning for Industrial Artificial Intelligence , 2020, IEEE Transactions on Industrial Informatics.

[67]  Huzefa Rangwala,et al.  Asynchronous Online Federated Learning for Edge Devices , 2019, ArXiv.

[68]  T. F. de Oliveira,et al.  Multivariate analysis applied in dataset of Poison Control Center of São Paulo, Brazil , 2020, Scientific Reports.

[69]  Ji Liu,et al.  DoubleSqueeze: Parallel Stochastic Gradient Descent with Double-Pass Error-Compensated Compression , 2019, ICML.

[70]  Dusit Niyato,et al.  Mobile Device Training Strategies in Federated Learning: An Evolutionary Game Approach , 2019, 2019 International Conference on Internet of Things (iThings) and IEEE Green Computing and Communications (GreenCom) and IEEE Cyber, Physical and Social Computing (CPSCom) and IEEE Smart Data (SmartData).

[71]  Prateek Mittal,et al.  Analyzing Federated Learning through an Adversarial Lens , 2018, ICML.

[72]  Ning Zhang,et al.  A Survey on Service Migration in Mobile Edge Computing , 2018, IEEE Access.

[73]  Anit Kumar Sahu,et al.  Federated Learning: Challenges, Methods, and Future Directions , 2019, IEEE Signal Processing Magazine.

[74]  Tudor Dumitras,et al.  Poison Frogs! Targeted Clean-Label Poisoning Attacks on Neural Networks , 2018, NeurIPS.

[75]  Amir Salman Avestimehr,et al.  Mitigating Byzantine Attacks in Federated Learning , 2020, ArXiv.

[76]  Sebastian U. Stich,et al.  Local SGD Converges Fast and Communicates Little , 2018, ICLR.

[77]  M. Hadi Amini,et al.  Distributed Sensing Using Smart End-User Devices: Pathway to Federated Learning for Autonomous IoT , 2019, 2019 International Conference on Computational Science and Computational Intelligence (CSCI).

[78]  Rachid Guerraoui,et al.  Asynchronous Byzantine Machine Learning (the case of SGD) , 2018, ICML.

[79]  Kaibin Huang,et al.  Broadband Analog Aggregation for Low-Latency Federated Edge Learning , 2018, IEEE Transactions on Wireless Communications.

[80]  Wei Li,et al.  A Dynamic Service Migration Mechanism in Edge Cognitive Computing , 2018, ACM Trans. Internet Techn.

[81]  Dong Yu,et al.  1-bit stochastic gradient descent and its application to data-parallel distributed training of speech DNNs , 2014, INTERSPEECH.

[82]  Nikko Strom,et al.  Scalable distributed DNN training using commodity GPU cloud computing , 2015, INTERSPEECH.

[83]  Mehmet Emre Gursoy,et al.  Data Poisoning Attacks Against Federated Learning Systems , 2020, ESORICS.

[84]  Dan Alistarh,et al.  Byzantine Stochastic Gradient Descent , 2018, NeurIPS.

[85]  Klaus-Robert Müller,et al.  Robust and Communication-Efficient Federated Learning From Non-i.i.d. Data , 2019, IEEE Transactions on Neural Networks and Learning Systems.

[86]  Mahadev Satyanarayanan,et al.  The Emergence of Edge Computing , 2017, Computer.

[87]  Tong Zhang,et al.  Accelerating Stochastic Gradient Descent using Predictive Variance Reduction , 2013, NIPS.

[88]  Ying-Chang Liang,et al.  Joint Service Pricing and Cooperative Relay Communication for Federated Learning , 2018, 2019 International Conference on Internet of Things (iThings) and IEEE Green Computing and Communications (GreenCom) and IEEE Cyber, Physical and Social Computing (CPSCom) and IEEE Smart Data (SmartData).

[89]  Masahiro Morikura,et al.  Hybrid-FL for Wireless Networks: Cooperative Learning Mechanism Using Non-IID Data , 2019, ICC 2020 - 2020 IEEE International Conference on Communications (ICC).

[90]  Deniz Gündüz,et al.  Hierarchical Federated Learning Across Heterogeneous Cellular Networks , 2019, ICASSP 2020 - 2020 IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP).

[91]  Sam Ade Jacobs,et al.  Communication Quantization for Data-Parallel Training of Deep Neural Networks , 2016, 2016 2nd Workshop on Machine Learning in HPC Environments (MLHPC).

[92]  Shiho Moriai,et al.  Privacy-Preserving Deep Learning via Additively Homomorphic Encryption , 2018, IEEE Transactions on Information Forensics and Security.

[93]  Chandra Thapa,et al.  SplitFed: When Federated Learning Meets Split Learning , 2020, ArXiv.

[94]  Gaurav Kapoor,et al.  Protection Against Reconstruction and Its Applications in Private Federated Learning , 2018, ArXiv.

[95]  Ji Liu,et al.  Gradient Sparsification for Communication-Efficient Distributed Optimization , 2017, NeurIPS.

[96]  Badih Ghazi,et al.  Scalable and Differentially Private Distributed Aggregation in the Shuffled Model , 2019, ArXiv.

[97]  Mohsen Guizani,et al.  Reliable Federated Learning for Mobile Networks , 2019, IEEE Wireless Communications.

[98]  Wei Zhang,et al.  AdaComp: Adaptive Residual Gradient Compression for Data-Parallel Distributed Training , 2017, AAAI.

[99]  Indranil Gupta,et al.  Generalized Byzantine-tolerant SGD , 2018, ArXiv.

[100]  Vitaly Shmatikov,et al.  Membership Inference Attacks Against Machine Learning Models , 2016, 2017 IEEE Symposium on Security and Privacy (SP).

[101]  Haomiao Yang,et al.  Towards Efficient and Privacy-Preserving Federated Deep Learning , 2019, ICC 2019 - 2019 IEEE International Conference on Communications (ICC).

[102]  Brendan Dolan-Gavitt,et al.  BadNets: Identifying Vulnerabilities in the Machine Learning Model Supply Chain , 2017, ArXiv.

[103]  Rui Zhang,et al.  A Hybrid Approach to Privacy-Preserving Federated Learning , 2018, Informatik Spektrum.

[104]  Yang Song,et al.  Beyond Inferring Class Representatives: User-Level Privacy Leakage From Federated Learning , 2018, IEEE INFOCOM 2019 - IEEE Conference on Computer Communications.

[105]  Raffaele Giaffreda,et al.  Edge computing in IoT context: Horizontal and vertical Linux container migration , 2017, 2017 Global Internet of Things Summit (GIoTS).

[106]  Xiaoyan Sun,et al.  Communication-Efficient Federated Deep Learning With Layerwise Asynchronous Model Update and Temporally Weighted Aggregation , 2019, IEEE Transactions on Neural Networks and Learning Systems.

[107]  Eryk Dutkiewicz,et al.  Energy Demand Prediction with Federated Learning for Electric Vehicle Networks , 2019, 2019 IEEE Global Communications Conference (GLOBECOM).

[108]  Jie Xu,et al.  Federated Learning for Healthcare Informatics , 2019, ArXiv.

[109]  Takayuki Nishio,et al.  Client Selection for Federated Learning with Heterogeneous Resources in Mobile Edge , 2018, ICC 2019 - 2019 IEEE International Conference on Communications (ICC).

[110]  Deniz Gündüz,et al.  Energy-Aware Analog Aggregation for Federated Learning with Redundant Data , 2019, ICC 2020 - 2020 IEEE International Conference on Communications (ICC).

[111]  Qun Li,et al.  eSGD: Communication Efficient Distributed Deep Learning on the Edge , 2018, HotEdge.

[112]  Vitaly Shmatikov,et al.  Exploiting Unintended Feature Leakage in Collaborative Learning , 2018, 2019 IEEE Symposium on Security and Privacy (SP).

[113]  Wei Pan,et al.  Towards Accurate Binary Convolutional Neural Network , 2017, NIPS.

[114]  Lili Su,et al.  Distributed Statistical Machine Learning in Adversarial Settings , 2017, Proc. ACM Meas. Anal. Comput. Syst..