Marco Canini | Ahmed M. Abdelmoniem | Chen-Yu Ho | Pantelis Papageorgiou | Muhammad Bilal
[1] Ian Goodfellow et al. Deep Learning with Differential Privacy, 2016, CCS.
[2] Aryan Mokhtari et al. FedPAQ: A Communication-Efficient Federated Learning Method with Periodic Averaging and Quantization, 2019, AISTATS.
[3] Seunghak Lee et al. More Effective Distributed ML via a Stale Synchronous Parallel Parameter Server, 2013, NIPS.
[4] R. Jain. Throughput Fairness Index: An Explanation, 1999.
[5] Sreeram Kannan et al. Improving Federated Learning Personalization via Model Agnostic Meta Learning, 2019, ArXiv.
[6] Blaise Agüera y Arcas et al. Communication-Efficient Learning of Deep Networks from Decentralized Data, 2016, AISTATS.
[7] Gregory Cohen et al. EMNIST: Extending MNIST to Handwritten Letters, 2017, IJCNN.
[8] Anit Kumar Sahu et al. Federated Optimization in Heterogeneous Networks, 2018, MLSys.
[9] Sebastian Caldas et al. LEAF: A Benchmark for Federated Settings, 2018, ArXiv.
[11] Anit Kumar Sahu et al. Federated Learning: Challenges, Methods, and Future Directions, 2019, IEEE Signal Processing Magazine.
[12] Yibo Zhu et al. A Generic Communication Scheduler for Distributed DNN Training Acceleration, 2019, SOSP.
[13] Xu Sun et al. Adaptive Gradient Methods with Dynamic Bound of Learning Rate, 2019, ICLR.
[14] Xiaogang Wang et al. Deep Learning Face Attributes in the Wild, 2015, ICCV.
[15] Kevin Leyton-Brown et al. An Efficient Approach for Assessing Hyperparameter Importance, 2014, ICML.
[16] Dario Rossi et al. Heterogeneous Data-Aware Federated Learning, 2020, ArXiv.
[17] Jiawei Jiang et al. Heterogeneity-aware Distributed Parameter Servers, 2017, SIGMOD.
[18] Daniel Rueckert et al. A Generic Framework for Privacy Preserving Deep Learning, 2018, ArXiv.
[19] Ligang He et al. Accelerating Federated Learning Over Reliability-Agnostic Clients in Mobile Edge Computing Systems, 2020, IEEE Transactions on Parallel and Distributed Systems.
[20] Hao Wang et al. Optimizing Federated Learning on Non-IID Data with Reinforcement Learning, 2020, IEEE INFOCOM.
[21] Mehryar Mohri et al. Agnostic Federated Learning, 2019, ICML.
[22] A. Krishnamurthy et al. PLink: Discovering and Exploiting Datacenter Network Locality for Efficient Cloud-based Distributed Training, 2020.
[23] Wei Zhang et al. AdaComp: Adaptive Residual Gradient Compression for Data-Parallel Distributed Training, 2017, AAAI.
[24] Samy Bengio et al. Revisiting Distributed Synchronous SGD, 2016, ArXiv.
[25] H. Brendan McMahan et al. Learning Differentially Private Recurrent Language Models, 2017, ICLR.
[26] Tian Li et al. Fair Resource Allocation in Federated Learning, 2019, ICLR.
[27] Sarvar Patel et al. Practical Secure Aggregation for Privacy-Preserving Machine Learning, 2017, IACR Cryptology ePrint Archive.
[28] Alexander Sergeev et al. Horovod: Fast and Easy Distributed Deep Learning in TensorFlow, 2018, ArXiv.
[29] Vitaly Shmatikov et al. Exploiting Unintended Feature Leakage in Collaborative Learning, 2019, IEEE Symposium on Security and Privacy (SP).
[30] Xin Yuan et al. Bandwidth Optimal All-Reduce Algorithms for Clusters of Workstations, 2009, Journal of Parallel and Distributed Computing.
[31] Hubert Eichner et al. Federated Learning for Mobile Keyboard Prediction, 2018, ArXiv.
[32] Hubert Eichner et al. Towards Federated Learning at Scale: System Design, 2019, SysML.
[33] Amir Houmansadr et al. Comprehensive Privacy Analysis of Deep Learning: Passive and Active White-box Inference Attacks against Centralized and Federated Learning, 2019, IEEE Symposium on Security and Privacy (SP).
[34] Tianjian Chen et al. Federated Machine Learning: Concept and Applications, 2019.
[35] Hubert Eichner et al. Applied Federated Learning: Improving Google Keyboard Query Suggestions, 2018, ArXiv.
[36] Vitaly Shmatikov et al. How To Backdoor Federated Learning, 2018, AISTATS.
[37] Xiaoyan Sun et al. Communication-Efficient Federated Deep Learning With Layerwise Asynchronous Model Update and Temporally Weighted Aggregation, 2019, IEEE Transactions on Neural Networks and Learning Systems.
[38] Takayuki Nishio et al. Client Selection for Federated Learning with Heterogeneous Resources in Mobile Edge, 2019, IEEE ICC.
[39] Carlo Luschi et al. Revisiting Small Batch Training for Deep Neural Networks, 2018, ArXiv.
[40] Alexander J. Smola et al. Scaling Distributed Machine Learning with the Parameter Server, 2014, OSDI.
[41] R. A. Fisher. The Design of Experiments, 1935.
[42] Jun Wang et al. SmartPC: Hierarchical Pace Control in Real-Time Federated Learning System, 2019, IEEE RTSS.
[43] Niranjan A. Subrahmanya et al. Training Keyword Spotting Models on Non-IID Data with Federated Learning, 2020, INTERSPEECH.
[44] Ameet Talwalkar et al. Federated Multi-Task Learning, 2017, NIPS.
[45] Peter Richtárik et al. Federated Learning: Strategies for Improving Communication Efficiency, 2016, ArXiv.
[46] Yi Zhou et al. Towards Taming the Resource and Data Heterogeneity in Federated Learning, 2019, OpML.
[48] Pan Hui et al. Privacy-Preserving Asynchronous Federated Learning Mechanism for Edge Network Computing, 2020, IEEE Access.
[49] Klaus-Robert Müller et al. Robust and Communication-Efficient Federated Learning From Non-i.i.d. Data, 2019, IEEE Transactions on Neural Networks and Learning Systems.
[50] Stephen J. Wright et al. Hogwild: A Lock-Free Approach to Parallelizing Stochastic Gradient Descent, 2011, NIPS.