Multi-Stage Asynchronous Federated Learning With Adaptive Differential Privacy
Cong Zhao | Xuebin Ren | Yanan Li | Shusen Yang | Liang Shi
[1] Semeen Rehman,et al. Reliable and Resilient AI and IoT-based Personalised Healthcare Services: A Survey , 2022, IEEE Access.
[2] Sebastian U. Stich,et al. Sharper Convergence Guarantees for Asynchronous SGD for Distributed and Federated Learning , 2022, NeurIPS.
[3] Jingren Zhou,et al. FederatedScope: A Flexible Federated Learning Platform for Heterogeneity , 2022, Proc. VLDB Endow..
[4] Hamid Reza Feyzmahdavian,et al. Delay-adaptive step-sizes for asynchronous learning , 2022, ICML.
[5] Xiyao Ma,et al. Beyond Class-Level Privacy Leakage: Breaking Record-Level Privacy in Federated Learning , 2022, IEEE Internet of Things Journal.
[6] Prateek Mittal,et al. SparseFed: Mitigating Model Poisoning Attacks in Federated Learning with Sparsification , 2021, AISTATS.
[7] Xintao Wu,et al. Removing Disparate Impact on Model Accuracy in Differentially Private Stochastic Gradient Descent , 2021, KDD.
[8] D. Rueckert,et al. Medical imaging deep learning with differential privacy , 2021, Scientific Reports.
[9] Assaf Schuster,et al. Learning Under Delayed Feedback: Implicitly Adapting to Gradient Delays , 2021, ArXiv.
[10] Amit Daniely,et al. Asynchronous Stochastic Optimization Robust to Arbitrary Delays , 2021, NeurIPS.
[11] Virginia Smith,et al. On Large-Cohort Training for Federated Learning , 2021, NeurIPS.
[12] H. Vincent Poor,et al. User-Level Privacy-Preserving Federated Learning: Analysis and Performance Optimization , 2021, IEEE Transactions on Mobile Computing.
[13] Emiliano De Cristofaro,et al. Local and Central Differential Privacy for Robustness and Privacy in Federated Learning , 2020, NDSS.
[14] Kartik Sreenivasan,et al. Attack of the Tails: Yes, You Really Can Backdoor Federated Learning , 2020, NeurIPS.
[15] Zhiwei Steven Wu,et al. Understanding Gradient Clipping in Private SGD: A Geometric Perspective , 2020, NeurIPS.
[16] Ananda Theertha Suresh,et al. Can You Really Backdoor Federated Learning? , 2019, ArXiv.
[17] H. Vincent Poor,et al. Federated Learning With Differential Privacy: Algorithms and Performance Analysis , 2019, IEEE Transactions on Information Forensics and Security.
[18] Ziye Zhou,et al. Measure Contribution of Participants in Federated Learning , 2019, 2019 IEEE International Conference on Big Data (Big Data).
[19] Anit Kumar Sahu,et al. Federated Learning: Challenges, Methods, and Future Directions , 2019, IEEE Signal Processing Magazine.
[20] Sashank J. Reddi,et al. AdaCliP: Adaptive Clipping for Private SGD , 2019, ArXiv.
[21] Suvrit Sra,et al. Why Gradient Clipping Accelerates Training: A Theoretical Justification for Adaptivity , 2019, ICLR.
[22] H. B. McMahan,et al. Differentially Private Learning with Adaptive Clipping , 2019, NeurIPS.
[23] Rui Zhang,et al. A Hybrid Approach to Privacy-Preserving Federated Learning , 2018, Informatik Spektrum.
[24] Peter Bloem,et al. Three Tools for Practical Differential Privacy , 2018, ArXiv.
[25] Yang Song,et al. Beyond Inferring Class Representatives: User-Level Privacy Leakage From Federated Learning , 2018, IEEE INFOCOM 2019 - IEEE Conference on Computer Communications.
[26] Sebastian Caldas,et al. LEAF: A Benchmark for Federated Settings , 2018, ArXiv.
[27] Daniel Soudry,et al. Post training 4-bit quantization of convolutional networks for rapid-deployment , 2018, NeurIPS.
[28] Borja Balle,et al. Privacy Amplification by Subsampling: Tight Analyses via Couplings and Divergences , 2018, NeurIPS.
[29] Sanjiv Kumar,et al. cpSGD: Communication-efficient and differentially-private distributed SGD , 2018, NeurIPS.
[30] Shiho Moriai,et al. Privacy-Preserving Deep Learning via Additively Homomorphic Encryption , 2018, IEEE Transactions on Information Forensics and Security.
[31] Kin K. Leung,et al. Adaptive Federated Learning in Resource Constrained Edge Computing Systems , 2018, IEEE Journal on Selected Areas in Communications.
[32] Parijat Dube,et al. Slow and Stale Gradients Can Win the Race , 2018, IEEE Journal on Selected Areas in Information Theory.
[33] Tassilo Klein,et al. Differentially Private Federated Learning: A Client Level Perspective , 2017, ArXiv.
[34] H. Brendan McMahan,et al. Learning Differentially Private Recurrent Language Models , 2017, ICLR.
[35] Wotao Yin,et al. More Iterations per Second, Same Quality - Why Asynchronous Algorithms may Drastically Outperform Traditional Ones , 2017, ArXiv.
[36] Wotao Yin,et al. Asynchronous Coordinate Descent under More Realistic Assumptions , 2017, NIPS.
[37] Giuseppe Ateniese,et al. Deep Models Under the GAN: Information Leakage from Collaborative Deep Learning , 2017, CCS.
[38] Nenghai Yu,et al. Asynchronous Stochastic Gradient Descent with Delay Compensation , 2016, ICML.
[39] J. Morris Chang,et al. Reconstruction Attacks Against Mobile-Based Continuous Authentication Systems in the Cloud , 2016, IEEE Transactions on Information Forensics and Security.
[40] Ian Goodfellow,et al. Deep Learning with Differential Privacy , 2016, CCS.
[41] Blaise Agüera y Arcas,et al. Communication-Efficient Learning of Deep Networks from Decentralized Data , 2016, AISTATS.
[42] Somesh Jha,et al. Model Inversion Attacks that Exploit Confidence Information and Basic Countermeasures , 2015, CCS.
[43] Vitaly Shmatikov,et al. Privacy-preserving deep learning , 2015, 2015 53rd Annual Allerton Conference on Communication, Control, and Computing (Allerton).
[44] Michael I. Jordan,et al. Perturbed Iterate Analysis for Asynchronous Stochastic Optimization , 2015, SIAM J. Optim..
[45] James T. Kwok,et al. Asynchronous Distributed ADMM for Consensus Optimization , 2014, ICML.
[46] Sébastien Bubeck. Convex Optimization: Algorithms and Complexity , 2014, Found. Trends Mach. Learn..
[47] Stephen J. Wright,et al. Asynchronous Stochastic Coordinate Descent: Parallelism and Convergence Properties , 2014, SIAM J. Optim..
[48] Marc'Aurelio Ranzato,et al. Large Scale Distributed Deep Networks , 2012, NIPS.
[49] Stephen J. Wright,et al. Hogwild: A Lock-Free Approach to Parallelizing Stochastic Gradient Descent , 2011, NIPS.
[50] Cynthia Dwork,et al. Calibrating Noise to Sensitivity in Private Data Analysis , 2006, TCC.
[52] Hongli Xu,et al. FedSA: A Semi-Asynchronous Federated Learning Mechanism in Heterogeneous Edge Computing , 2021, IEEE Journal on Selected Areas in Communications.
[53] Chulin Xie,et al. CRFL: Certifiably Robust Federated Learning against Backdoor Attacks , 2021, ICML.
[54] Yang Liu,et al. BatchCrypt: Efficient Homomorphic Encryption for Cross-Silo Federated Learning , 2020, USENIX ATC.