Confederated Learning: Federated Learning With Decentralized Edge Servers
Qing Ling, Jun Fang, Xiaojun Yuan, Bin Wang, Hongbin Li
[1] Ermin Wei,et al. FedHybrid: A Hybrid Federated Optimization Method for Heterogeneous Clients , 2023, IEEE Transactions on Signal Processing.
[2] Le Liang,et al. Decentralized Federated Learning With Unreliable Communications , 2021, IEEE Journal of Selected Topics in Signal Processing.
[3] Wei Liu,et al. Decentralized Federated Learning: Balancing Communication and Computing Costs , 2021, IEEE Transactions on Signal and Information Processing over Networks.
[4] Shiqiang Wang,et al. Demystifying Why Local Aggregation Helps: Convergence Analysis of Hierarchical SGD , 2020, AAAI.
[5] Kathe P. Fox,et al. Confederated Learning in Healthcare: Training Machine Learning Models Using Disconnected Data Separated by Individual, Data Type and Identity for Large-Scale Health System Intelligence , 2020, J. Biomed. Informatics.
[6] Geoffrey Y. Li,et al. Communication-Efficient ADMM-based Federated Learning , 2021, ArXiv.
[7] Ermin Wei,et al. FedHybrid: A Hybrid Primal-Dual Algorithm Framework for Federated Optimization , 2021, ArXiv:2106.01279.
[8] Andreas Keller,et al. Swarm Learning for decentralized and confidential clinical machine learning , 2021, Nature.
[9] Aruna Seneviratne,et al. Federated Learning for Internet of Things: A Comprehensive Survey , 2021, IEEE Communications Surveys & Tutorials.
[10] Wotao Yin,et al. FedPD: A Federated Learning Framework With Adaptivity to Non-IID Data , 2020, IEEE Transactions on Signal Processing.
[11] Gauri Joshi,et al. Cooperative SGD: A Unified Framework for the Design and Analysis of Local-Update SGD Algorithms , 2021, J. Mach. Learn. Res..
[12] Lam M. Nguyen,et al. Federated Learning with Randomized Douglas-Rachford Splitting Methods , 2021, ArXiv.
[13] Rong-Rong Chen,et al. Local Averaging Helps: Hierarchical Federated Learning and Convergence Analysis , 2020, ArXiv.
[14] Tengyu Ma,et al. Federated Accelerated Stochastic Gradient Descent , 2020, NeurIPS.
[15] Martin J. Wainwright,et al. FedSplit: An algorithmic framework for fast federated optimization , 2020, NeurIPS.
[16] Osvaldo Simeone,et al. Decentralized Federated Learning via SGD over Wireless D2D Networks , 2020, 2020 IEEE 21st International Workshop on Signal Processing Advances in Wireless Communications (SPAWC).
[17] Monica Nicoli,et al. Federated Learning With Cooperating Devices: A Consensus Approach for Massive IoT Networks , 2019, IEEE Internet of Things Journal.
[18] U. Khan,et al. Variance-Reduced Decentralized Stochastic Optimization With Accelerated Convergence , 2019, IEEE Transactions on Signal Processing.
[19] Zhisheng Niu,et al. Device Scheduling with Fast Convergence for Wireless Federated Learning , 2019, ICC 2020 - 2020 IEEE International Conference on Communications (ICC).
[20] Sashank J. Reddi,et al. SCAFFOLD: Stochastic Controlled Averaging for Federated Learning , 2019, ICML.
[21] Li Chen,et al. Accelerating Federated Learning via Momentum Gradient Descent , 2019, IEEE Transactions on Parallel and Distributed Systems.
[22] H. Vincent Poor,et al. Scheduling Policies for Federated Learning in Wireless Networks , 2019, IEEE Transactions on Communications.
[23] Xiang Li,et al. On the Convergence of FedAvg on Non-IID Data , 2019, ICLR.
[24] Zhi Ding,et al. Federated Learning via Over-the-Air Computation , 2018, IEEE Transactions on Wireless Communications.
[25] Anit Kumar Sahu,et al. Federated Optimization in Heterogeneous Networks , 2018, MLSys.
[26] Yi Zhou,et al. Communication-efficient algorithms for decentralized and stochastic optimization , 2017, Mathematical Programming.
[27] M. Mahdavi,et al. On the Convergence of Local Descent Methods in Federated Learning , 2019, ArXiv.
[28] Farzin Haddadpour,et al. Local SGD with Periodic Averaging: Tighter Analysis and Adaptive Synchronization , 2019, NeurIPS.
[29] Enhong Chen,et al. Variance Reduced Local SGD with Lower Communication Complexity , 2019, ArXiv.
[30] Kenneth D. Mandl,et al. Confederated Machine Learning on Horizontally and Vertically Separated Medical Data for Large-Scale Health System Intelligence , 2019, ArXiv.
[31] István Hegedüs,et al. Gossip Learning as a Decentralized Alternative to Federated Learning , 2019, DAIS.
[32] Martin Jaggi,et al. Decentralized Stochastic Optimization and Gossip Algorithms with Compressed Communication , 2019, ICML.
[33] Tara Javidi,et al. Peer-to-peer Federated Learning on Graphs , 2019, ArXiv.
[34] Anit Kumar Sahu,et al. On the Convergence of Federated Optimization in Heterogeneous Networks , 2018, ArXiv.
[35] G. Giannakis,et al. Hybrid ADMM: a unifying and fast approach to decentralized optimization , 2018, EURASIP J. Adv. Signal Process..
[36] Jun Fang,et al. A Proximal ADMM for Decentralized Composite Optimization , 2018, IEEE Signal Processing Letters.
[37] Wei Shi,et al. Federated learning of predictive models from federated Electronic Health Records , 2018, Int. J. Medical Informatics.
[38] Chinmay Hegde,et al. Collaborative Deep Learning in Fixed Topology Networks , 2017, NIPS.
[39] Blaise Agüera y Arcas,et al. Communication-Efficient Learning of Deep Networks from Decentralized Data , 2016, AISTATS.
[40] Peter Richtárik,et al. Federated Learning: Strategies for Improving Communication Efficiency , 2016, ArXiv.
[41] Wei Shi,et al. EXTRA: An Exact First-Order Algorithm for Decentralized Consensus Optimization , 2014, SIAM J. Optim..
[42] Francis Bach,et al. SAGA: A Fast Incremental Gradient Method With Support for Non-Strongly Convex Composite Objectives , 2014, NIPS.
[43] Qing Ling,et al. On the Linear Convergence of the ADMM in Decentralized Consensus Optimization , 2013, IEEE Transactions on Signal Processing.
[44] Bingsheng He,et al. On the O(1/n) Convergence Rate of the Douglas-Rachford Alternating Direction Method , 2012, SIAM J. Numer. Anal..