Mehdi Bennis | Seong-Lyun Kim | Jihong Park | Hyesung Kim | Seungeun Oh | Eunjeong Jeong