Online Client Scheduling for Fast Federated Learning

Federated learning (FL) enables clients to collaboratively learn a shared task while preserving data privacy, and can be adopted at the edge of wireless networks to improve edge intelligence. In this letter, we aim to minimize the training latency of a wireless FL system for a given training loss through client scheduling. Instead of assuming that prior information about the wireless channel state and local computing power of the clients is available, we consider a more practical scenario in which such prior information is unknown. We first reformulate the client scheduling problem as a multi-armed bandit problem and then propose an online scheduling scheme based on the $\epsilon$-greedy algorithm to achieve a tradeoff between exploration and exploitation. In addition, the proposed client scheduling scheme reduces the number of training rounds and the time interval per round simultaneously by jointly considering the significance of local updates and the delay of each client. Simulation results show that, in the case of non-independent and identically distributed data, the proposed scheme can halve the training time compared to a scheme that only considers the significance of local updates, and achieves more than 20% higher test accuracy than a scheme that only considers the per-round time consumption of each client.
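To illustrate the exploration-exploitation tradeoff the abstract describes, the following is a minimal sketch of $\epsilon$-greedy client selection. The reward estimates, client names, and the significance-per-delay scoring are hypothetical placeholders, not the letter's actual formulation:

```python
import random

def epsilon_greedy_schedule(clients, reward_est, epsilon=0.1, rng=random):
    """Select one client per round: with probability epsilon pick a random
    client (exploration); otherwise pick the client with the highest
    estimated reward (exploitation)."""
    if rng.random() < epsilon:
        return rng.choice(clients)
    return max(clients, key=lambda c: reward_est[c])

# Hypothetical per-client reward estimates, e.g. update significance
# divided by expected per-round delay (both learned online in practice).
reward_est = {"c0": 0.8 / 2.0, "c1": 0.5 / 0.5, "c2": 0.9 / 3.0}
clients = list(reward_est)

# With epsilon=0 the scheduler always exploits the best-known client.
best = epsilon_greedy_schedule(clients, reward_est, epsilon=0.0)
```

In an online setting, `reward_est` would be updated after each round from the observed training latency and loss reduction, which is what allows the scheduler to operate without prior channel-state or compute information.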