Channel-Adaptive Quantization for Wireless Federated Learning

As a popular distributed machine learning framework based on stochastic gradient descent (SGD), federated learning enables edge devices to compute stochastic gradients locally and upload them to an edge server for updating the global AI model. However, federated learning faces two problems. First, transmitting high-dimensional stochastic gradients from the edge devices to the edge server creates a serious communication bottleneck. Second, the fading and noise of wireless channels may distort the transmitted stochastic gradients. To address these issues, we adopt scalar quantization for SGD that takes the dynamics of wireless channels into account, and we analyze how wireless channels and quantization affect the convergence rate. The quantization error and channel effects are then characterized in terms of the received signal-to-noise ratio (SNR). We formulate and solve a sum-SNR maximization problem via quantization resource allocation, with the goal of making quantization and transmission as reliable as possible. Simulations show that our scheme outperforms the benchmark schemes.
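
To make the two ingredients concrete, below is a minimal Python sketch, not the paper's exact formulation, of (i) unbiased stochastic scalar quantization of a gradient vector and (ii) a simple channel-aware bit-allocation heuristic that gives devices with stronger channel gains a larger share of a total bit budget. The function names stochastic_scalar_quantize and allocate_bits, and the proportional allocation rule, are illustrative assumptions standing in for the sum-SNR-maximizing allocation derived in the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

def stochastic_scalar_quantize(g, bits):
    """Unbiased stochastic scalar quantization of g onto 2**bits levels.

    Illustrative sketch: a uniform grid between min(g) and max(g),
    with stochastic rounding so the dequantized value equals g in
    expectation. Not necessarily the paper's exact quantizer.
    """
    levels = 2 ** bits - 1
    g_min, g_max = float(g.min()), float(g.max())
    scale = (g_max - g_min) / levels if g_max > g_min else 1.0
    x = (g - g_min) / scale            # position on the quantization grid
    low = np.floor(x)
    # Round up with probability equal to the fractional part (unbiased).
    q = low + (rng.random(g.shape) < (x - low))
    return g_min + q * scale           # dequantized gradient

def allocate_bits(h, total_bits, min_bits=2):
    """Heuristic channel-adaptive allocation (assumption, not the paper's
    optimal rule): after a minimum per-device budget, split the remaining
    bits in proportion to the channel gains h_k."""
    h = np.asarray(h, dtype=float)
    extra = total_bits - min_bits * len(h)
    shares = np.floor(extra * h / h.sum()).astype(int)
    return min_bits + shares

grad = rng.standard_normal(8)                      # one device's stochastic gradient
bits = allocate_bits(h=[0.9, 0.3, 0.6], total_bits=18)
print("bits per device:", bits)
print("quantized grad :", stochastic_scalar_quantize(grad, bits[0]))
```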