Recent years have witnessed the rapid growth of smart devices and mobile applications. However, mobile applications are typically computation-intensive and delay-sensitive, while User Devices (UDs) are usually resource-limited. Mobile Edge Computing (MEC) has been proposed as a promising paradigm to mitigate this tension: a UD’s tasks can be executed either locally on the device itself or remotely on an edge server via computation offloading. Many efficient computation offloading scheduling approaches have been proposed, but most of them rely on centralized scheduling, which scales poorly in large-scale MEC systems. To address this issue, this paper proposes a distributed scheduling framework that leverages the idea of ‘centralized training and distributed scheduling’. Specifically, Actor-Critic reinforcement learning is adopted to build the framework, where the Actor and the Critic play the roles of distributed scheduling and centralized training, respectively. Extensive simulations are conducted, and the experimental results verify the effectiveness and efficiency of the proposed framework.
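To make the ‘centralized training and distributed scheduling’ split concrete, the sketch below shows one minimal way such an Actor-Critic setup can be wired: each UD owns a lightweight Actor that maps only its local observation to a binary offloading decision, while a centralized Critic consumes the joint observation during training only. This is an illustrative sketch, not the paper’s exact design; the network sizes, the binary action space, and the synthetic observation/reward model are all assumptions made for the example.

```python
# Minimal sketch of 'centralized training, distributed scheduling' with Actor-Critic.
# Assumptions (not from the paper): binary action (0 = local, 1 = offload), random
# per-UD observations, and a toy reward that favors offloading large tasks but
# penalizes edge congestion.
import torch
import torch.nn as nn

N_UDS, OBS_DIM = 4, 3  # number of devices; per-UD observation (e.g., task size, queue, channel)

class Actor(nn.Module):
    """Per-UD policy: local observation -> offloading decision. Runs on the device."""
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(nn.Linear(OBS_DIM, 32), nn.ReLU(), nn.Linear(32, 2))

    def forward(self, obs):
        return torch.distributions.Categorical(logits=self.net(obs))

class Critic(nn.Module):
    """Centralized value function: joint observation of all UDs -> scalar value."""
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(nn.Linear(N_UDS * OBS_DIM, 64), nn.ReLU(), nn.Linear(64, 1))

    def forward(self, joint_obs):
        return self.net(joint_obs).squeeze(-1)

actors = [Actor() for _ in range(N_UDS)]
critic = Critic()
params = [p for a in actors for p in a.parameters()] + list(critic.parameters())
opt = torch.optim.Adam(params, lr=1e-3)

for step in range(200):
    obs = torch.rand(N_UDS, OBS_DIM)                    # synthetic local observations
    dists = [a(o) for a, o in zip(actors, obs)]         # distributed: each Actor sees only its own obs
    acts = torch.stack([d.sample() for d in dists])
    # Toy reward: offloading large tasks helps, but simultaneous offloading congests the edge.
    reward = (acts.float() * obs[:, 0]).sum() - 0.5 * acts.float().sum() ** 2 / N_UDS
    value = critic(obs.flatten())                       # centralized: Critic sees the joint state
    advantage = (reward - value).detach()
    logp = torch.stack([d.log_prob(a) for d, a in zip(dists, acts)]).sum()
    loss = -logp * advantage + (reward - value) ** 2    # policy-gradient loss + value loss
    opt.zero_grad(); loss.backward(); opt.step()
```

The property mirrored here is the one the abstract describes: at scheduling time each Actor needs only its own observation, so decisions are fully distributed, while the Critic, which requires the joint state, is used only during training and can be discarded at deployment.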