Reinforcement Learning-Based Mobile Offloading for Edge Computing Against Jamming and Interference

Mobile edge computing systems improve the performance of computation-intensive applications on mobile devices, but they have to resist jamming attacks and heavy interference. In this paper, we present a reinforcement learning-based mobile offloading scheme for edge computing against jamming attacks and interference, which applies safe reinforcement learning to avoid risky offloading policies that fail to meet the computational latency requirements of the tasks. The scheme enables a mobile device to choose the edge device, the transmit power, and the offloading rate so as to improve its utility, which incorporates the sharing gain, the computational latency, the energy consumption, and the signal-to-interference-plus-noise ratio (SINR) of the offloading signals, without knowing the task generation model, the edge computing model, or the jamming/interference model. We also design a deep reinforcement learning-based mobile offloading scheme for edge computing that uses an actor network to choose the offloading policy and a critic network to update the actor network weights, further improving the computational performance. We analyze the computational complexity and derive a performance bound on the computational latency and energy consumption based on the Nash equilibrium of the mobile offloading game. Simulation results show that this scheme reduces both the computational latency and the energy consumption.
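
The following is a minimal sketch, not the authors' implementation, of the reinforcement learning-based offloading decision loop described above. It assumes a discretized action space over (edge device, transmit power level, offloading rate), a quantized state built from the observed SINR and task size, illustrative utility weights, and a hypothetical latency bound used as the safe-RL risk criterion; all names, sizes, and constants below are assumptions for illustration.

```python
# Sketch of an RL-based offloading agent: tabular Q-learning over a
# discretized (edge device, transmit power, offloading rate) action space,
# with a simple safe-RL filter that avoids actions previously observed to
# violate the latency requirement. Parameter values are illustrative only.
import numpy as np

N_EDGE, N_POWER, N_RATE = 3, 4, 4          # candidate edge devices, power levels, offloading rates (assumed)
N_STATES = 16                               # quantized (SINR, task size) states (assumed granularity)
actions = [(e, p, r) for e in range(N_EDGE)
                     for p in range(N_POWER)
                     for r in range(N_RATE)]
Q = np.zeros((N_STATES, len(actions)))      # tabular Q-function over state-action pairs

ALPHA, GAMMA, EPS = 0.1, 0.9, 0.1           # learning rate, discount factor, exploration rate (assumed)
LATENCY_BOUND = 0.5                         # latency requirement in seconds (assumed) used as the risk threshold

def utility(gain, latency, energy, sinr, w=(1.0, 1.0, 0.5, 0.2)):
    """Utility combining sharing gain, computational latency, energy consumption
    and SINR of the offloading signals; the weights w are illustrative."""
    return w[0] * gain - w[1] * latency - w[2] * energy + w[3] * sinr

def choose_action(state, risky):
    """Epsilon-greedy action choice that skips actions flagged as risky
    (a simple stand-in for the safe-RL policy filter)."""
    candidates = [a for a in range(len(actions)) if a not in risky] or list(range(len(actions)))
    if np.random.rand() < EPS:
        return int(np.random.choice(candidates))
    return max(candidates, key=lambda a: Q[state, a])

def q_update(s, a, reward, s_next):
    """Standard Q-learning update of the offloading policy."""
    Q[s, a] += ALPHA * (reward + GAMMA * Q[s_next].max() - Q[s, a])
```

In each time slot the mobile device would quantize its observed SINR and task size into a state index, call choose_action to pick the edge device, transmit power, and offloading rate, offload accordingly, observe the resulting sharing gain, latency, and energy consumption, add the action to the risky set if the latency exceeds LATENCY_BOUND, and apply q_update with the computed utility as the reward. The deep variant described in the abstract would replace the table Q with an actor network that outputs the offloading policy and a critic network that estimates its value to update the actor weights.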
