Mobile Traffic Offloading with Forecasting using Deep Reinforcement Learning

With the explosive growth in mobile traffic demand, offloading cellular traffic to small base stations is a promising way to improve system efficiency. As system complexity increases, network operators face severe challenges and are turning to machine learning-based solutions. In this work, we propose an energy-aware mobile traffic offloading scheme for heterogeneous networks that jointly applies deep Q-network (DQN) decision making and advanced traffic demand forecasting. The base station control model is trained and verified on an open dataset from a major telecom operator. The performance evaluation shows that the DQN-based methods outperform the alternatives at all levels of mobile traffic demand, and the advantage of accurate traffic prediction becomes more significant as traffic demand grows.
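To make the combination of forecasting and DQN control concrete, the sketch below is a minimal, self-contained illustration rather than the authors' implementation: a DQN agent selects an on/off pattern for a set of small base stations, with the forecast next-slot demand included in its state. The number of cells, the toy power and reward model, and the placeholder environment dynamics are all assumptions made purely for illustration.

# Minimal illustrative DQN sketch (assumed setup, not the paper's model):
# the agent picks which small base stations to keep active, trading off
# served traffic against energy consumption.
import random
from collections import deque

import torch
import torch.nn as nn
import torch.optim as optim

N_SMALL_CELLS = 4                      # assumed number of small base stations
STATE_DIM = N_SMALL_CELLS + 1          # per-cell load + forecast aggregate demand
N_ACTIONS = 2 ** N_SMALL_CELLS         # each action = one on/off pattern

class QNet(nn.Module):
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(STATE_DIM, 64), nn.ReLU(),
            nn.Linear(64, 64), nn.ReLU(),
            nn.Linear(64, N_ACTIONS),
        )
    def forward(self, x):
        return self.net(x)

def step_env(state, action):
    """Toy environment: reward trades off served traffic against energy use."""
    pattern = [(action >> i) & 1 for i in range(N_SMALL_CELLS)]
    demand = state[-1].item()                             # forecast demand component
    capacity = sum(pattern) / N_SMALL_CELLS
    served = min(demand, capacity)
    energy = 0.2 + 0.8 * sum(pattern) / N_SMALL_CELLS     # assumed power model
    reward = served - 0.5 * energy                        # assumed reward weighting
    next_state = torch.rand(STATE_DIM)                    # placeholder dynamics
    return next_state, reward

q_net, target_net = QNet(), QNet()
target_net.load_state_dict(q_net.state_dict())
optimizer = optim.Adam(q_net.parameters(), lr=1e-3)
replay = deque(maxlen=10_000)
gamma, eps = 0.99, 0.1

state = torch.rand(STATE_DIM)
for t in range(1000):
    # epsilon-greedy selection over base station on/off patterns
    if random.random() < eps:
        action = random.randrange(N_ACTIONS)
    else:
        with torch.no_grad():
            action = int(q_net(state).argmax())
    next_state, reward = step_env(state, action)
    replay.append((state, action, reward, next_state))
    state = next_state

    if len(replay) >= 64:
        batch = random.sample(replay, 64)
        s = torch.stack([b[0] for b in batch])
        a = torch.tensor([b[1] for b in batch])
        r = torch.tensor([b[2] for b in batch], dtype=torch.float32)
        s2 = torch.stack([b[3] for b in batch])
        q = q_net(s).gather(1, a.unsqueeze(1)).squeeze(1)
        with torch.no_grad():
            target = r + gamma * target_net(s2).max(1).values
        loss = nn.functional.mse_loss(q, target)
        optimizer.zero_grad()
        loss.backward()
        optimizer.step()
    if t % 100 == 0:
        target_net.load_state_dict(q_net.state_dict())

In a setting closer to the paper, the placeholder environment would be driven by real traffic traces such as the open operator dataset mentioned above, and the forecast component of the state would come from the traffic prediction model rather than random values.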
