Layered Learning Radio Resource Management for Energy Harvesting Small Base Stations

Dense deployment of small base stations (SBSs) will play a crucial role in 5G cellular networks in satisfying the expected huge traffic demand. Dynamic ON/OFF switching of SBSs and the use of renewable energy sources have recently attracted increasing attention as ways to limit the energy consumption of such networks. In this paper, we present a layered learning solution for the radio resource management of dense cellular networks with SBSs powered solely by renewable energy. In the first layer, reinforcement learning agents locally select switch-ON/OFF policies for the SBSs according to the harvested energy income and the traffic demand. The second layer relies on an artificial neural network that estimates the network load conditions and implements a centralized controller enforcing the local agents' decisions. Simulation results show that the proposed layered framework outperforms both a greedy and a fully distributed solution in terms of throughput and energy efficiency.
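The two-layer scheme described above can be sketched in code. The following is a minimal illustrative sketch, not the paper's actual implementation: the first layer is a tabular Q-learning agent per SBS choosing ON/OFF from a discretized (battery, traffic) state, and the second layer is a centralized controller that overrides OFF decisions when the estimated load demands more active cells. All class names, parameters, and the threshold rule standing in for the paper's neural-network load estimator are assumptions made for illustration.

```python
import random

class QLearningSBS:
    """Layer 1 (sketch): tabular Q-learning agent choosing ON/OFF for one SBS.
    State = (battery-level bucket, traffic-level bucket); actions: 0 = OFF, 1 = ON."""

    def __init__(self, alpha=0.1, gamma=0.9, eps=0.1):
        self.q = {}                       # maps (state, action) -> Q-value
        self.alpha, self.gamma, self.eps = alpha, gamma, eps

    def act(self, state):
        # epsilon-greedy action selection
        if random.random() < self.eps:
            return random.choice((0, 1))
        return max((0, 1), key=lambda a: self.q.get((state, a), 0.0))

    def update(self, s, a, reward, s_next):
        # standard one-step Q-learning update
        best_next = max(self.q.get((s_next, b), 0.0) for b in (0, 1))
        old = self.q.get((s, a), 0.0)
        self.q[(s, a)] = old + self.alpha * (reward + self.gamma * best_next - old)


def central_controller(decisions, est_load, capacity_per_on_sbs=1.0):
    """Layer 2 (sketch): keep enough SBSs ON to serve the estimated load.
    A simple threshold rule stands in for the paper's ANN load estimator."""
    needed = int(est_load // capacity_per_on_sbs) + 1
    enforced = list(decisions)
    on_count = sum(enforced)
    for i in range(len(enforced)):
        if on_count >= needed:
            break
        if enforced[i] == 0:
            enforced[i] = 1               # override a local OFF decision
            on_count += 1
    return enforced
```

For example, if all three local agents decide OFF but the estimated load requires three active cells, `central_controller([0, 0, 0], est_load=2.5)` switches all three back ON. The reward signal for `update` would, in the paper's setting, combine served traffic and energy availability; how that reward is shaped is exactly what the first learning layer is about.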
