GreenTE.ai: Power-Aware Traffic Engineering via Deep Reinforcement Learning

Power-aware traffic engineering via coordinated sleeping is usually formulated as an Integer Programming problem, which is generally NP-hard and has unbounded computation time for large-scale networks. This leads to delayed control decisions in dynamic network environments. Motivated by advances in deep Reinforcement Learning (RL), we consider building intelligent systems that learn to adaptively change router/switch power states according to changing network conditions. A neural network's forward propagation can greatly speed up power on/off decision making. Conducting RL generally requires a learning agent to iteratively explore the environment and perform "good" actions based on its feedback. By coupling Software-Defined Networking, which applies centrally computed actions to the environment, with In-band Network Telemetry, which collects feedback from it, we develop GreenTE.ai, a closed-loop control/training system that automates power-aware traffic engineering. Furthermore, we propose novel techniques to enhance learning ability and reduce learning complexity. With both energy efficiency and traffic load balancing considered, GreenTE.ai can generate reasonable power-saving actions within 276 ms on a network testbed of 11 software P4 switches.
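The closed-loop structure described above (telemetry in, power on/off actions out, with a reward balancing energy saving against load balancing) can be illustrated with a minimal sketch of one control iteration. This is not the paper's implementation: the policy network shape, the helpers `collect_int_telemetry()` and `apply_power_states()`, and the reward weighting `alpha` are all illustrative stand-ins for the INT and SDN interfaces the abstract mentions.

```python
# Sketch of one closed-loop control step: observe -> forward pass -> act -> feedback.
# All names and dimensions are assumptions, not taken from the GreenTE.ai paper.
import torch
import torch.nn as nn

NUM_LINKS = 20          # assumed telemetry dimension (per-link utilization)
NUM_SWITCHES = 11       # matches the testbed size in the abstract

class PowerPolicy(nn.Module):
    """Maps the observed network state to independent on/off probabilities per switch."""
    def __init__(self, num_links: int, num_switches: int):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(num_links, 64), nn.ReLU(),
            nn.Linear(64, num_switches), nn.Sigmoid(),
        )

    def forward(self, state: torch.Tensor) -> torch.Tensor:
        return self.net(state)

def collect_int_telemetry() -> torch.Tensor:
    """Placeholder for INT-based feedback: per-link utilizations in [0, 1]."""
    return torch.rand(NUM_LINKS)

def apply_power_states(on_off: torch.Tensor) -> None:
    """Placeholder for installing the power on/off decisions via the SDN controller."""
    print("power states:", on_off.tolist())

def reward(utilizations: torch.Tensor, on_off: torch.Tensor, alpha: float = 0.5) -> float:
    """Trade off energy saving (more switches off) against load balancing
    (lower maximum link utilization), as the abstract describes."""
    energy_saving = 1.0 - on_off.float().mean()
    load_balance = 1.0 - utilizations.max()
    return (alpha * energy_saving + (1 - alpha) * load_balance).item()

policy = PowerPolicy(NUM_LINKS, NUM_SWITCHES)

state = collect_int_telemetry()
with torch.no_grad():
    probs = policy(state)           # fast forward propagation for decision making
action = torch.bernoulli(probs)     # sample an on/off decision per switch
apply_power_states(action)
next_state = collect_int_telemetry()
print("reward:", reward(next_state, action))
```

In a full system the (state, action, reward, next_state) tuple would be stored and used to update the policy, closing the training loop.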