Energy-efficient mobile computation offloading via online prefetching

Conventional mobile computation offloading relies on offline prefetching, which fetches user-specific data to the cloud prior to computing. For computation that depends on real-time inputs, this offline operation can fetch large volumes of redundant data over wireless channels and unnecessarily consume mobile-transmission energy. To address this issue, we propose the novel technique of online prefetching for large-scale programs with numerous tasks, which seamlessly integrates task-level computation prediction and real-time prefetching within the program runtime. The technique not only reduces mobile-energy consumption by avoiding excessive fetching but also shortens the program runtime through prediction-enabled parallel fetching and computing. By modeling the sequential task transitions in an offloaded program as a Markov chain, stochastic optimization is applied to design online-fetching policies that minimize mobile-energy consumption for transmitting fetched data over fading channels under a deadline constraint. The optimal policies for slow and fast fading are shown to share a similar threshold-based structure: candidates for the next task are selected by applying a threshold to their likelihoods, and these likelihoods further control the corresponding sizes of the prefetched data. In addition, computation prediction for online prefetching is shown theoretically to always achieve an energy reduction.
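To make the threshold-based structure concrete, the following Python sketch keeps only next-task candidates whose transition likelihoods exceed a threshold and splits a prefetching budget among them in proportion to likelihood. All names and values (TRANSITION_PROBS, select_prefetch_candidates, the 0.2 threshold, the bit budget) are hypothetical illustrations, and the proportional sizing is only a stand-in for the paper's optimized, channel-dependent data-size control.

```python
# Minimal sketch of threshold-based candidate selection for online
# prefetching over a Markov task-transition model. Names and numbers
# are hypothetical; this is not the paper's actual policy.

from typing import Dict

# Hypothetical transition probabilities from the current task to
# possible next tasks (one row of the Markov chain).
TRANSITION_PROBS: Dict[str, float] = {
    "task_B": 0.6,
    "task_C": 0.3,
    "task_D": 0.1,
}


def select_prefetch_candidates(probs: Dict[str, float],
                               threshold: float,
                               total_budget_bits: float) -> Dict[str, float]:
    """Keep next-task candidates whose likelihood exceeds the threshold,
    then split the prefetching budget among them in proportion to
    likelihood (a simple proxy for likelihood-dependent data sizing)."""
    candidates = {t: p for t, p in probs.items() if p >= threshold}
    if not candidates:
        return {}
    norm = sum(candidates.values())
    return {t: total_budget_bits * p / norm for t, p in candidates.items()}


if __name__ == "__main__":
    plan = select_prefetch_candidates(TRANSITION_PROBS, threshold=0.2,
                                      total_budget_bits=1e6)
    for task, bits in plan.items():
        print(f"prefetch {bits:.0f} bits of input data for {task}")
```

In this toy run, task_D falls below the threshold and is excluded, while task_B and task_C share the budget in a 2:1 ratio; the paper's policies instead derive the data sizes from the fading statistics and the deadline constraint.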
