Online Learning for IoT Optimization: A Frank–Wolfe Adam-Based Algorithm

Many problems in the Internet of Things (IoT) can be cast as online optimization problems. For this reason, this article considers an online constrained optimization problem in IoT, where the cost functions change over time. Projected online optimization algorithms are widely used to solve such problems; however, their projection steps become prohibitively expensive in problems involving high-dimensional parameters and massive data. To address this issue, we propose a projection-free online learning algorithm called Frank–Wolfe Adam (FWAdam), which uses the Frank–Wolfe method to eschew costly projection operations. Furthermore, we give the first convergence analysis of the FWAdam algorithm, proving a regret bound of <inline-formula> <tex-math notation="LaTeX">$O(T^{3/4})$ </tex-math></inline-formula> when the cost functions are convex, where <inline-formula> <tex-math notation="LaTeX">$T$ </tex-math></inline-formula> is the time horizon. Finally, we present simulation experiments on two data sets to validate our theoretical results.
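To illustrate the idea of replacing the projection step with a Frank–Wolfe linear minimization oracle inside an Adam-style update, the following is a minimal sketch. It is not the paper's exact FWAdam update rules: the constraint set (an ℓ1 ball), the step-size schedule, and the combination of moment estimates with the Frank–Wolfe direction are illustrative assumptions.

```python
import numpy as np

def lmo_l1_ball(g, radius=1.0):
    """Linear minimization oracle over the l1 ball:
    argmin_{||s||_1 <= radius} <g, s> puts all mass on the
    coordinate of g with largest magnitude, with opposite sign.
    This closed form is why no projection is ever needed."""
    s = np.zeros_like(g)
    i = np.argmax(np.abs(g))
    s[i] = -radius * np.sign(g[i])
    return s

def fwadam_sketch(grad_fn, x0, T, beta1=0.9, beta2=0.999,
                  eps=1e-8, radius=1.0):
    """Illustrative projection-free online update combining Adam-style
    moment estimates with a Frank-Wolfe step (a sketch, not the
    paper's exact algorithm)."""
    x = x0.copy()
    m = np.zeros_like(x)   # first-moment estimate
    v = np.zeros_like(x)   # second-moment estimate
    for t in range(1, T + 1):
        g = grad_fn(x, t)                 # gradient of the round-t cost
        m = beta1 * m + (1 - beta1) * g
        v = beta2 * v + (1 - beta2) * g * g
        d = m / (np.sqrt(v) + eps)        # adaptive search direction
        s = lmo_l1_ball(d, radius)        # FW vertex: no projection
        gamma = 1.0 / t ** 0.75           # decaying step size
        x = (1 - gamma) * x + gamma * s   # convex combination stays feasible
    return x
```

Because each iterate is a convex combination of the previous iterate and a vertex of the constraint set, feasibility is maintained by construction; the per-round cost is a single linear minimization rather than a (possibly expensive) Euclidean projection.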
