Trustworthy and Context-Aware Distributed Online Learning With Autoscaling for Content Caching in Collaborative Mobile Edge Computing
Content caching is widely recognized as a promising functionality for improving service performance in mobile edge computing (MEC). In the big data era, mobile devices collect massive volumes of heterogeneous content belonging to different users with specific contexts (e.g., hobbies, environment, and age). However, local content caching without prior knowledge of content popularity and context information is not sufficiently accurate. In particular, caching multiple large-scale contents in the local database puts heavy pressure on the content-selection process. To address these issues, we propose a context-aware distributed online learning algorithm for efficient content caching in collaborative MEC, built on a novel tree-based contextual multi-armed bandit formulation. To guarantee trustworthy collaboration, we introduce a trust evaluation factor that identifies reliable neighboring edge nodes (ENs). Moreover, our system extracts users' contextual information into a context space and builds a content cover tree to maximize the caching hit rate and satisfy users' demands. Simulation results on a real-world dataset indicate that our proposal achieves a balance between caching hit rate and time cost and attains a sublinear bound on cumulative regret, which verifies its superior caching-hit performance compared with related algorithms.
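To make the contextual-bandit caching idea concrete, the following is a minimal sketch, not the paper's cover-tree algorithm: it discretizes a normalized context vector into cells and applies a UCB rule per cell to pick which content to cache, with hits as rewards. All class and parameter names (e.g., `ContextualCachingBandit`, `num_cells`) are hypothetical and only illustrate the general technique under simplifying assumptions.

```python
import math
import random
from collections import defaultdict

# Illustrative sketch (assumed simplification): a context-aware UCB bandit for
# content caching. Each "arm" is a candidate content; the user context
# (features scaled to [0,1]) is discretized into cells, loosely mimicking a
# context-space partition rather than the paper's actual cover tree.
class ContextualCachingBandit:
    def __init__(self, contents, num_cells=4):
        self.contents = contents           # candidate content IDs (arms)
        self.num_cells = num_cells         # per-dimension context partitions
        self.counts = defaultdict(int)     # (cell, content) -> selections
        self.rewards = defaultdict(float)  # (cell, content) -> cumulative hits
        self.t = 0

    def _cell(self, context):
        # Map a context vector in [0,1]^d to a discrete partition cell.
        return tuple(min(int(x * self.num_cells), self.num_cells - 1)
                     for x in context)

    def select(self, context):
        # UCB rule: choose the content with the highest optimistic hit-rate
        # estimate for this context cell; unexplored contents are tried first.
        self.t += 1
        cell = self._cell(context)
        best, best_score = None, -1.0
        for c in self.contents:
            n = self.counts[(cell, c)]
            if n == 0:
                return c
            mean = self.rewards[(cell, c)] / n
            score = mean + math.sqrt(2 * math.log(self.t) / n)
            if score > best_score:
                best, best_score = c, score
        return best

    def update(self, context, content, hit):
        # hit = 1 if the cached content was actually requested, else 0.
        cell = self._cell(context)
        self.counts[(cell, content)] += 1
        self.rewards[(cell, content)] += hit


if __name__ == "__main__":
    # Toy usage: 10 candidate contents, 2-dimensional user context.
    bandit = ContextualCachingBandit(contents=list(range(10)))
    for _ in range(1000):
        ctx = [random.random(), random.random()]
        chosen = bandit.select(ctx)
        # Synthetic feedback: the preferred content depends on the first
        # context feature, so context-aware selection raises the hit rate.
        hit = 1 if chosen == int(ctx[0] * 10) and random.random() < 0.9 else 0
        bandit.update(ctx, chosen, hit)
```

In this sketch the fixed grid plays the role that the adaptive cover tree plays in the paper: it groups similar contexts so that hit statistics are learned per region, which is what yields the sublinear-regret behavior the abstract refers to.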