Please use this identifier to cite or link to this item: https://dspace.iiti.ac.in/handle/123456789/11044
Title: Algorithms for mobile edge caching in future wireless networks
Authors: Krishnendu S
Supervisors: Bhatia, Vimal
Keywords: Electrical Engineering
Issue Date: 31-Oct-2022
Publisher: Department of Electrical Engineering, IIT Indore
Series/Report no.: TH471
Abstract: This thesis explores one of the key enablers of 5G wireless networks leveraging small cell network deployment, namely mobile edge caching. Endowed with predictive capabilities and harnessing recent developments in storage, context-awareness and social networks, peak traffic demands can be substantially reduced by caching at the edge of the network. The surge in Internet usage through smartphones, social media platforms and online video streaming has heralded explosive growth in the amount of data being created. Owing to the rapid development of communication-based applications, it is expected that there will be 5.3 billion total Internet users (66 percent of the global population) by 2023, up from 3.9 billion (51 percent of the global population) in 2018. This phenomenon has urged mobile operators to redesign their current networks and to seek more advanced and sophisticated techniques to increase coverage, boost network capacity, and cost-effectively bring content closer to users.

A promising approach to meeting these unprecedented traffic demands is the deployment of small cell networks (SCNs). SCNs represent a novel networking paradigm based on deploying short-range, low-power, and low-cost small base stations (SBSs) underlaying the macro cellular network. In addition to the vast and dynamic mobile data being generated, the limited spectrum, especially on the wireless link, strained by the extensive use of smartphones and other devices, ultimately leads to congestion in the backhaul links. Dense deployment of SBSs does reduce latency and provide immense throughput in 5G and beyond networks, but the backhaul bottleneck remains. Thus, the existing small cell networking paradigm falls short of meeting peak traffic demands, and its large-scale deployment hinges on expensive site acquisition, installation and backhaul costs. These shortcomings are set to become increasingly acute owing to the surging number of connected devices and the advent of ultra-dense networks, which will continue to strain current cellular network infrastructures.

These observations mandate a novel networking paradigm that goes beyond current heterogeneous small cell deployments by leveraging the latest developments in storage, context-awareness, and social networking. This paradigm, in which data is stored locally at the edge of the network, is referred to as mobile edge caching. Mobile edge caching exploits the vast available data and compensates for the shortage of local computing capacity and the high transmission cost of relying on cloud computing alone. Thus, the need of the hour is a shift from large-scale cloud data centres to a wide range of edge devices. Edge caching and edge computation together allow data to be stored and learning algorithms to be run at the edge, making the edge intelligent and further reducing the burden on the backhaul. Standard simplified algorithms such as least frequently used (LFU), least recently used (LRU), least recently/frequently used (LRFU) and other variants can be inefficient in dynamic environments, since they do not take into account the correlation and non-stationarity of the demand requests. Therefore, a gradual shift towards learning-based optimization at edge devices for content prediction is observed. This in turn yields significant gains in network resources, minimizing operational and capital expenditures.
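As a point of reference for the baseline policies named above, the following minimal sketch (in Python, not taken from the thesis) shows how LRU and LFU eviction can be implemented for a fixed-capacity edge cache; the class names are illustrative only.

from collections import OrderedDict, Counter

class LRUCache:
    """Least recently used: evict the content that has gone longest without a request."""
    def __init__(self, capacity):
        self.capacity = capacity
        self.items = OrderedDict()          # iteration order tracks recency

    def get(self, key):
        if key not in self.items:
            return None                     # cache miss: content fetched over the backhaul
        self.items.move_to_end(key)         # mark as most recently used
        return self.items[key]

    def put(self, key, value):
        if key in self.items:
            self.items.move_to_end(key)
        self.items[key] = value
        if len(self.items) > self.capacity:
            self.items.popitem(last=False)  # evict the least recently used content

class LFUCache:
    """Least frequently used: evict the content with the fewest recorded requests."""
    def __init__(self, capacity):
        self.capacity = capacity
        self.items = {}
        self.freq = Counter()

    def get(self, key):
        if key not in self.items:
            return None
        self.freq[key] += 1
        return self.items[key]

    def put(self, key, value):
        if key not in self.items and len(self.items) >= self.capacity:
            victim = min(self.freq, key=self.freq.get)   # content with the fewest requests
            del self.items[victim]
            del self.freq[victim]
        self.items[key] = value
        self.freq[key] += 1

Both policies react only to past requests, so when content popularity shifts abruptly the cache remains filled with stale items; this is the non-stationarity that motivates the learning-based approaches developed in the thesis.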
To strike a balance between increasing mobile traffic and user experience, mobile edge caching and computing can be seen as a viable solution, bringing storage and computation close to the edge. Further, it is observed that users who have similar interests and backgrounds tend to rank content in a similar way. In a realistic network, however, content popularity tends to be dynamic, which motivates dynamic caching. Thus, borrowing tools from probability theory and machine learning, we design algorithms that take advantage of the memory distributed across the network. The proposed network architecture involves caching popular data sets closer to users, which yields higher user satisfaction, high backhaul offloading gains, and more revenue for operators. In the coming years, mobile edge caching is seen as a potential alternative for reducing congestion in the backhaul. Caching content at the edge of the network not only brings data closer to the user and makes content access easier, but also gives the network service provider an opportunity to fulfil user demands with limited resources. Recently, much work has focused on mobile caching strategies owing to their fast response.
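Purely as an illustration of such a dynamic, prediction-driven cache (and not the specific algorithms proposed in the thesis), the sketch below tracks an exponentially weighted estimate of each content's request rate and keeps the predicted top-C contents at the edge; the class name and the decay parameter are hypothetical choices.

import heapq
from collections import defaultdict

class PopularityPredictingCache:
    """Cache the C contents with the highest estimated (time-varying) popularity."""
    def __init__(self, capacity, decay=0.9):
        self.capacity = capacity
        self.decay = decay                  # forgetting factor for non-stationary demand
        self.score = defaultdict(float)     # per-content popularity estimate
        self.cached = set()

    def observe_request(self, content_id):
        hit = content_id in self.cached     # was the content already at the edge?
        # Let old popularity fade, then credit the requested content.
        for cid in self.score:
            self.score[cid] *= self.decay
        self.score[content_id] += 1.0
        # Re-select the contents with the highest predicted popularity.
        top = heapq.nlargest(self.capacity, self.score.items(), key=lambda kv: kv[1])
        self.cached = {cid for cid, _ in top}
        return hit

For example, with capacity 2 and the request stream "a, a, b, c, c, c", the content "c" displaces "b" from the cached set as soon as its decayed popularity estimate overtakes b's, so the cache follows the change in demand rather than only its history.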
URI: https://dspace.iiti.ac.in/handle/123456789/11044
Type of Material: Thesis_Ph.D
Appears in Collections: Department of Electrical Engineering_ETD

Files in This Item:
TH471_Krishnendu_S_701102001.pdf (1.97 MB, Adobe PDF)


