Comparison of Caching Policies in Wireless Networks
Published in | 2018 International Conference on Emerging Trends and Innovations In Engineering And Technological Research (ICETIETR) pp. 1 - 3 |
---|---|
Main Authors | , , |
Format | Conference Proceeding |
Language | English |
Published | IEEE, 01.07.2018 |
---|---|
Summary: | We consider a server containing a finite number of files, connected to multiple users through a shared link. Each user has a finite cache memory and follows a caching policy. User requests follow the Independent Reference Model (IRM). A request for a file is serviced locally on a cache hit; if the file is not present in the cache, the request is forwarded to the server. The server maintains a separate request queue for each user, unlike the traditional single queue at the server. Users follow a caching policy such as Random, FIFO, or simple LRU, and these policies are compared for different cache memory sizes, using the average queuing delay of requests at the server as the performance metric. We introduce the probabilistic Least Recently Used (p-LRU) cache policy, which uses the probability of a file in the catalog being requested as a parameter of its insertion policy, and compare it with the other cache policies. In the p-LRU implementation, we draw samples from a Zipf distribution to define the popularity profile of each file, and the performance comparison is made for different user request patterns. Finally, we relate the hit rate averaged over all users to the average queuing delay for all the cache policies discussed. |
---|---|
DOI: | 10.1109/ICETIETR.2018.8529091 |
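The abstract describes p-LRU only at a high level, so the following is a minimal sketch under stated assumptions: file popularities follow a Zipf profile, requests are drawn i.i.d. per the IRM, eviction is standard LRU, and (the assumed part, since the paper's exact rule is not given here) a missed file is admitted to the cache with probability proportional to its catalog request probability. The class name `PLRUCache` and the admission rule are hypothetical, not taken from the paper.

```python
import random
from collections import OrderedDict

def zipf_popularity(n_files, alpha=0.8):
    """Zipf popularity profile: p_i proportional to 1 / i^alpha, normalized."""
    weights = [1.0 / (i + 1) ** alpha for i in range(n_files)]
    total = sum(weights)
    return [w / total for w in weights]

class PLRUCache:
    """Hypothetical p-LRU sketch: LRU eviction order, but on a miss the file
    is admitted only with probability scaled by its catalog popularity."""

    def __init__(self, capacity, popularity):
        self.capacity = capacity
        # Scale popularities to [0, 1] so the most popular file is always admitted.
        top = max(popularity)
        self.admit_prob = [p / top for p in popularity]
        self.cache = OrderedDict()   # keys in LRU -> MRU order
        self.hits = 0
        self.requests = 0

    def request(self, f):
        self.requests += 1
        if f in self.cache:
            self.cache.move_to_end(f)  # refresh recency on a hit
            self.hits += 1
            return True
        # Miss: probabilistic insertion keyed on the file's request probability.
        if random.random() < self.admit_prob[f]:
            if len(self.cache) >= self.capacity:
                self.cache.popitem(last=False)  # evict least recently used
            self.cache[f] = True
        return False

    def hit_rate(self):
        return self.hits / self.requests if self.requests else 0.0

random.seed(0)
pop = zipf_popularity(n_files=100, alpha=0.8)
cache = PLRUCache(capacity=10, popularity=pop)
# IRM request stream: independent draws from the fixed popularity profile.
stream = random.choices(range(100), weights=pop, k=20000)
for f in stream:
    cache.request(f)
print(f"p-LRU hit rate: {cache.hit_rate():.3f}")
```

Biasing insertion toward popular files keeps rarely requested files from displacing hot ones, which is why such a policy can outperform plain LRU under a skewed (Zipf) profile; the hit rate obtained this way is what the paper relates to the average queuing delay at the server.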