Cost-aware Joint Caching and Forwarding in Networks with Heterogeneous Cache Resources

Bibliographic Details
Main Authors: Mutlu, Faruk Volkan; Yeh, Edmund
Format: Journal Article
Language: English
Published: 11.10.2023
DOI: 10.48550/arxiv.2310.07243

Summary: Caching is crucial for enabling high-throughput networks for data-intensive applications. Traditional caching technology relies on DRAM, as it can transfer data at a high rate. However, DRAM capacity is subject to contention from most system components and is thus very limited, implying that DRAM-only caches cannot scale to meet growing demand. Fortunately, persistent memory and flash storage technologies are rapidly evolving and can be used alongside DRAM to increase cache capacities. Doing so without compromising network performance requires caching techniques adapted to the characteristics of these technologies. In this paper, we model the cache as a collection of storage blocks with different rate parameters and utilization costs. We introduce an optimization technique based on the drift-plus-penalty method and apply it within a framework that enables joint caching and forwarding. We show that it achieves an optimal trade-off between throughput and cache utilization cost in a virtual control plane. We then develop a corresponding practical policy in the data plane. Finally, through simulations in several settings, we demonstrate the superior performance of our proposed approach with respect to total user delay and cache utilization cost.
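
The sketch below illustrates, under stated assumptions, the kind of per-node caching decision a drift-plus-penalty approach suggests for heterogeneous storage blocks: each candidate placement is scored as (virtual backlog x block rate) minus a penalty weight V times the block's utilization cost, and placements are filled greedily. The class names, capacities, and parameter values are illustrative assumptions, not the paper's exact formulation.

    # Hypothetical drift-plus-penalty caching sketch for one node with
    # heterogeneous storage blocks (DRAM, flash, ...). Scores and the greedy
    # placement rule are assumptions for illustration, not the paper's policy.
    from dataclasses import dataclass

    @dataclass
    class StorageBlock:
        name: str
        capacity: int   # number of objects the block can hold
        rate: float     # relative service rate when reading from this block
        cost: float     # utilization cost per cached object

    def caching_decision(vip_counts, blocks, V):
        """Greedily place objects into blocks to favor large values of
        vip * rate - V * cost, i.e. backlog-drift reduction minus V times
        the utilization cost (the drift-plus-penalty trade-off)."""
        # Score every (object, block) pairing; keep only pairs worth caching.
        scores = []
        for obj, vip in vip_counts.items():
            for blk in blocks:
                score = vip * blk.rate - V * blk.cost
                if score > 0:
                    scores.append((score, obj, blk))
        scores.sort(reverse=True, key=lambda s: s[0])

        remaining = {blk.name: blk.capacity for blk in blocks}
        placement = {}  # object -> block name
        for score, obj, blk in scores:
            if obj in placement or remaining[blk.name] == 0:
                continue
            placement[obj] = blk.name
            remaining[blk.name] -= 1
        return placement

    # Example: DRAM is fast but costly to occupy; flash is slower but cheap.
    blocks = [StorageBlock("dram", capacity=2, rate=10.0, cost=4.0),
              StorageBlock("flash", capacity=4, rate=2.0, cost=0.5)]
    vip_counts = {"objA": 9.0, "objB": 5.0, "objC": 1.0}
    print(caching_decision(vip_counts, blocks, V=1.0))

Raising the penalty weight V shifts objects out of the costly fast block and toward cheaper storage (or out of the cache entirely), which mirrors the throughput-versus-utilization-cost trade-off the abstract attributes to the virtual control plane.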