Caching refers to the temporary storage of frequently accessed or recently used data in a location that allows for quicker retrieval when the same data is requested again. Caching improves the performance and efficiency of systems by reducing the need to fetch data from its original source, which may involve more time-consuming operations.

Examples of caching in everyday use include web browsers caching images and scripts, content delivery networks caching static website content, and database query caching in server applications. Caching is a fundamental optimization technique used across various domains to deliver faster and more responsive systems.
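The mechanism described above can be sketched with a minimal in-memory cache using least-recently-used (LRU) eviction, one of the most common replacement policies. This is an illustrative sketch, not the policy of any specific system mentioned here; the class name and capacity parameter are invented for the example.

```python
from collections import OrderedDict

class LRUCache:
    """Minimal in-memory cache with least-recently-used eviction."""

    def __init__(self, capacity: int):
        self.capacity = capacity
        self.store = OrderedDict()  # keys ordered from least to most recently used

    def get(self, key):
        # A hit promotes the entry to the most-recently-used position.
        if key not in self.store:
            return None  # miss: the caller must fetch from the slower source
        self.store.move_to_end(key)
        return self.store[key]

    def put(self, key, value):
        if key in self.store:
            self.store.move_to_end(key)
        self.store[key] = value
        # Evict the least-recently-used entry when over capacity.
        if len(self.store) > self.capacity:
            self.store.popitem(last=False)
```

For example, with a capacity of two, inserting `a` and `b`, reading `a`, then inserting `c` evicts `b`, since `b` is the least recently used entry at that point.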


Austere Flash Caching with Deduplication and Compression

Modern storage systems leverage flash caching to boost I/O performance, but enhancing the space efficiency and endurance of flash caching remains a critical yet challenging issue in the face of ever-growing data-intensive workloads. Deduplication and compression are promising data reduction techniques that save storage and I/O by removing duplicate content, yet they also incur substantial memory overhead for index management. We propose AustereCache, a new flash caching design that aims for memory-efficient indexing while preserving the data reduction benefits of deduplication and compression. AustereCache emphasizes austere cache management and introduces core techniques for efficient data organization and cache replacement, so as to eliminate as much indexing metadata as possible and make lightweight in-memory index structures viable. Trace-driven experiments show that our AustereCache prototype saves 69.9-97.0% of memory usage compared to the state-of-the-art flash caching design that supports deduplication and compression, while maintaining comparable read hit ratios and write reduction ratios and achieving high I/O throughput.
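To make the data reduction idea concrete, the following is a minimal sketch of a deduplicating, compressing write path: chunks are fingerprinted by a cryptographic hash, duplicates are stored only once, and unique chunks are compressed before caching. This illustrates the general technique only; it is not AustereCache's design, and the class and method names are invented for the example.

```python
import hashlib
import zlib

class DedupCompressStore:
    """Illustrative sketch of deduplication + compression on the write path.

    The in-memory index maps a chunk's fingerprint to its compressed bytes;
    in a real flash cache the value would be a location on the flash device,
    and the index's memory footprint is exactly the overhead that
    memory-efficient designs try to reduce.
    """

    def __init__(self):
        self.index = {}  # fingerprint -> compressed chunk

    def write(self, chunk: bytes) -> str:
        # Fingerprint the chunk; identical content yields an identical key.
        fp = hashlib.sha256(chunk).hexdigest()
        if fp not in self.index:
            # Only unique chunks are compressed and stored.
            self.index[fp] = zlib.compress(chunk)
        return fp  # the caller keeps the fingerprint as the chunk's address

    def read(self, fp: str) -> bytes:
        return zlib.decompress(self.index[fp])
```

Writing the same chunk twice produces one stored copy, so duplicate-heavy workloads consume far less cache space; the trade-off the paper targets is the memory cost of keeping such an index for every cached chunk.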