Austere Flash Caching with Deduplication and Compression

Publication Date: July 15, 2020

Event: 2020 USENIX Annual Technical Conference (USENIX ATC ’20)

Reference: pp. 713-726, 2020

Authors: Qiuping Wang, The Chinese University of Hong Kong; Jinhong Li, The Chinese University of Hong Kong; Wen Xia, NEC Laboratories America, Inc., Harbin Institute of Technology, Shenzhen; Erik Kruus, NEC Laboratories America, Inc.; Biplob Debnath, NEC Laboratories America, Inc.; Patrick P.C. Lee, The Chinese University of Hong Kong

Abstract: Modern storage systems leverage flash caching to boost I/O performance, and enhancing the space efficiency and endurance of flash caching remains a critical yet challenging issue in the face of ever-growing data-intensive workloads. Deduplication and compression are promising data reduction techniques for storage and I/O savings via the removal of duplicate content, yet they also incur substantial memory overhead for index management. We propose AustereCache, a new flash caching design that aims for memory-efficient indexing, while preserving the data reduction benefits of deduplication and compression. AustereCache emphasizes austere cache management and proposes different core techniques for efficient data organization and cache replacement, so as to eliminate as much indexing metadata as possible and make lightweight in-memory index structures viable. Trace-driven experiments show that our AustereCache prototype saves 69.9-97.0% of memory usage compared to the state-of-the-art flash caching design that supports deduplication and compression, while maintaining comparable read hit ratios and write reduction ratios and achieving high I/O throughput.
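To illustrate the kind of memory-lean indexing the abstract describes, below is a minimal, hypothetical sketch (not AustereCache's actual implementation) of a bucketized deduplication index: chunks are fingerprinted, only a short fingerprint prefix is kept in memory per slot, and the bucket/slot position implicitly encodes the flash location, so no per-chunk pointers or full fingerprints live in RAM. The class name, parameters, and FIFO eviction are illustrative assumptions.

```python
import hashlib


class BucketizedDedupIndex:
    """Illustrative sketch of a memory-efficient dedup index
    (hypothetical; not the paper's actual design). Only a short
    prefix of each chunk fingerprint is stored in memory."""

    def __init__(self, num_buckets=1024, slots_per_bucket=4, prefix_bytes=2):
        self.num_buckets = num_buckets
        self.slots = slots_per_bucket
        self.prefix_bytes = prefix_bytes
        # Each bucket holds up to `slots` fingerprint prefixes; the
        # (bucket, slot) position stands in for a flash location.
        self.buckets = [[] for _ in range(num_buckets)]

    def _fingerprint(self, chunk: bytes) -> bytes:
        # Full cryptographic fingerprint, computed on demand;
        # only a prefix of it is retained in memory.
        return hashlib.sha1(chunk).digest()

    def insert(self, chunk: bytes) -> bool:
        """Return True if the chunk is new (flash write needed),
        False if it deduplicates against an existing entry."""
        fp = self._fingerprint(chunk)
        # First 4 bytes of the fingerprint pick the bucket ...
        bucket = int.from_bytes(fp[:4], "big") % self.num_buckets
        # ... and the next `prefix_bytes` bytes are all we keep in RAM.
        prefix = fp[4:4 + self.prefix_bytes]
        if prefix in self.buckets[bucket]:
            return False  # duplicate content: no flash write
        if len(self.buckets[bucket]) >= self.slots:
            # Evict the oldest slot (FIFO here for simplicity; a real
            # design would use a smarter replacement policy).
            self.buckets[bucket].pop(0)
        self.buckets[bucket].append(prefix)
        return True
```

Storing only a 2-byte prefix per slot (versus a full 20-byte fingerprint plus a flash pointer) is what keeps the in-memory footprint small; the trade-off is a small chance of prefix collisions, which a real system would resolve by verifying the full fingerprint stored on flash.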

Publication Link: