VESSELS: Efficient and Scalable Deep Learning Prediction on Trusted Processors

Publication Date: 10/21/2020

Event: ACM Symposium on Cloud Computing 2020 (SoCC 2020)

Reference: 1-15, 2020

Authors: Kyungtae Kim, Purdue University; Chung Hwan Kim, NEC Laboratories America, Inc.; Junghwan Rhee, NEC Laboratories America, Inc.; Xiao Yu, NEC Laboratories America, Inc.; Haifeng Chen, NEC Laboratories America, Inc.; Dave (Jing) Tian, Purdue University; Byoungyoung Lee, Seoul National University

Abstract: Deep learning systems on the cloud are increasingly targeted by attacks that attempt to steal sensitive data. Intel SGX has proven effective in protecting the confidentiality and integrity of such data during computation. However, state-of-the-art SGX systems still suffer from substantial performance overhead induced by the limited physical memory of SGX. This limitation significantly undermines the usability of deep learning systems due to their memory-intensive characteristics. In this paper, we provide a systematic study of the inefficiency of existing SGX systems for deep learning prediction, with a focus on their memory usage. Our study has revealed two causes of the inefficiency in the current memory usage paradigm: large memory allocation and low memory reusability. Based on this insight, we present Vessels, a new system that addresses the inefficiency and overcomes the limitation on SGX memory through memory usage optimization techniques. Vessels identifies the memory allocation and usage patterns of a deep learning program through model analysis and creates a trusted execution environment with an optimized memory pool, which minimizes the memory footprint with high memory reusability. Our experiments demonstrate that, by significantly reducing the memory footprint and carefully scheduling the workloads, Vessels can achieve highly efficient and scalable deep learning prediction while providing strong data confidentiality and integrity with SGX.
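To illustrate the "small footprint, high reusability" idea described in the abstract, the following is a minimal conceptual sketch of a layer-buffer memory pool, not the authors' actual Vessels implementation. The pool class, its API, and the per-layer sizes are illustrative assumptions; the point is that buffers freed by earlier layers are reused by later ones, keeping the peak allocation well below the sum of all layer outputs, which is what matters inside the limited SGX enclave memory.

```cpp
// Conceptual sketch (not the Vessels implementation): a memory pool that
// hands out per-layer buffers and reuses freed ones during sequential
// layer-by-layer inference. All sizes and names are illustrative.
#include <cstddef>
#include <cstdio>
#include <map>
#include <vector>

class LayerBufferPool {
public:
    // Acquire a buffer of at least `bytes`; reuse a free one if possible.
    void* acquire(size_t bytes) {
        auto it = free_.lower_bound(bytes);
        if (it != free_.end()) {
            void* p = it->second;
            free_.erase(it);          // reused: no new allocation
            return p;
        }
        buffers_.emplace_back(bytes); // grow the pool only when needed
        void* p = buffers_.back().data();
        capacity_[p] = bytes;
        return p;
    }

    // Return a buffer so a later layer can reuse it.
    void release(void* p) { free_.emplace(capacity_.at(p), p); }

    // Total bytes ever allocated by the pool (the peak footprint).
    size_t footprint() const {
        size_t total = 0;
        for (const auto& b : buffers_) total += b.size();
        return total;
    }

private:
    std::vector<std::vector<char>> buffers_;  // backing storage
    std::map<void*, size_t> capacity_;        // true size of each buffer
    std::multimap<size_t, void*> free_;       // free buffers keyed by size
};

int main() {
    // Hypothetical per-layer output sizes (bytes) of a small model.
    const size_t layer_out[] = {4u << 20, 2u << 20, 2u << 20, 1u << 20};

    LayerBufferPool pool;
    void* prev = nullptr;
    for (size_t sz : layer_out) {
        void* out = pool.acquire(sz);   // buffer for this layer's output
        if (prev) pool.release(prev);   // the layer's input is no longer needed
        prev = out;
    }
    std::printf("peak pool footprint: %zu bytes\n", pool.footprint());
    return 0;
}
```

In this sketch only two buffers are ever live at once (a layer's input and output), so the peak footprint stays near the two largest layers rather than the total of all layers; the paper's model analysis and workload scheduling pursue the same goal under SGX's memory constraints.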

Publication Link: https://dl.acm.org/doi/10.1145/3419111.3421282