CAT: Beyond Efficient Transformer for Content-Aware Anomaly Detection in Event Sequences
Publication Date: August 18, 2022
Event: 28th ACM SIGKDD Conference on Knowledge Discovery and Data Mining
Reference: pp. 4541–4550, 2022
Authors: Shengming Zhang, Rutgers University; Yanchi Liu, NEC Laboratories America, Inc.; Xuchao Zhang, NEC Laboratories America, Inc.; Wei Cheng, NEC Laboratories America, Inc.; Haifeng Chen, NEC Laboratories America, Inc.; Hui Xiong, Rutgers University
Abstract: It is critical to detect anomalies in event sequences, which have become widely available in many application domains. Indeed, various efforts have been made to capture abnormal patterns from event sequences through sequential pattern analysis or event representation learning. However, existing approaches usually ignore the semantic information of event content. To this end, in this paper, we propose a self-attentive encoder-decoder transformer framework, Content-Aware Transformer (CAT), for anomaly detection in event sequences. In CAT, the encoder learns preamble event sequence representations with content awareness, and the decoder embeds sequences under detection into a latent space where anomalies are distinguishable. Specifically, the event content is first fed to a content-awareness layer, which generates a representation of each event. The encoder accepts the preamble event representation sequence and generates feature maps. In the decoder, an additional token is prepended to the sequence under detection, denoting the sequence status. A one-class objective and a sequence reconstruction loss are jointly applied to train our framework under a label-efficient scheme. Furthermore, CAT is optimized under a scalable and efficient setting. Finally, extensive experiments on three real-world datasets demonstrate the superiority of CAT.
Publication Link: https://dl.acm.org/doi/10.1145/3534678.3539155
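Illustrative Sketch: The abstract describes the architecture only at a high level. Below is a minimal, unofficial PyTorch sketch of the pipeline it names (a content-awareness layer, an encoder over the preamble sequence, a decoder with a prepended status token, and a combined one-class plus reconstruction objective). All names here (ContentAwareLayer, CATSketch, status_token, cat_loss, the center vector) and design details such as mean-pooling of content tokens are illustrative assumptions, not the authors' released implementation.

```python
# Hypothetical sketch of a CAT-style content-aware encoder-decoder.
# Module names and pooling/loss details are assumptions for illustration.
import torch
import torch.nn as nn

class ContentAwareLayer(nn.Module):
    """Embed each event's content tokens and pool them into one vector."""
    def __init__(self, vocab_size: int, d_model: int):
        super().__init__()
        self.tok_emb = nn.Embedding(vocab_size, d_model, padding_idx=0)

    def forward(self, content_ids: torch.Tensor) -> torch.Tensor:
        # content_ids: (batch, seq_len, tokens_per_event)
        tok = self.tok_emb(content_ids)          # (B, L, T, d)
        return tok.mean(dim=2)                   # (B, L, d): one vector per event

class CATSketch(nn.Module):
    def __init__(self, vocab_size=10000, d_model=128, nhead=4, num_layers=2):
        super().__init__()
        self.content = ContentAwareLayer(vocab_size, d_model)
        enc_layer = nn.TransformerEncoderLayer(d_model, nhead, batch_first=True)
        dec_layer = nn.TransformerDecoderLayer(d_model, nhead, batch_first=True)
        self.encoder = nn.TransformerEncoder(enc_layer, num_layers)
        self.decoder = nn.TransformerDecoder(dec_layer, num_layers)
        # Learnable token prepended to the sequence under detection;
        # its output embedding summarizes the sequence status.
        self.status_token = nn.Parameter(torch.randn(1, 1, d_model))
        self.recon_head = nn.Linear(d_model, d_model)

    def forward(self, preamble_ids, detect_ids):
        memory = self.encoder(self.content(preamble_ids))   # encoder feature maps
        detect = self.content(detect_ids)                   # (B, L, d)
        status = self.status_token.expand(detect.size(0), -1, -1)
        out = self.decoder(torch.cat([status, detect], dim=1), memory)
        # Return status embedding, reconstructed sequence, reconstruction target.
        return out[:, 0], self.recon_head(out[:, 1:]), detect

def cat_loss(status_emb, recon, target, center, alpha=1.0):
    """One-class distance to a fixed center plus sequence reconstruction error."""
    one_class = ((status_emb - center) ** 2).sum(dim=1).mean()
    recon_err = nn.functional.mse_loss(recon, target)
    return one_class + alpha * recon_err
```

In a Deep SVDD-style one-class setup, `center` would typically be a fixed vector (e.g., the mean of initial status embeddings over normal training sequences), and a sequence whose status embedding lands far from the center at test time would be flagged as anomalous; the paper's exact objective and scoring rule may differ.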