Publication Date: 9/23/2022
Event: ECML PKDD 2022 – European Conference on Machine Learning and Principles and Practice of Knowledge Discovery in Databases, Grenoble, France
Reference: pp. 155-171, 2022
Authors: Junheng Hao, University of California, Los Angeles; Lu-An Tang, NEC Laboratories America, Inc.; Zhengzhang Chen, NEC Laboratories America, Inc.; Yizhou Sun, University of California, Los Angeles; Junghwan Rhee, University of Central Oklahoma; Zhichun Li, Stellar Cyber; Wei Wang, University of California, Los Angeles
Abstract: Large-scale information systems, such as knowledge graphs (KGs) and enterprise system networks, often exhibit dynamic and complex activities. Recent research has shown that formalizing these information systems as graphs can effectively characterize the entities (nodes) and their relationships (edges). Transferring knowledge from existing, well-curated source graphs can help construct the target graph of a newly deployed system faster and better, which in turn benefits downstream tasks such as link prediction and anomaly detection for the new system. However, current graph transfer methods either rely on a single source, which does not sufficiently exploit the multiple available sources, or do not learn selectively from those sources. In this paper, we propose MSGT-GNN, a graph knowledge transfer model for efficient graph link prediction from multiple source graphs. MSGT-GNN consists of two components: the Intra-Graph Encoder, which embeds latent graph features of system entities into vectors, and the Graph Transferor, which uses a graph attention mechanism to learn and optimize the embeddings of corresponding entities from multiple source graphs, at both the node level and the graph level. Experimental results on multiple real-world datasets from various domains show that MSGT-GNN outperforms baseline approaches in link prediction, demonstrating the merit of attentive graph knowledge transfer and the effectiveness of MSGT-GNN.
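The abstract describes attentive transfer from multiple source graphs at the node level. The paper's actual architecture is not reproduced here; the following is only a minimal, hypothetical NumPy sketch of the general idea it names: scoring each source graph's embedding of an entity against the target entity's embedding, normalizing the scores with a softmax, and taking the attention-weighted combination. The function name `attentive_transfer` and the dot-product scoring are illustrative assumptions, not the paper's method.

```python
import numpy as np

def attentive_transfer(target_emb, source_embs):
    """Hypothetical sketch (not the paper's exact model): combine
    embeddings of the same entity from multiple source graphs,
    weighting each source by dot-product attention against the
    target entity's current embedding."""
    # one relevance score per source graph
    scores = np.array([target_emb @ s for s in source_embs])
    # softmax over sources gives each source graph an attention weight
    weights = np.exp(scores - scores.max())
    weights /= weights.sum()
    # attention-weighted sum of source embeddings refines the target
    transferred = sum(w * s for w, s in zip(weights, source_embs))
    return transferred, weights

# toy usage: one entity, 3 source graphs, 4-dimensional embeddings
rng = np.random.default_rng(0)
target = rng.normal(size=4)
sources = [rng.normal(size=4) for _ in range(3)]
combined, attn = attentive_transfer(target, sources)
```

In a full model such weights would be produced by learned attention parameters and applied at both the node and graph levels, as the abstract states; the sketch fixes them to a parameter-free dot product only to keep the example self-contained.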
Publication Link: https://link.springer.com/chapter/10.1007/978-3-031-26390-3_10