
CogTrans: A Cognitive Transfer Learning-based Self-Attention Mechanism Architecture for Knowledge Graph Reasoning

Creative Commons Attribution (CC BY) 4.0 license
Abstract

Knowledge Graph Reasoning (KGR) infers new knowledge from existing knowledge and is an effective way to address the incompleteness and sparsity of Knowledge Graphs (KGs). In particular, Graph Convolutional Network (GCN)-based approaches achieve state-of-the-art effectiveness, but they still suffer from weak reasoning ability, incomplete acquisition of local information, insufficient attention scoring, and high learning cost, all of which limit prediction accuracy. This paper proposes CogTrans, a multi-head self-attention architecture based on cognitive transfer learning, to address these problems. Shaped like a cross, CogTrans horizontally comprises an intuition stage and a reasoning stage, which yields faster convergence and predictions that align more closely with human intuition. Longitudinally, CogTrans spans a source domain and a target domain; benefiting from transfer learning, it not only retains the advantages of the horizontal architecture but can also "draw inferences from one instance," bringing its reasoning behavior closer to that of the human brain. Extensive experimental results show that CogTrans achieves the best accuracy among current GCN-based methods.
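To make the notion of multi-head self-attention over knowledge-graph triples concrete, the minimal sketch below scores (head, relation, tail) triples by letting the three embeddings attend to one another. It is only an illustration of the general technique named in the abstract; the class, dimensions, and pooling choices here are assumptions for exposition and are not taken from the CogTrans paper.

```python
import torch
import torch.nn as nn

class TripleSelfAttentionScorer(nn.Module):
    """Hypothetical sketch: score (head, relation, tail) triples with
    multi-head self-attention over their embeddings. Not the CogTrans
    implementation; names and sizes are illustrative assumptions."""

    def __init__(self, num_entities: int, num_relations: int,
                 dim: int = 128, num_heads: int = 4):
        super().__init__()
        self.entity_emb = nn.Embedding(num_entities, dim)
        self.relation_emb = nn.Embedding(num_relations, dim)
        self.attn = nn.MultiheadAttention(dim, num_heads, batch_first=True)
        self.score = nn.Linear(dim, 1)

    def forward(self, heads, relations, tails):
        # Stack the three embeddings as a length-3 "sequence" per triple.
        seq = torch.stack([self.entity_emb(heads),
                           self.relation_emb(relations),
                           self.entity_emb(tails)], dim=1)  # (batch, 3, dim)
        # Self-attention lets each element attend to the other two.
        attended, _ = self.attn(seq, seq, seq)
        # Mean-pool over the sequence and map to a plausibility score.
        return self.score(attended.mean(dim=1)).squeeze(-1)

# Usage: higher sigmoid scores indicate more plausible triples.
model = TripleSelfAttentionScorer(num_entities=1000, num_relations=50)
h = torch.tensor([0, 1]); r = torch.tensor([3, 7]); t = torch.tensor([42, 99])
print(torch.sigmoid(model(h, r, t)))
```

In this kind of setup, link prediction for an incomplete triple is typically done by scoring candidate tails and ranking them; the intuition/reasoning stages and source/target-domain transfer described in the abstract would be built on top of such a scoring backbone.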
