Abstract
Link prediction, a key technique for knowledge graph completion, has advanced with transformer-based encoders that use high-dimensional embeddings and self-attention mechanisms. However, these approaches often yield models with excessive parameters, poor scalability, and substantial computational demands, limiting their practical applicability. To address these limitations, this paper introduces a low-dimensional link prediction model that leverages cross-attention for improved efficiency and scalability. Our approach employs low-dimensional embeddings to capture essential, non-redundant information about entities and relations, significantly reducing computational and memory requirements. Unlike self-attention, which models interactions within a single set of embeddings, cross-attention in our model captures complex interactions between entities and relations in a compact, low-dimensional space. Additionally, a streamlined decoding method simplifies computation, reducing processing time without compromising accuracy. Experimental results show that our model outperforms most state-of-the-art link prediction models on two public datasets, WN18RR and FB15k-237. Compared to the top-performing methods on these datasets, our model requires only 18.1% and 25.4% of their parameters, respectively, while incurring performance losses of merely 2.4% and 3.1%. Furthermore, it achieves an average 72% reduction in embedding dimensionality relative to five leading models. A case study on drug repurposing further illustrates the model's potential for real-world applications in knowledge graph completion.
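To make the core idea concrete, below is a minimal PyTorch sketch of cross-attention between entity and relation embeddings in a low-dimensional space, followed by a simple dot-product decoding step. This is an illustrative sketch under stated assumptions, not the paper's exact architecture: the embedding dimension (64), head count, residual/normalization layout, and the dot-product decoder are all assumptions chosen for brevity.

```python
# Illustrative sketch only: cross-attention where the entity embedding queries
# the relation embedding, so interactions occur *between* the two sets rather
# than within one sequence (self-attention). Dimensions and the decoder are
# assumptions, not the paper's published design.
import torch
import torch.nn as nn

class EntityRelationCrossAttention(nn.Module):
    def __init__(self, dim: int = 64, num_heads: int = 4):
        super().__init__()
        # A small embedding dimension keeps the parameter count low.
        self.attn = nn.MultiheadAttention(dim, num_heads, batch_first=True)
        self.norm = nn.LayerNorm(dim)

    def forward(self, entity: torch.Tensor, relation: torch.Tensor) -> torch.Tensor:
        # Queries come from the entity; keys and values come from the relation.
        out, _ = self.attn(query=entity, key=relation, value=relation)
        return self.norm(entity + out)  # residual connection + normalization

# Usage: fuse one (head entity, relation) pair in a 64-dimensional space.
entity = torch.randn(1, 1, 64)    # (batch, tokens, dim)
relation = torch.randn(1, 1, 64)
fused = EntityRelationCrossAttention()(entity, relation)

# One common decoding choice (hypothetical here): score every candidate tail
# entity by a dot product with the fused representation.
all_entities = torch.randn(1000, 64)              # candidate tail embeddings
scores = fused.squeeze(0).squeeze(0) @ all_entities.T  # shape: (1000,)
print(fused.shape, scores.shape)
```

Because both embedding sets live in the same compact space, the attention and scoring steps stay cheap, which is the efficiency argument the abstract makes.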
Original language | English |
---|---|
Article number | 113562 |
Number of pages | 11 |
Journal | Knowledge-Based Systems |
Volume | 319 |
Early online date | 01 May 2025 |
Publication status | Published - 15 Jun 2025 |
Keywords
- low-dimensional
- cross-attention model
- link prediction
- drug repurposing