A low-dimensional cross-attention model for link prediction with applications to drug repurposing

Geng-jing Chen, Gong-de Guo, S. Lorraine Martin, Hui Wang*

*Corresponding author for this work

Research output: Contribution to journal › Article › peer-review


Abstract

Link prediction, a key technique for knowledge graph completion, has advanced with transformer-based encoders that use high-dimensional embeddings and self-attention. However, these approaches often yield models with excessive parameter counts, poor scalability, and substantial computational demands, limiting their practical applicability. To address these limitations, this paper introduces a low-dimensional link prediction model that leverages cross-attention for improved efficiency and scalability. Our approach employs low-dimensional embeddings to capture essential, non-redundant information about entities and relations, significantly reducing computational and memory requirements. Unlike self-attention, which models interactions within a single set of embeddings, cross-attention in our model captures complex interactions between entities and relations in a compact, low-dimensional space. Additionally, a streamlined decoding method simplifies computation, reducing processing time without compromising accuracy. Experimental results show that our model outperforms most state-of-the-art link prediction models on two public datasets, WN18RR and FB15k-237. Compared with the top-performing methods on these datasets, our model uses only 18.1% and 25.4% of their parameters while incurring a performance loss of merely 2.4% and 3.1%, respectively. It also achieves an average 72% reduction in embedding dimensions relative to five leading models. A case study on drug repurposing further illustrates the model's potential for real-world applications in knowledge graph completion.
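To make the idea concrete, the sketch below shows one way cross-attention between low-dimensional entity and relation embeddings could be wired up for link prediction scoring. This is a minimal illustrative sketch, not the paper's implementation: the embedding dimension, the use of torch.nn.MultiheadAttention, the relation-as-query design, and the dot-product decoder are all assumptions standing in for the model actually described in the article.

```python
# Illustrative sketch (assumed architecture, not the authors' code):
# cross-attention between low-dimensional entity and relation embeddings,
# followed by a simple dot-product decoder over all candidate tail entities.
import torch
import torch.nn as nn

class CrossAttentionLinkPredictor(nn.Module):
    def __init__(self, num_entities, num_relations, dim=64, heads=4):
        super().__init__()
        self.entity_emb = nn.Embedding(num_entities, dim)      # low-dimensional entity table
        self.relation_emb = nn.Embedding(num_relations, dim)   # low-dimensional relation table
        # Cross-attention: the relation acts as the query and the head entity as
        # key/value, so interactions are modelled *between* the two embedding
        # sets rather than within a single set (as self-attention would).
        self.cross_attn = nn.MultiheadAttention(dim, heads, batch_first=True)
        self.norm = nn.LayerNorm(dim)

    def forward(self, head_idx, rel_idx):
        h = self.entity_emb(head_idx).unsqueeze(1)   # (B, 1, dim)
        r = self.relation_emb(rel_idx).unsqueeze(1)  # (B, 1, dim)
        # Relation queries attend over the head-entity representation.
        fused, _ = self.cross_attn(query=r, key=h, value=h)
        fused = self.norm(fused + r).squeeze(1)      # (B, dim)
        # Streamlined decoding: score every candidate tail with a dot product.
        return fused @ self.entity_emb.weight.t()    # (B, num_entities)

# Usage: scores for all candidate tails of (head, relation, ?) queries,
# with WN18RR-sized tables (40,943 entities, 11 relation types) for scale.
model = CrossAttentionLinkPredictor(num_entities=40943, num_relations=11, dim=64)
scores = model(torch.tensor([0, 5]), torch.tensor([1, 3]))   # shape (2, 40943)
```

With a 64-dimensional embedding table, the parameter count is dominated by the entity embeddings themselves, which is consistent with the abstract's emphasis on reducing embedding dimensionality as the main lever for shrinking the model.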

Original language: English
Article number: 113562
Number of pages: 11
Journal: Knowledge-Based Systems
Volume: 319
Early online date: 1 May 2025
DOIs
Publication status: Published - 15 Jun 2025

Keywords

  • low-dimensional
  • cross-attention model
  • link prediction
  • drug repurposing

