This repository contains a PyTorch implementation of the R-TuckER model for the knowledge graph link prediction task.
The proposed method is a modification of the approach described in [1]. It represents the knowledge graph as a tensor with a fixed SFT-rank and uses Riemannian optimization for training. See [2] for details.
There are two types of models:
* `asymmetric` -- the model uses a regular Tucker decomposition to represent the knowledge graph, with distinct embeddings for the subjects and objects of each fact (facts being the fundamental records of the knowledge graph);
* `symmetric` -- the model shares subject and object embeddings and uses the SF-Tucker decomposition (see the scoring sketch below).
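As an illustration of how both variants score a fact, the sketch below writes the Tucker scoring function as a sequence of tensor contractions. It is illustrative only: the function and variable names are not the repository's API, and the ordering of the core tensor's modes is an assumption.

```python
import torch

def tucker_score(core, e_subj, w_rel, E_obj):
    """Score one (subject, relation) pair against all candidate objects.

    core:   (d_r, d_e, d_e) core tensor of the Tucker decomposition
    e_subj: (d_e,)          subject embedding
    w_rel:  (d_r,)          relation embedding
    E_obj:  (n_ent, d_e)    matrix of candidate object embeddings
    """
    # contract the relation mode: (d_r, d_e, d_e) x (d_r,) -> (d_e, d_e)
    m = torch.einsum("rse,r->se", core, w_rel)
    # contract the subject mode: (d_e, d_e) x (d_e,) -> (d_e,)
    v = torch.einsum("se,s->e", m, e_subj)
    # score every candidate object at once
    return E_obj @ v  # (n_ent,)
```

In the asymmetric model the subject and object embeddings come from two distinct matrices; in the symmetric model a single shared matrix plays both roles, which is the constraint the SF-Tucker decomposition encodes.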
Note that, unlike the TuckER model, this approach does not employ such common deep learning techniques as dropout or batch normalization. The models achieve the following link prediction results:
Dataset | MRR | Hits@10 | Hits@3 | Hits@1 |
---|---|---|---|---|
WN18RR | 0.479 | 0.546 | 0.492 | 0.446 |
FB15k-237 | 0.329 | 0.505 | 0.359 | 0.242 |
All main hyperparameters can be set in `configs/base_config.py`. There are also some command-line options (a parsing sketch follows the list):
* `mode` -- either `symmetric` or `asymmetric`;
* `seed` -- random seed;
* `nw` -- `num_workers`;
* `device` -- either `cpu` or `cuda`;
* `optim` -- either `rgd`, `rsgd`, or `adam`;
* `data` -- path to dataset.
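A hypothetical sketch of how these options might be parsed is given below; the actual parser in `train.py` may differ in defaults and details.

```python
import argparse

parser = argparse.ArgumentParser(description="Train R-TuckER")
parser.add_argument("--mode", choices=["symmetric", "asymmetric"], help="model type")
parser.add_argument("--seed", type=int, help="random seed")
parser.add_argument("--nw", type=int, help="num_workers for the data loader")
parser.add_argument("--device", choices=["cpu", "cuda"], help="computation device")
parser.add_argument("--optim", choices=["rgd", "rsgd", "adam"], help="optimizer")
parser.add_argument("--data", type=str, help="path to dataset")
args = parser.parse_args()
```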
To reproduce the results from the paper, use the following combinations of hyperparameters with `batch_size=512`:
dataset | rank (rel, ent) | lr | lr_decay | reg_strategy | reg_init | reg_finish | reg_steps | momentum | label_smoothing | num_epochs |
---|---|---|---|---|---|---|---|---|---|---|
WN18RR | (10, 200) | 2000 | .9981 | "exp" | 1e-4 | 3e-9 | 350 | 0.8 | 0.1 | 1450 |
FB15k-237 | (200, 20) | 2000 | .9981 | "exp" | 1e-4 | 1e-10 | 100 | 0.8 | 0.1 | 1450 |
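For illustration, the WN18RR row of the table above corresponds to values like the following; the exact layout of `configs/base_config.py` is not reproduced here, so treat the plain-dict structure as an assumption.

```python
# Illustrative only: the real configs/base_config.py may organize these differently.
WN18RR_HYPERPARAMS = dict(
    batch_size=512,
    rank=(10, 200),        # (relation rank, entity rank)
    lr=2000,
    lr_decay=0.9981,
    reg_strategy="exp",
    reg_init=1e-4,
    reg_finish=3e-9,
    reg_steps=350,
    momentum=0.8,
    label_smoothing=0.1,
    num_epochs=1450,
)
```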
Also use the following command-line options:

```
python train.py --mode asymmetric --nw <NW> --seed 322 --data data/<dataset>/ --optim rsgd
```
This implementation requires the following packages:

* `torch==1.13.1`

The custom library `tucker_riemopt` of version at least 1.0.1 is also required.
[1] TuckER: Tensor Factorization for Knowledge Graph Completion
[2] [TBA]
MIT License