Contrastive Language-Entity Pre-training for Richer Knowledge Graph Embedding

Authors

Papaluca, Andrea
Krefl, Daniel
Lensky, Artem
Suominen, Hanna

Publisher

Springer Science+Business Media B.V.

Abstract

In this work, we propose a pre-training procedure that aligns a graph encoder and a text encoder to learn a common multi-modal graph-text embedding space. The alignment is obtained by training the model to predict the correct associations between Knowledge Graph nodes and their corresponding textual descriptions. We test the procedure with two popular Knowledge Bases: Wikidata (the successor of the now-discontinued Freebase) and YAGO. Our results indicate that this pre-training method enables link prediction without any additional fine-tuning. Furthermore, we demonstrate that a graph encoder pre-trained on the description-matching task yields improved link prediction performance after fine-tuning, without requiring node descriptions as additional inputs. The code used in the experiments is available on GitHub (https://github.com/BrunoLiegiBastonLiegi/CLEP) under the MIT license to encourage further work.
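As an illustration of the alignment objective described in the abstract, below is a minimal sketch of a CLIP-style symmetric contrastive loss over a batch of matched (node, description) pairs, written in PyTorch. The function name, the encoder output shapes, and the temperature value are illustrative assumptions and are not taken from the paper's actual implementation.

import torch
import torch.nn.functional as F

def contrastive_alignment_loss(node_emb, text_emb, temperature=0.07):
    # node_emb: (B, D) graph-encoder embeddings of B Knowledge Graph nodes.
    # text_emb: (B, D) text-encoder embeddings of the B matching descriptions.
    # (Shapes and the 0.07 temperature are illustrative assumptions.)

    # Normalize both modalities so dot products become cosine similarities.
    node_emb = F.normalize(node_emb, dim=-1)
    text_emb = F.normalize(text_emb, dim=-1)

    # (B, B) similarity matrix: entry (i, j) scores node i against description j.
    logits = node_emb @ text_emb.t() / temperature

    # The matching description for node i lies on the diagonal.
    targets = torch.arange(node_emb.size(0), device=node_emb.device)

    # Symmetric cross-entropy: each node must identify its description,
    # and each description its node.
    return (F.cross_entropy(logits, targets)
            + F.cross_entropy(logits.t(), targets)) / 2

In such a shared embedding space, the fine-tuning-free link prediction reported in the abstract can, for instance, be approximated by ranking candidate entities by cosine similarity to a query embedding, though the paper's exact scoring function may differ from this sketch.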

Book Title

Pattern Recognition and Artificial Intelligence - 4th International Conference, ICPRAI 2024, Proceedings

Entity Type

Publication
