Hypercomplex Knowledge Graph Embeddings with Constraints

Master's Thesis


Knowledge graph embedding methods learn continuous vector representations for the entities in a knowledge graph and have been applied successfully to many tasks, including link prediction. Over the last decade, knowledge graph embedding research has mainly focused on the two smallest normed division algebras, the real and complex numbers. Recent results suggest that using quaternions, the next larger normed division algebra, can lead to better embeddings in terms of link prediction performance [1].
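To illustrate the idea behind quaternion embeddings, the sketch below scores a triple in the style of QuatE [1]: the head quaternion is rotated by the normalized relation quaternion via the Hamilton product and then compared to the tail by an inner product. For readability, each embedding here is a single quaternion rather than a vector of quaternions, and all function names are illustrative rather than taken from any implementation.

```python
import numpy as np

def hamilton_product(q1, q2):
    # Hamilton product of two quaternions stored as arrays (a, b, c, d),
    # where q = a + b*i + c*j + d*k.
    a1, b1, c1, d1 = q1
    a2, b2, c2, d2 = q2
    return np.array([
        a1 * a2 - b1 * b2 - c1 * c2 - d1 * d2,
        a1 * b2 + b1 * a2 + c1 * d2 - d1 * c2,
        a1 * c2 - b1 * d2 + c1 * a2 + d1 * b2,
        a1 * d2 + b1 * c2 - c1 * b2 + d1 * a2,
    ])

def quate_score(head, relation, tail):
    # QuatE-style plausibility score: rotate the head by the unit relation
    # quaternion, then take the inner product with the tail.
    relation = relation / np.linalg.norm(relation)
    rotated = hamilton_product(head, relation)
    return float(np.dot(rotated, tail))
```

Because the Hamilton product is non-commutative, such a rotation can model asymmetric relations, which is one motivation for moving beyond real and complex embeddings.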

Constraints capture domain knowledge, e.g., specified by human experts, and might express that two knowledge graph entities are the same, are of the same type, share a certain property, or are different. Common formalisms for expressing such constraints are the W3C Shapes Constraint Language (SHACL) and OWL's "same as" and "different from" relations. For instance, Ding et al. [2] showed that imposing non-negativity constraints on entity embeddings can be an effective way to improve the expressiveness of the latent representations.
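As a minimal sketch of the kind of constraint used by Ding et al. [2], the helper below performs one projected gradient step: after an ordinary embedding update, the entity vector is clipped back into [0, 1], enforcing non-negativity (and boundedness). The function name and the toy values are illustrative, not taken from their implementation.

```python
import numpy as np

def projected_update(entity_emb, grad, lr=0.01):
    # One projected-SGD step: a plain gradient update followed by a
    # projection onto [0, 1]^d, which keeps entity embeddings
    # non-negative (illustrative sketch of a non-negativity constraint).
    updated = entity_emb - lr * grad
    return np.clip(updated, 0.0, 1.0)
```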

In this thesis, the student should investigate the potential of using constraints to improve hypercomplex knowledge graph embeddings. The student's tasks are:

  1. Choose a formalism to specify constraints (e.g., a subset of SHACL).
  2. Combine the hypercomplex embedding model with constraints.
  3. Adapt a standard link prediction benchmark dataset by enriching it with constraints.
  4. Evaluate the new approach with constraints, comparing it to the original approach without constraints.
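One possible way to realize step 2 is to add constraint violations as a soft penalty term to the embedding training loss. The sketch below (all names hypothetical) penalizes "same as" pairs by their squared embedding distance and "different from" pairs whose embeddings come closer than a margin:

```python
import numpy as np

def constraint_penalty(emb, same_as_pairs, different_from_pairs, margin=1.0):
    # Soft-constraint penalty to be added to the embedding loss (sketch).
    # emb: 2-D array, one embedding row per entity index.
    # "same as" pairs are pulled together; "different from" pairs are
    # pushed at least `margin` apart via a hinge term.
    loss = 0.0
    for i, j in same_as_pairs:
        loss += float(np.sum((emb[i] - emb[j]) ** 2))
    for i, j in different_from_pairs:
        dist = float(np.linalg.norm(emb[i] - emb[j]))
        loss += max(0.0, margin - dist) ** 2
    return loss
```

Minimizing the sum of the link prediction loss and this penalty then trades off fitting the observed triples against satisfying the declared constraints; the relative weight of the penalty would be a hyperparameter of the new approach.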

Required skills

  • Knowledge Graph Embedding
  • RDF knowledge graphs
  • Machine Learning
  • Python / PyTorch / JAX / MXNet / NumPy


[1] Quaternion Knowledge Graph Embeddings (https://arxiv.org/abs/1904.10281)
[2] Improving Knowledge Graph Embedding Using Simple Constraints (https://www.aclweb.org/anthology/P18-1011.pdf)