Knowledge graph embedding methods learn continuous vector representations for entities in knowledge graphs and have been successfully applied to many tasks, including link prediction. Over the last decade, knowledge graph embedding research has focused mainly on the two smallest normed division algebras, the real and the complex numbers. Recent results suggest that using quaternions, the next larger normed division algebra, can lead to better embeddings in terms of link prediction performance.
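To make the quaternion setting concrete, the following is a minimal sketch of a QuatE-style scoring function, in which each entity and relation is a quaternion and a triple is scored by rotating the head via the Hamilton product with the unit-normalized relation. The function names and the single-quaternion (rather than vector-of-quaternions) setup are illustrative simplifications, not the exact formulation the cited work uses.

```python
import numpy as np

def hamilton_product(q1, q2):
    # Quaternions represented as (a, b, c, d) ~ a + b*i + c*j + d*k.
    a1, b1, c1, d1 = q1
    a2, b2, c2, d2 = q2
    return np.array([
        a1*a2 - b1*b2 - c1*c2 - d1*d2,   # real part
        a1*b2 + b1*a2 + c1*d2 - d1*c2,   # i component
        a1*c2 - b1*d2 + c1*a2 + d1*b2,   # j component
        a1*d2 + b1*c2 - c1*b2 + d1*a2,   # k component
    ])

def score(head, rel, tail):
    # Rotate the head entity by the unit-normalized relation quaternion,
    # then take the inner product with the tail entity embedding.
    rel = rel / np.linalg.norm(rel)
    return float(np.dot(hamilton_product(head, rel), tail))
```

Because the Hamilton product is non-commutative (e.g., i⊗j = k but j⊗i = -k), such models can distinguish a relation from its inverse, which real-valued translational models struggle with.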
Constraints capture domain knowledge, e.g., specified by human experts, and may express that two knowledge graph entities are the same, are of the same type, share a certain property, or are different. Common formalisms for expressing such constraints are the W3C's Shapes Constraint Language (SHACL) and OWL's "same as" and "different from" relations. For instance, Ding et al. showed that non-negativity constraints on entity embeddings can effectively improve the expressiveness of latent representations.
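As an illustration of how such a constraint can be enforced during training, the snippet below sketches a projected-gradient step that clips entity embeddings to be non-negative after each update. The function names and the plain-SGD setup are assumptions for the sake of the example, not a specific method from the literature.

```python
import numpy as np

def project_nonnegative(embeddings):
    # Enforce the non-negativity constraint by projecting each
    # component onto [0, +inf) after a gradient update.
    return np.maximum(embeddings, 0.0)

def constrained_sgd_step(embeddings, gradients, lr=0.01):
    # Plain gradient descent step followed by projection
    # back onto the feasible (non-negative) region.
    return project_nonnegative(embeddings - lr * gradients)
```

The same projection pattern extends to other convex constraints, e.g., clipping embedding norms to enforce boundedness.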
In this thesis, the student should investigate the potential of using constraints to improve hypercomplex knowledge graph embeddings. The student's tasks are: