A Physical Embedding Model for Knowledge Graphs
2 months ago by Caglar Demir
Knowledge graph embedding (KGE) methods learn continuous vector representations for the entities and relations in knowledge graphs (KGs). Applications of KGE methods include collective machine learning, type prediction, link prediction, entity resolution, knowledge graph completion, and question answering. In our work, we focus on type prediction. We present a novel KGE approach that goes beyond the state of the art in both efficiency and effectiveness. Our approach, dubbed PYKE, combines a physical model based on Hooke's law with an optimization technique inspired by simulated annealing. PYKE scales to large KGs: its space complexity is linear, and its time complexity is close to linear in the size of the input graph. We compare PYKE with six state-of-the-art approaches on two tasks, clustering and type prediction, measuring both runtime and prediction accuracy. Our results corroborate our formal analysis, showing that PYKE's runtime scales close to linearly with the size of the input graph. In addition to outperforming the state of the art in runtime, PYKE also achieves better cluster purity and type prediction scores.
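To give an intuition for the physical model, here is a minimal sketch of a Hooke's-law-style embedding procedure. This is an illustrative toy, not the official PYKE implementation: the pair sets, spring constant `k`, repulsion strength `omega`, and the exponential step-size decay (standing in for the annealing schedule) are all assumptions made for the example.

```python
import numpy as np

def hooke_embedding_sketch(num_items, attractive, repulsive, dim=2,
                           k=0.5, omega=0.45, decay=0.9, iters=50, seed=0):
    """Toy Hooke's-law embedding (illustrative sketch, not PYKE itself).

    attractive: pairs (i, j) that should be pulled together,
                e.g. entities that co-occur in the KG
    repulsive:  pairs (i, j) that should be pushed apart
    k:          spring constant for the attractive (Hooke's law) force
    omega:      strength of the repulsive force
    decay:      annealing-style step-size decay per iteration
    """
    rng = np.random.default_rng(seed)
    E = rng.normal(size=(num_items, dim))  # random initial embeddings
    step = 1.0
    for _ in range(iters):
        F = np.zeros_like(E)
        for i, j in attractive:
            d = E[j] - E[i]
            F[i] += k * d                  # Hooke's law: force grows with distance
            F[j] -= k * d
        for i, j in repulsive:
            d = E[j] - E[i]
            sq = np.dot(d, d) + 1e-9
            F[i] -= omega * d / sq         # repulsion fades with distance
            F[j] += omega * d / sq
        E += step * F
        step *= decay                      # cooling: updates shrink over time
    return E
```

Under this dynamic, attractive pairs end up closer than they started and repulsive pairs farther apart, while the decaying step size lets the system settle instead of oscillating.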
Did you find it interesting? If so, check out: