Most knowledge graphs are incomplete. Neural link predictors (most knowledge graph embedding models) can accurately infer missing knowledge, even when multi-hop reasoning is required.
In this thesis, the student will focus on techniques that combine LLMs and neural link predictors in the context of retrieval-augmented generation (RAG). By designing a novel and effective model, we aim to achieve the following workflow:
The student will work closely with dice-embeddings and an LLM provided by us.
A simple working example:
G = {("ComputerScientist", "subclass", "Scientist"), ("Scientist", "subclass", "Person"), ("CaglarDemir", "type", "ComputerScientist")}
trained_kge = KGE().train(G)
user_query = "What is the occupation of Caglar?"
llm_endpoint = ""
response = students_work(user_query, trained_kge, llm_endpoint)
"""
response ~ Caglar Demir is a Computer Scientist.
"""
In case you have further questions, feel free to contact Caglar Demir.