Reasoning with Knowledge Graphs for Question Answering

Master Thesis

Question answering (QA) models must be capable of obtaining appropriate knowledge and reasoning over it (e.g., multi-hop reasoning). Knowledge can be encoded implicitly in large pretrained language models (PLMs) or represented explicitly in structured knowledge graphs (KGs), such as DBpedia and ConceptNet, where nodes are entities and edges represent the relations between them. PLMs offer broad coverage of knowledge, but empirically they do not perform well on structured reasoning; KGs, in contrast, are better suited to it. This project aims to research and develop deep learning techniques that answer questions by combining knowledge from PLMs and KGs.
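
To make the KG abstraction concrete, the sketch below shows a toy ConceptNet-style graph stored as an adjacency map of (relation, neighbor) pairs, together with a breadth-first search for multi-hop relation paths between a question concept and a candidate answer. This is a minimal illustration only: the triples and the helper name find_paths are hypothetical and are not part of the project's actual pipeline, which would combine such structured reasoning with scores from a PLM.

```python
from collections import deque

# Toy ConceptNet-style knowledge graph: entity -> list of (relation, neighbor).
# The triples below are illustrative only.
KG = {
    "penguin": [("IsA", "bird"), ("NotCapableOf", "fly"), ("AtLocation", "antarctica")],
    "bird":    [("CapableOf", "fly"), ("IsA", "animal")],
    "animal":  [("AtLocation", "zoo")],
}

def find_paths(kg, start, goal, max_hops=2):
    """Breadth-first search for relation paths of at most `max_hops` edges."""
    paths = []
    queue = deque([(start, [])])
    while queue:
        node, path = queue.popleft()
        if node == goal and path:
            paths.append(path)
            continue
        if len(path) >= max_hops:
            continue
        for relation, neighbor in kg.get(node, []):
            queue.append((neighbor, path + [(node, relation, neighbor)]))
    return paths

if __name__ == "__main__":
    # 1-hop example: penguin -[AtLocation]-> antarctica
    # 2-hop example: penguin -[IsA]-> bird -[IsA]-> animal
    for goal in ("antarctica", "animal"):
        for path in find_paths(KG, "penguin", goal):
            print(" -> ".join(f"{h} -[{r}]-> {t}" for h, r, t in path))
```

Explicit paths of this kind are what make KG-based reasoning interpretable: each answer can be traced back to a chain of relations, unlike the implicit knowledge inside a PLM.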