Conversational Question Answering Using a Shift of Context

Recent developments in conversational AI and speech recognition have seen an explosion of conversational systems such as Google Assistant and Amazon Alexa, which can perform a wide range of tasks such as providing weather information or making appointments, and can be accessed from smartphones or smart speakers. Chatbots are also widely used in industry for answering employee FAQs and for providing call center support. Question Answering over Linked Data (QALD) has been researched intensively in recent years, and QA systems have been successful in implementing natural language interfaces to DBpedia. However, these systems expect users to phrase their questions completely in a single shot. Humans, on the other hand, tend to converse by asking a series of interconnected, or follow-up, questions in order to gather information about a topic. In this paper, we present a conversational speech interface for QA in which users can pose questions in both text and speech to query DBpedia entities and converse in the form of a natural dialog by asking follow-up questions. We also contribute a benchmark for contextual question answering over Linked Data consisting of 50 conversations with 115 questions.
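To make the setting concrete, the sketch below illustrates (under assumptions, not as the system described in this paper) how a follow-up question over DBpedia can be answered by carrying the focus entity from the previous turn: the user first asks a self-contained question about Berlin and then a follow-up whose pronoun is resolved from the dialog state. The SPARQL endpoint and the dbo: properties are standard DBpedia identifiers; the use of SPARQLWrapper and the simple focus-entity bookkeeping are illustrative choices.

```python
# Minimal sketch of contextual (follow-up) QA over DBpedia.
# Assumptions: public DBpedia endpoint, SPARQLWrapper client, and a toy
# dialog state that keeps only the current focus entity.
from SPARQLWrapper import SPARQLWrapper, JSON

ENDPOINT = "https://dbpedia.org/sparql"

def ask(query: str) -> list:
    """Run a SPARQL SELECT query against the public DBpedia endpoint."""
    sparql = SPARQLWrapper(ENDPOINT)
    sparql.setQuery(query)
    sparql.setReturnFormat(JSON)
    return sparql.query().convert()["results"]["bindings"]

# Turn 1: "Who is the mayor of Berlin?" -- a fully specified question.
focus_entity = "http://dbpedia.org/resource/Berlin"  # dialog state: current focus
turn1 = ask(f"""
    SELECT ?leader WHERE {{
        <{focus_entity}> <http://dbpedia.org/ontology/leader> ?leader .
    }}
""")

# Turn 2: "And what is its population?" -- "its" is resolved from the
# dialog state instead of being restated by the user.
turn2 = ask(f"""
    SELECT ?pop WHERE {{
        <{focus_entity}> <http://dbpedia.org/ontology/populationTotal> ?pop .
    }}
""")

print([b["leader"]["value"] for b in turn1])
print([b["pop"]["value"] for b in turn2])
```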