Enabling conversational question answering using a Bidirectional Encoder Representations from Transformers (BERT)-based model
Thesis posted on 2022-03-28, 22:56, authored by Munazza Zaib
Question answering dialog systems, which let users seek information through a conversation with a bot, have gained increasing research interest recently. However, such systems often struggle when the dialog spans multiple turns, with users asking follow-up questions based on what they have already learned. Although several works have addressed history modeling in multi-turn question answering, most either simply prepend the history questions and answers, employ complex attention mechanisms, or fail to capture the entire context of the conversation history. To address this issue, we propose BERT-based Conversational Question Answering in Context (BERT-CoQAC), which seamlessly integrates the relevant conversational history into a BERT-based question answering model. The proposed framework provides a dynamic history selection process and combines the embeddings of history answers with those of history questions to form a complete input, capturing the intent of the current query more accurately. To test our approach, we performed extensive experiments, and the results show that proper modeling of the conversation history is necessary to achieve better results. We also studied the effect of omitting the history selection mechanism, providing new insights into conversation history modeling.
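The input construction described above — selecting relevant history turns and concatenating them with the current question into a single BERT-style sequence — can be sketched as follows. This is a minimal illustration, not the thesis's actual implementation: the word-overlap relevance score is a hypothetical stand-in for the dynamic history selection process, and the `[CLS]`/`[SEP]` string format mimics how a BERT tokenizer would frame the sequence pair.

```python
def select_history(history, current_question, max_turns=2):
    """Score each past (question, answer) turn by word overlap with the
    current question and keep the top-scoring turns, preserving dialog order.
    (Hypothetical stand-in for the thesis's dynamic selection mechanism.)"""
    q_words = set(current_question.lower().split())
    scored = []
    for i, (q, a) in enumerate(history):
        overlap = len(q_words & set((q + " " + a).lower().split()))
        scored.append((overlap, i))
    top = sorted(scored, reverse=True)[:max_turns]
    keep = sorted(i for _, i in top)  # restore original turn order
    return [history[i] for i in keep]

def build_bert_input(history, current_question):
    """Concatenate the selected history question-answer pairs with the
    current question into one BERT-style input sequence."""
    selected = select_history(history, current_question)
    context = " [SEP] ".join(f"{q} {a}" for q, a in selected)
    if context:
        return f"[CLS] {current_question} [SEP] {context} [SEP]"
    return f"[CLS] {current_question} [SEP]"
```

For example, given the history `[("Who wrote Hamlet?", "Shakespeare"), ("When was he born?", "1564")]` and the follow-up "Where was he born?", the builder emits one sequence containing both the current question and the selected history turns, so the model can resolve the pronoun "he" from context. In a real system the answer turns would contribute their own embeddings rather than plain text, as the framework combines history-answer and history-question embeddings.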