Survey of BERT-Based Models for Question Answering
Abstract - Question answering is an integral part of day-to-day human activity. Humans, with their ability to reason about questions, understand context, and handle semantics, perform this task with ease. With advancements in Natural Language Processing and transfer learning, machines have come a long way in solving extractive question-answering tasks effectively. Several transformer-based pre-trained models have emerged that solve this task with increasing effectiveness. In this paper, we perform experiments to compare some of the pre-trained BERT models that solve extractive QA tasks and try to identify the best-performing BERT model for a given dataset. We use Exact Match (EM) and F1-score as the metrics to evaluate these pre-trained models.
Keywords - Natural Language Processing, Extractive Question-Answering, Pre-Trained BERT Models, Transfer Learning
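The EM and F1 metrics named above are typically computed as in the standard SQuAD-style evaluation: EM checks whether the normalized predicted span matches a normalized ground-truth answer exactly, while F1 measures token-level overlap between prediction and ground truth. The sketch below illustrates one such computation; the exact normalization steps (lowercasing, stripping punctuation and articles) are an assumption based on common practice, not taken from this paper.

```python
import re
import string
from collections import Counter

def normalize_answer(s):
    """Lowercase, drop punctuation and articles, collapse whitespace (SQuAD-style)."""
    s = s.lower()
    s = "".join(ch for ch in s if ch not in string.punctuation)
    s = re.sub(r"\b(a|an|the)\b", " ", s)
    return " ".join(s.split())

def exact_match(prediction, ground_truth):
    """1 if normalized prediction equals normalized ground truth, else 0."""
    return int(normalize_answer(prediction) == normalize_answer(ground_truth))

def f1_score(prediction, ground_truth):
    """Token-level F1 between the predicted span and the ground-truth span."""
    pred_tokens = normalize_answer(prediction).split()
    gold_tokens = normalize_answer(ground_truth).split()
    common = Counter(pred_tokens) & Counter(gold_tokens)
    num_same = sum(common.values())
    if num_same == 0:
        return 0.0
    precision = num_same / len(pred_tokens)
    recall = num_same / len(gold_tokens)
    return 2 * precision * recall / (precision + recall)
```

When a question has multiple reference answers, evaluations usually take the maximum EM and F1 over all references for each prediction, then average over the dataset.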