Multi-document reading comprehension is an important and difficult task in natural language processing. To address the problem that the ELECTRA pre-trained model has an input-length limit and cannot be directly applied to multi-document reading comprehension, this paper proposes a novel model based on ELECTRA and document sliding windows. In the model, multiple documents are split and merged through document sliding windows, a new segmentation embedding is introduced, the answer's position in the documents is modeled as a learning target, and ELECTRA is trained jointly within each window. After the predictions of all windows are obtained, the results are sorted comprehensively to select the optimal answer. Experiments show that the model reaches a Rouge-L of 51.28% on the multi-document reading comprehension dataset MS-MARCO, the best result reported on this dataset to date.
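The abstract describes a sliding-window pipeline: merge the documents, window them to fit ELECTRA's length limit, predict an answer span per window, and aggregate across windows. The following is a minimal sketch of that pipeline, not the authors' code: it assumes the Hugging Face `transformers` API with an off-the-shelf ELECTRA checkpoint, and the window length, stride, and max-score aggregation rule are illustrative stand-ins for the paper's document sliding windows and comprehensive sorting. The paper's new segmentation embedding is omitted.

```python
import torch
from transformers import ElectraForQuestionAnswering, ElectraTokenizerFast

# Assumed checkpoint; the paper fine-tunes its own ELECTRA model.
MODEL = "google/electra-base-discriminator"
tokenizer = ElectraTokenizerFast.from_pretrained(MODEL)
model = ElectraForQuestionAnswering.from_pretrained(MODEL)
model.eval()

def answer(question, documents, max_len=384, stride=128):
    """Split the merged documents into overlapping windows, score a span
    in each window with ELECTRA, and keep the globally best-scoring span."""
    context = " ".join(documents)  # merge the documents into one passage
    enc = tokenizer(
        question,
        context,
        max_length=max_len,
        stride=stride,                   # overlap between consecutive windows
        truncation="only_second",        # only the context is windowed
        return_overflowing_tokens=True,  # one encoding per window
        return_offsets_mapping=True,
        padding="max_length",
        return_tensors="pt",
    )
    offsets = enc.pop("offset_mapping")
    enc.pop("overflow_to_sample_mapping")
    with torch.no_grad():
        out = model(**enc)

    best_score, best_span = float("-inf"), ""
    for i in range(enc["input_ids"].size(0)):
        # Consider only tokens that belong to the context, not the question
        # or padding (their sequence id is None or 0).
        ctx = torch.tensor([sid == 1 for sid in enc.sequence_ids(i)])
        start = out.start_logits[i].masked_fill(~ctx, float("-inf"))
        end = out.end_logits[i].masked_fill(~ctx, float("-inf"))
        s = int(start.argmax())
        e = int(end[s:].argmax()) + s    # end must not precede start
        score = float(start[s] + end[e]) # simple additive span score
        if score > best_score:
            cs, ce = int(offsets[i][s][0]), int(offsets[i][e][1])
            best_score, best_span = score, context[cs:ce]
    return best_span
```

The overlap (`stride`) matters for the same reason the paper splits and merges documents: an answer that straddles a window boundary still appears whole in at least one window, so per-window prediction followed by a global sort over window scores can recover it.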