A Collaborative AI-enabled Pretrained Language Model for AIoT Domain Question Answering

Access rights

openAccess

A1 Original article in a scientific journal

Date

2022-05

Language

en

Series

IEEE Transactions on Industrial Informatics

Abstract

Large-scale knowledge in the artificial intelligence of things (AIoT) field urgently needs effective models that understand human language and automatically answer questions. Pretrained language models achieve state-of-the-art performance on some question answering (QA) datasets, but few models can answer questions about AIoT domain knowledge, and the AIoT domain currently lacks sufficient QA datasets and large-scale pretraining corpora. In this article, we propose RoBERTa_AIoT to address the lack of high-quality, large-scale labeled AIoT QA datasets. We construct an AIoT corpus to further pretrain RoBERTa and BERT. RoBERTa_AIoT and BERT_AIoT leverage unsupervised pretraining on a large corpus of AIoT-oriented Wikipedia webpages to learn more domain-specific context and improve performance on AIoT QA tasks. To fine-tune and evaluate the models, we construct three AIoT QA datasets based on community QA websites. The experimental results on these datasets demonstrate the significant improvements achieved by our approach.
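
The abstract outlines a two-stage recipe: continue pretraining RoBERTa (and BERT) with masked language modeling on an AIoT-oriented Wikipedia corpus, then fine-tune the adapted checkpoint on the AIoT QA datasets. The sketch below illustrates the first stage only; the use of the Hugging Face transformers and datasets libraries, the corpus file name, and all hyperparameters are assumptions for illustration, not the authors' reported setup.

# Minimal sketch of domain-adaptive ("further") pretraining of RoBERTa on an
# AIoT text corpus via masked language modeling. File names, hyperparameters,
# and library choices are illustrative assumptions.
from datasets import load_dataset
from transformers import (
    RobertaTokenizerFast,
    RobertaForMaskedLM,
    DataCollatorForLanguageModeling,
    Trainer,
    TrainingArguments,
)

# Hypothetical corpus file: one AIoT-related Wikipedia passage per line.
corpus = load_dataset("text", data_files={"train": "aiot_wikipedia_corpus.txt"})

tokenizer = RobertaTokenizerFast.from_pretrained("roberta-base")
model = RobertaForMaskedLM.from_pretrained("roberta-base")

def tokenize(batch):
    # Truncate passages to RoBERTa's maximum input length.
    return tokenizer(batch["text"], truncation=True, max_length=512)

tokenized = corpus.map(tokenize, batched=True, remove_columns=["text"])

# Dynamic masking of 15% of tokens, as in standard RoBERTa/BERT pretraining.
collator = DataCollatorForLanguageModeling(
    tokenizer=tokenizer, mlm=True, mlm_probability=0.15
)

args = TrainingArguments(
    output_dir="roberta_aiot",        # checkpoint name chosen for illustration
    num_train_epochs=3,               # assumed; the paper's schedule may differ
    per_device_train_batch_size=8,
    learning_rate=5e-5,
    save_steps=10_000,
)

Trainer(
    model=model,
    args=args,
    train_dataset=tokenized["train"],
    data_collator=collator,
).train()

# The resulting checkpoint would then be fine-tuned on the AIoT QA datasets,
# e.g. with a span-extraction head such as RobertaForQuestionAnswering.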

Keywords

AIoT, Question answering, RoBERTa, BERT, Domain-specific

Citation

Zhu, H, Tiwari, P, Ghoneim, A & Hossain, M S 2022, 'A Collaborative AI-enabled Pretrained Language Model for AIoT Domain Question Answering', IEEE Transactions on Industrial Informatics, vol. 18, no. 5, pp. 3387-3396. https://doi.org/10.1109/TII.2021.3097183