Does the magic of BERT apply to medical code assignment? A quantitative study
Access rights
openAccess
Journal Title
Journal ISSN
Volume Title
A1 Original article in a scientific journal
This publication is imported from the Aalto University research portal.
Date
2021-12
Department
Major/Subject
Mcode
Degree programme
Language
en
Pages
Series
Computers in Biology and Medicine
Abstract
Unsupervised pretraining is an integral part of many natural language processing systems, and transfer learning with language models has achieved remarkable results in downstream tasks. In the clinical application of medical code assignment, diagnosis and procedure codes are inferred from lengthy clinical notes such as hospital discharge summaries. However, it is not clear whether pretrained models are useful for medical code prediction without further architecture engineering. This paper presents a comprehensive quantitative analysis of the performance of various contextualized language models, pretrained in different domains, on medical code assignment from clinical notes. We propose a hierarchical fine-tuning architecture to capture interactions between distant words and adopt label-wise attention to exploit label information. Contrary to current trends, we demonstrate that a carefully trained classical CNN outperforms attention-based models on a MIMIC-III subset with frequent codes. Our empirical findings suggest directions for building robust medical code assignment models.
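The abstract mentions a label-wise attention mechanism over the encoder outputs. The sketch below is a minimal, generic illustration of that idea (in the style of CAML-like code-assignment heads), not the authors' exact implementation; the class name, dimensions, and encoder shape are illustrative assumptions.

```python
# Minimal sketch of a label-wise attention head for multi-label medical code
# assignment. Assumes token representations from any encoder (CNN or BERT).
import torch
import torch.nn as nn


class LabelWiseAttention(nn.Module):
    """Builds one attention-weighted document vector per label, then scores it."""

    def __init__(self, hidden_dim: int, num_labels: int):
        super().__init__()
        # One attention query vector per label.
        self.label_queries = nn.Linear(hidden_dim, num_labels, bias=False)
        # One scoring weight vector (plus bias) per label.
        self.output = nn.Linear(hidden_dim, num_labels)

    def forward(self, token_states: torch.Tensor) -> torch.Tensor:
        # token_states: (batch, seq_len, hidden_dim), e.g. CNN or BERT outputs.
        # Attention weights over tokens, computed separately for each label.
        attn = torch.softmax(self.label_queries(token_states), dim=1)  # (B, T, L)
        # Label-specific document representations: (B, L, H).
        label_docs = attn.transpose(1, 2) @ token_states
        # Per-label logits: dot each label's document vector with its own weights.
        logits = (label_docs * self.output.weight).sum(dim=-1) + self.output.bias
        return logits  # (B, L); apply a sigmoid for multi-label probabilities


# Toy usage with hypothetical dimensions.
encoder_out = torch.randn(2, 128, 768)          # (batch, tokens, hidden)
head = LabelWiseAttention(hidden_dim=768, num_labels=50)
print(head(encoder_out).shape)                  # torch.Size([2, 50])
```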
Description
openaire: EC/H2020/101016775/EU//INTERVENE
Keywords
Other note
Citation
Ji, S., Hölttä, M. & Marttinen, P. 2021, 'Does the magic of BERT apply to medical code assignment? A quantitative study', Computers in Biology and Medicine, vol. 139, 104998. https://doi.org/10.1016/j.compbiomed.2021.104998