MentalBERT: Publicly Available Pretrained Language Models for Mental Healthcare

Access rights

openAccess

Type

A4 Article in conference proceedings

Date

2022

Language

en

Pages

7184–7190

Series

Proceedings of the Thirteenth Language Resources and Evaluation Conference

Abstract

Mental health is a critical issue in modern society, and mental disorders can, without adequate treatment, progress to suicidal ideation. Early detection of mental disorders and suicidal ideation from social media content offers a potential path to timely intervention. Recent advances in pretrained contextualized language representations have spurred the development of several domain-specific pretrained models and facilitated numerous downstream applications. However, no pretrained language models exist for mental healthcare. This paper trains and releases two pretrained masked language models, MentalBERT and MentalRoBERTa, to support machine learning research for the mental healthcare community. In addition, we evaluate our domain-specific models and several variants of pretrained language models on a range of mental disorder detection benchmarks, demonstrating that language representations pretrained in the target domain improve performance on mental health detection tasks.
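Because the released checkpoints are standard masked language models, they can be loaded with generic tooling such as Hugging Face transformers. Below is a minimal sketch of a fill-mask probe; the Hub identifier mental/mental-bert-base-uncased is an assumption not confirmed by this page, so consult the authors' release for the exact name.

# Minimal sketch: masked-token prediction with a MentalBERT checkpoint
# via Hugging Face transformers. The Hub identifier below is an
# assumption; check the authors' release for the exact name.
import torch
from transformers import AutoModelForMaskedLM, AutoTokenizer

MODEL_NAME = "mental/mental-bert-base-uncased"  # assumed identifier

tokenizer = AutoTokenizer.from_pretrained(MODEL_NAME)
model = AutoModelForMaskedLM.from_pretrained(MODEL_NAME)
model.eval()

# Fill-in-the-blank probe: rank candidate tokens for the [MASK] slot.
text = "I have been feeling really [MASK] for the past few weeks."
inputs = tokenizer(text, return_tensors="pt")

with torch.no_grad():
    logits = model(**inputs).logits

# Locate the masked position and print the five most likely tokens.
mask_pos = (inputs["input_ids"][0] == tokenizer.mask_token_id).nonzero(as_tuple=True)[0]
top_ids = logits[0, mask_pos].topk(5, dim=-1).indices[0]
print(tokenizer.convert_ids_to_tokens(top_ids.tolist()))

For the downstream mental disorder detection benchmarks described above, the same checkpoint would typically be wrapped in AutoModelForSequenceClassification and fine-tuned on labeled task data.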

Citation

Ji, S, Zhang, T, Ansari, L, Fu, J, Tiwari, P & Cambria, E 2022, 'MentalBERT: Publicly Available Pretrained Language Models for Mental Healthcare', in N Calzolari, F Bechet, P Blache, K Choukri, C Cieri, T Declerck, S Goggi, H Isahara, B Maegaard, H Mazo, H Odijk & S Piperidis (eds), LREC 2022: Thirteenth International Conference on Language Resources and Evaluation, European Language Resources Distribution Agency, pp. 7184–7190, International Conference on Language Resources and Evaluation, Marseille, France, 20/06/2022. <https://aclanthology.org/2022.lrec-1.778>