MentalBERT: Publicly Available Pretrained Language Models for Mental Healthcare

dc.contributor: Aalto-yliopisto [fi]
dc.contributor: Aalto University [en]
dc.contributor.author: Ji, Shaoxiong [en_US]
dc.contributor.author: Zhang, Tianlin [en_US]
dc.contributor.author: Ansari, Luna [en_US]
dc.contributor.author: Fu, Jie [en_US]
dc.contributor.author: Tiwari, Prayag [en_US]
dc.contributor.author: Cambria, Erik [en_US]
dc.contributor.department: Department of Computer Science [en]
dc.contributor.editor: Calzolari, N [en_US]
dc.contributor.editor: Bechet, F [en_US]
dc.contributor.editor: Blache, P [en_US]
dc.contributor.editor: Choukri, K [en_US]
dc.contributor.editor: Cieri, C [en_US]
dc.contributor.editor: Declerck, T [en_US]
dc.contributor.editor: Goggi, S [en_US]
dc.contributor.editor: Isahara, H [en_US]
dc.contributor.editor: Maegaard, B [en_US]
dc.contributor.editor: Mazo, H [en_US]
dc.contributor.editor: Odijk, H [en_US]
dc.contributor.editor: Piperidis, S [en_US]
dc.contributor.groupauthor: Professorship Marttinen P. [en]
dc.contributor.organization: Department of Computer Science [en_US]
dc.contributor.organization: University of Manchester [en_US]
dc.contributor.organization: Aalto University [en_US]
dc.contributor.organization: Mila - Quebec Artificial Intelligence Institute [en_US]
dc.contributor.organization: Nanyang Technological University [en_US]
dc.date.accessioned: 2022-12-07T07:22:48Z
dc.date.available: 2022-12-07T07:22:48Z
dc.date.issued: 2022 [en_US]
dc.description.abstract: Mental health is a critical issue in modern society, and mental disorders can sometimes escalate into suicidal ideation without adequate treatment. Early detection of mental disorders and suicidal ideation from social content offers a potential route to timely social intervention. Recent advances in pretrained contextualized language representations have promoted the development of several domain-specific pretrained models and facilitated several downstream applications. However, there are no existing pretrained language models for mental healthcare. This paper trains and releases two pretrained masked language models, MentalBERT and MentalRoBERTa, to benefit machine learning research in the mental healthcare community. In addition, we evaluate our domain-specific models and several variants of pretrained language models on several mental disorder detection benchmarks and demonstrate that language representations pretrained in the target domain improve performance on mental health detection tasks. [en]
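The abstract states that the two checkpoints are released publicly for downstream mental disorder detection. A minimal sketch of how such a detector could be initialized from them with the Hugging Face Transformers library; the model ID `mental/mental-bert-base-uncased` (and the MentalRoBERTa counterpart `mental/mental-roberta-base`) is an assumption of this sketch, not stated in the record itself:

```python
def load_detector(model_id: str = "mental/mental-bert-base-uncased",
                  num_labels: int = 2):
    """Attach a fresh sequence-classification head to the pretrained encoder.

    The default model ID is an assumed Hugging Face checkpoint name for
    MentalBERT. transformers is imported lazily so the function can be
    defined and inspected without the library (or network access).
    """
    from transformers import AutoTokenizer, AutoModelForSequenceClassification

    tokenizer = AutoTokenizer.from_pretrained(model_id)
    model = AutoModelForSequenceClassification.from_pretrained(
        model_id, num_labels=num_labels)
    return tokenizer, model

# Typical use (downloads weights, so not executed here):
#   tokenizer, model = load_detector()
#   batch = tokenizer(["I feel hopeless lately."], return_tensors="pt")
#   logits = model(**batch).logits   # fine-tune on labeled data first
```

Swapping in `model_id="mental/mental-roberta-base"` would give the MentalRoBERTa variant; the classification head is randomly initialized and must be fine-tuned on a labeled detection benchmark before its scores mean anything.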
dc.description.version: Peer reviewed [en]
dc.format.extent: 7184–7190
dc.format.mimetype: application/pdf [en_US]
dc.identifier.citation: Ji, S, Zhang, T, Ansari, L, Fu, J, Tiwari, P & Cambria, E 2022, MentalBERT: Publicly Available Pretrained Language Models for Mental Healthcare. In N Calzolari, F Bechet, P Blache, K Choukri, C Cieri, T Declerck, S Goggi, H Isahara, B Maegaard, H Mazo, H Odijk & S Piperidis (eds), LREC 2022: Thirteenth International Conference on Language Resources and Evaluation. European Language Resources Distribution Agency, pp. 7184–7190, International Conference on Language Resources and Evaluation, Marseille, France, 20/06/2022. <https://aclanthology.org/2022.lrec-1.778> [en]
dc.identifier.isbn: 979-10-95546-72-6
dc.identifier.other: PURE UUID: aeb140a3-68c1-4eda-b9de-43b11672d68f [en_US]
dc.identifier.other: PURE ITEMURL: https://research.aalto.fi/en/publications/aeb140a3-68c1-4eda-b9de-43b11672d68f [en_US]
dc.identifier.other: PURE LINK: https://aclanthology.org/2022.lrec-1.778 [en_US]
dc.identifier.other: PURE FILEURL: https://research.aalto.fi/files/93556807/MentalBERT_Publicly_Available_Pretrained_Language_Models_for_Mental_Healthcare.pdf [en_US]
dc.identifier.uri: https://aaltodoc.aalto.fi/handle/123456789/118053
dc.identifier.urn: URN:NBN:fi:aalto-202212076798
dc.language.iso: en [en]
dc.relation.ispartof: International Conference on Language Resources and Evaluation [en]
dc.relation.ispartofseries: Proceedings of the Thirteenth Language Resources and Evaluation Conference [en]
dc.rights: openAccess [en]
dc.title: MentalBERT: Publicly Available Pretrained Language Models for Mental Healthcare [en]
dc.type: A4 Article in a conference publication (fi: A4 Artikkeli konferenssijulkaisussa) [fi]
dc.type.version: publishedVersion