
Please use this identifier to cite or link to this item: http://dspace.bits-pilani.ac.in:8080/jspui/handle/123456789/14955
Full metadata record

DC Field | Value | Language
dc.contributor.author | Mitra, Satanik | -
dc.date.accessioned | 2024-05-21T09:09:13Z | -
dc.date.available | 2024-05-21T09:09:13Z | -
dc.date.issued | 2022 | -
dc.identifier.uri | https://ieeexplore.ieee.org/abstract/document/10037677 | -
dc.identifier.uri | http://dspace.bits-pilani.ac.in:8080/jspui/xmlui/handle/123456789/14955 | -
dc.description.abstract | Suicidal intention or ideation detection is an evolving research field in social media analysis. People use these platforms to share their thoughts, tendencies, opinions, and feelings about suicide, but the resulting texts are unstructured and noisy, which makes detection a challenging task. In this paper, we evaluate five BERT-based pre-trained transformer models, namely BERT, DistilBERT, ALBERT, RoBERTa, and DistilRoBERTa, on the task of suicidal intention detection. The performance of these models is evaluated using standard classification metrics, and all models are trained with the one-cycle learning rate policy. Our results show that RoBERTa outperforms the other BERT-based models, achieving 99.23%, 96.35%, and 95.39% accuracy on the training, validation, and test sets, respectively. | en_US
dc.language.iso | en | en_US
dc.publisher | IEEE | en_US
dc.subject | Management | en_US
dc.subject | Suicidal intention | en_US
dc.subject | Suicidal ideation | en_US
dc.subject | Transformers | en_US
dc.subject | Pre-trained models | en_US
dc.title | Suicidal Intention Detection in Tweets Using BERT-Based Transformers | en_US
dc.type | Article | en_US
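The abstract notes that all models were trained with the one-cycle learning rate policy. The paper's code and hyperparameters are not available from this record, so the following is only a minimal sketch of what that schedule looks like, assuming the cosine-annealed shape and default hyperparameters (`div_factor=25`, `final_div_factor=1e4`, `pct_start=0.3`) used by PyTorch's `OneCycleLR`; the `max_lr` and step counts below are illustrative, not the authors' values.

```python
import math

def one_cycle_lr(step, total_steps, max_lr,
                 pct_start=0.3, div_factor=25.0, final_div_factor=1e4):
    """Learning rate at a given step under a cosine one-cycle policy.

    The rate warms up from max_lr/div_factor to max_lr over the first
    pct_start fraction of training, then anneals down to
    (max_lr/div_factor)/final_div_factor over the remaining steps.
    """
    initial_lr = max_lr / div_factor
    min_lr = initial_lr / final_div_factor
    warmup_steps = pct_start * total_steps
    if step < warmup_steps:
        # Warm-up half-cosine: rises from initial_lr to max_lr.
        t = step / warmup_steps
        return initial_lr + (max_lr - initial_lr) * (1 - math.cos(math.pi * t)) / 2
    # Annealing half-cosine: falls from max_lr to min_lr.
    t = (step - warmup_steps) / (total_steps - warmup_steps)
    return min_lr + (max_lr - min_lr) * (1 + math.cos(math.pi * t)) / 2

# Illustrative fine-tuning run: 100 optimizer steps, peak LR 2e-5.
schedule = [one_cycle_lr(s, 100, 2e-5) for s in range(101)]
print(f"start={schedule[0]:.2e} peak={max(schedule):.2e} end={schedule[-1]:.2e}")
```

The peak falls at 30% of training (`pct_start=0.3`), so with 100 steps the maximum rate of 2e-5 is reached at step 30 before annealing to a near-zero final value.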
Appears in Collections:Department of Management

Files in This Item:
There are no files associated with this item.


Items in DSpace are protected by copyright, with all rights reserved, unless otherwise indicated.