
Please use this identifier to cite or link to this item: http://dspace.bits-pilani.ac.in:8080/jspui/handle/123456789/16388
Full metadata record
DC Field: Value [Language]
dc.contributor.author: Sharma, Yashvardhan
dc.date.accessioned: 2024-11-15T07:00:11Z
dc.date.available: 2024-11-15T07:00:11Z
dc.date.issued: 2020
dc.identifier.uri: https://aclanthology.org/2020.figlang-1.18/
dc.identifier.uri: http://dspace.bits-pilani.ac.in:8080/jspui/handle/123456789/16388
dc.description.abstract: Recent work on automatic sequential metaphor detection has involved recurrent neural networks initialized with different pre-trained word embeddings, sometimes combined with hand-engineered features. To capture lexical and orthographic information automatically, in this paper we propose adding a character-based word representation. To contrast literal and contextual meaning, we utilize a similarity network. We explore these components via two architectures for metaphor identification: a BiLSTM model and a Transformer encoder model similar to BERT. We participate in the Second Shared Task on Metaphor Detection on both the VUA and TOEFL datasets with the above models. The experimental results demonstrate the effectiveness of our method, which outperforms all systems that participated in the previous shared task. [en_US]
dc.language.iso: en [en_US]
dc.publisher: Association for Computational Linguistics (ACL) [en_US]
dc.subject: Computer Science [en_US]
dc.subject: BERT [en_US]
dc.subject: Metaphor Detection [en_US]
dc.title: Character aware models with similarity learning for metaphor detection [en_US]
dc.type: Article [en_US]
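
Below is a minimal, hypothetical PyTorch sketch of the kind of architecture the abstract above describes: a character-level BiLSTM builds an orthography-aware word representation, which is concatenated with the word embedding and passed through a sentence-level BiLSTM; a cosine similarity between the contextual state (projected back into the embedding space) and the literal static embedding feeds a per-token metaphor classifier. All class names, dimensions, and the projection step are illustrative assumptions, not the authors' implementation.

import torch
import torch.nn as nn
import torch.nn.functional as F

class CharAwareMetaphorTagger(nn.Module):
    # Hypothetical sketch of a character-aware BiLSTM tagger with a
    # similarity feature contrasting contextual and literal meaning.
    def __init__(self, word_vocab, char_vocab, word_dim=100, char_dim=25,
                 char_hidden=25, hidden=128):
        super().__init__()
        self.word_emb = nn.Embedding(word_vocab, word_dim, padding_idx=0)
        self.char_emb = nn.Embedding(char_vocab, char_dim, padding_idx=0)
        # Character BiLSTM: captures lexical/orthographic information.
        self.char_lstm = nn.LSTM(char_dim, char_hidden, batch_first=True,
                                 bidirectional=True)
        # Sentence BiLSTM over [word embedding ; character representation].
        self.lstm = nn.LSTM(word_dim + 2 * char_hidden, hidden,
                            batch_first=True, bidirectional=True)
        # Similarity component: project the contextual state back into the
        # word-embedding space and compare it with the literal embedding.
        self.to_word_space = nn.Linear(2 * hidden, word_dim)
        # Classifier sees the contextual state plus the similarity score.
        self.classifier = nn.Linear(2 * hidden + 1, 2)

    def forward(self, words, chars):
        # words: (batch, seq)   chars: (batch, seq, max_word_len)
        b, s, c = chars.shape
        w = self.word_emb(words)                                  # (b, s, word_dim)
        ch = self.char_emb(chars.view(b * s, c))                  # (b*s, c, char_dim)
        _, (h, _) = self.char_lstm(ch)                            # h: (2, b*s, char_hidden)
        ch_repr = torch.cat([h[0], h[1]], dim=-1).view(b, s, -1)  # (b, s, 2*char_hidden)
        ctx, _ = self.lstm(torch.cat([w, ch_repr], dim=-1))       # (b, s, 2*hidden)
        sim = F.cosine_similarity(self.to_word_space(ctx), w, dim=-1)      # (b, s)
        return self.classifier(torch.cat([ctx, sim.unsqueeze(-1)], dim=-1))  # (b, s, 2)

# Toy usage with random indices: 2 sentences, 5 tokens, 8 characters per token.
model = CharAwareMetaphorTagger(word_vocab=1000, char_vocab=60)
logits = model(torch.randint(1, 1000, (2, 5)), torch.randint(1, 60, (2, 5, 8)))
print(logits.shape)  # torch.Size([2, 5, 2]): per-token literal/metaphor scores

The Transformer encoder variant mentioned in the abstract would replace the sentence-level BiLSTM with a BERT-style encoder while keeping the character-aware inputs and the similarity feature.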
Appears in Collections: Department of Computer Science and Information Systems

Files in This Item:
There are no files associated with this item.


Items in DSpace are protected by copyright, with all rights reserved, unless otherwise indicated.