Aspect-based emotion analysis and multimodal coreference: a case study of customer comments on Adidas Instagram posts

Publication type
P1
Publication status
Published
Authors
De Bruyne, L., Karimi, A., De Clercq, O., Prati, A., & Hoste, V.
Editor
Nicoletta Calzolari, Frédéric Béchet, Philippe Blache, Khalid Choukri, Christopher Cieri, Thierry Declerck, Sara Goggi, Hitoshi Isahara, Bente Maegaard, Joseph Mariani, Hélène Mazo, Jan Odijk and Stelios Piperidis
Series
Proceedings of the 13th Conference on Language Resources and Evaluation (LREC 2022)
Pagination
574-580
Publisher
ELRA
Conference
13th Conference on Language Resources and Evaluation (LREC 2022) (Marseille)

Abstract

While aspect-based sentiment analysis of user-generated content has received considerable attention in recent years, emotion detection at the aspect level remains relatively unexplored. Moreover, given the rise of visual content on social media platforms, there is a growing need for methods that can handle multimodal content. In this paper, we present a multimodal dataset for Aspect-Based Emotion Analysis (ABEA). Additionally, we take the first steps in investigating the utility of multimodal coreference resolution in an ABEA framework. The presented dataset consists of 4,900 comments on 175 images and is annotated with aspect and emotion categories and the emotional dimensions of valence and arousal. Our preliminary experiments suggest that ABEA does not benefit from multimodal coreference resolution, and that aspect and emotion classification require only textual information. However, when more specific information about the aspects is desired, image recognition could be essential.