Thursday, October 25, 2018

MixedEmotions: Open-Source Toolbox for Emotion Analysis in Text, Audio and Video

A European research team, including Universidad Politécnica de Madrid, has developed an open-source toolbox to assess emotions in text, audio and video.

The MixedEmotions Toolkit is a set of open-source tools for emotion analysis. The toolbox is the result of MixedEmotions, a research project in which Universidad Politécnica de Madrid (UPM) participated alongside a consortium of companies, universities and research centers from several European countries. The aim of the project was the automatic recognition of emotions through text, audio and video processing.

Credit: MixedEmotions project

Emotions are essential to our existence. Our actions are shaped both by our mood and by the way we perceive others, so there is growing demand for automatically analyzing emotions in many fields. Applications of this technology are wide-ranging, including call centers, smart environments, brand reputation analysis and assistive technology.

However, several obstacles stand in the way. Firstly, analysis tools can be highly complex: emotion detection may require prior analyses such as age, gender or facial recognition. Secondly, the analysis requires both prior knowledge and linguistic resources that are not always public. Thirdly, these tools are usually designed for a single language, generally English.

Adapting these tools to other languages is an arduous task that requires language-specific resources. The combination of these issues reduces the range of freely available tools.

The aim of MixedEmotions was to increase the number of such tools. The project was funded by the Horizon 2020 programme, with the Intelligent Systems Group (GSI) from UPM among its participants. The resulting tools are adapted to several European languages in order to reflect the multicultural and multilingual context in which the technology is used.

Credit: the Toolkit platform

To show the utility of these tools, the researchers tested them in three concrete use cases: a smart TV application that provides emotion-driven recommendations; a call center monitoring system that assesses the mood and reactions of clients in each call; and an online brand reputation system that studies customers' opinions and responses for companies.

The Intelligent Systems Group (GSI) from UPM made several contributions to the project. Firstly, it led the linked data modeling of services and the semantic vocabulary. As a result, all the project tools use a shared vocabulary based on linked data principles.

This eases interoperability, since analyses from the various modalities are combined using fusion techniques. Fernando Sánchez, a GSI researcher, says: “Given the relevance of this issue, we set up a community group in the World Wide Web Consortium (W3C), an international community focused on developing standards to ensure the long-term growth of the Web, to discuss this semantic modeling and transfer the results.”
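To make the idea concrete, the sketch below shows what an emotion annotation might look like when expressed as JSON-LD following linked data principles, loosely inspired by the Onyx emotion vocabulary developed by the GSI group. The context URLs, identifiers and property names here are illustrative assumptions, not the project's exact schema.

```python
import json

# Illustrative sketch (not the project's exact schema): an emotion
# annotation as JSON-LD. The vocabulary prefixes and property names
# below are assumptions modeled after Onyx-style emotion vocabularies.
annotation = {
    "@context": {
        "onyx": "http://www.gsi.upm.es/ontologies/onyx/ns#",  # assumed prefix
        "nif": "http://persistence.uni-leipzig.org/nlp2rdf/ontologies/nif-core#",
    },
    "@id": "http://example.org/analysis/1",  # hypothetical identifier
    "nif:isString": "I loved the new episode!",  # the analyzed text
    "onyx:hasEmotionSet": {
        "onyx:hasEmotion": {
            "onyx:hasEmotionCategory": "joy",
            "onyx:hasEmotionIntensity": 0.85,
        }
    },
}

# Serializing as JSON-LD keeps the result machine-readable and
# comparable across text, audio and video analysis services.
print(json.dumps(annotation, indent=2))
```

Because every modality emits annotations in the same vocabulary, results from text, audio and video analyzers can be merged by fusion components without per-tool adapters.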

Secondly, the GSI group developed Senpy, a software framework that helps develop and publish emotion analysis services and tools, mainly focused on text processing. Lastly, the group improved emotion analysis through social context, that is, additional information about the user, the content and the relationships in social networks.
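Senpy exposes analysis plugins as web services. The minimal client-side sketch below shows how one might compose a request to a Senpy-style REST endpoint and read emotion labels from a linked-data-style JSON response. The endpoint URL, parameter names, algorithm name and the sample response shape are all assumptions for illustration, not the service's documented API.

```python
from urllib.parse import urlencode

def build_request_url(base, text, algorithm):
    """Compose a hypothetical Senpy-style analysis request URL.
    Parameter names ("algo", "i") are assumptions for this sketch."""
    return base + "?" + urlencode({"algo": algorithm, "i": text})

def extract_emotions(response):
    """Pull emotion category labels out of a JSON-LD-like response."""
    labels = []
    for entry in response.get("entries", []):
        for emo_set in entry.get("emotions", []):
            for emo in emo_set.get("onyx:hasEmotion", []):
                labels.append(emo.get("onyx:hasEmotionCategory"))
    return labels

# Hypothetical local deployment and plugin name.
url = build_request_url("http://localhost:5000/api/",
                        "great match today!", "emotion-demo")

# Illustrative response shape, loosely following linked-data emotion
# vocabularies; a real deployment may differ.
sample_response = {
    "entries": [
        {"emotions": [
            {"onyx:hasEmotion": [{"onyx:hasEmotionCategory": "joy"}]}
        ]}
    ]
}

print(url)
print(extract_emotions(sample_response))  # -> ['joy']
```

Publishing analyzers behind a uniform HTTP interface like this is what lets new emotion plugins be added and queried without changing client code.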

The MixedEmotions project (No. 644632) was carried out by researchers from UPM and partners in Ireland (National University of Ireland Galway and Siren Solutions), Germany (Deutsche Welle and University of Passau), Spain (Paradigma Digital), Italy (Expert Systems) and the Czech Republic (Phonexia and Brno University of Technology).
http://www.upm.es/internacional/UPM/UPM_Channel/Research_News?id=fb46d04fdd9a6610VgnVCM10000009c7648a____&fmt=detail&prefmt=articulo







Contacts and sources:
Universidad Politécnica de Madrid

Citation: Buitelaar, Paul; Wood, Ian D.; Negi, Sapna; Arcan, Mihael; McCrae, John P.; Abele, Andrejs; Robin, Cecile; Andryushechkin, Vladimir; Ziad, Housam; Sagha, Hesam; Schmitt, Maximilian; Schuller, Bjoern W.; Sánchez-Rada, J. Fernando; Iglesias, Carlos A.; Navarro, Carlos; Giefer, Andreas; Heise, Nicolaus; Masucci, Vincenzo; Danza, Francesco A.; Caterino, Ciro; Smrz, Pavel; Hradis, Michal; Povolny, Filip; Klimes, Marek; Matejka, Pavel; Tummarello, Giovanni. MixedEmotions: An Open-Source Toolbox for Multimodal Emotion Analysis. IEEE Transactions on Multimedia 20(9): 2454-2465, September 2018. DOI: 10.1109/TMM.2018.2798287.
