Emotions are related to many different parts of our lives: from how we perceive the environment around us to how we learn and communicate. They play an important role when we talk to someone, when we learn how to speak, when we meet a person for the first time, or when we form memories of a childhood experience. Because of this crucial role in human life, studies on emotion date back to the first centuries of written history, and emotion remains a very active research field today, spanning many disciplines: from neuroscience and psychology to artificial intelligence and robotics.
The research field of affective computing introduces different emotional concepts into computational systems. Imagine a robot that can recognize spontaneous expressions and use them to learn how to behave in a certain situation, or that uses emotional information to learn how to perceive the world around it. This is among the hardest challenges in affective computing: how to integrate emotion concepts into artificial systems to improve the way they perform a task, such as communication or learning. One of the most important aspects of affective computing is how to make computational systems recognize and learn emotion concepts from different experiences, for example from human communication. Although a great deal of research has been done in this area over the past two decades, we are still far from having a system that can perceive, recognize, and learn emotion concepts in a satisfactory way.
This repository contains different solutions for emotion appraisal and artificial empathy. The models available here draw on different computational concepts to address each of these problems, and implement solutions that have been shown to improve performance and generalization when recognizing emotion expressions.