A booming field, speech and language recognition technology has led, for example, to the emergence of devices such as the virtual assistants Alexa and Siri. An important step in the evolution of dialogue artificial intelligence (AI) systems such as these is the addition of “emotional intelligence”, i.e. the ability to recognize the psychological state of the user.
A system with this capability would not only understand language but also generate more empathetic responses, providing a more immersive experience for the user. A group of methods known as “multimodal sentiment analysis” is the gold standard for enabling an AI dialogue system to detect emotions successfully.
These methods are able to automatically analyze a person’s psychological state from their speech, tone of voice, facial expression and posture, and are crucial for human-centric AI systems.
According to Techxplore, the technique could potentially create an emotionally intelligent AI with beyond-human capabilities that understands a user’s sentiment and generates a response accordingly.
However, current methods for estimating emotions focus only on observable information and do not account for information contained in unobservable signals, such as physiological signals. Such cues are a potential goldmine of emotional information that could significantly improve sentiment estimation performance.
Japanese researchers are testing an AI system that draws on physiological signals
In a new study, published this month in the journal IEEE Transactions on Affective Computing, researchers in Japan added physiological signals to multimodal sentiment analysis for the first time. The collaborative team responsible for the new approach is made up of Shogo Okada, associate professor at the Japan Advanced Institute of Science and Technology (JAIST), and Kazunori Komatani, professor at the Institute of Scientific and Industrial Research at Osaka University.
“Humans are very good at hiding their feelings. A user’s internal emotional state is not always accurately reflected by the content of the dialogue, but since it is difficult for a person to consciously control biological signals such as heart rate, these can be useful for estimating their emotional state. This could create an AI with sentiment estimation capabilities that go beyond the human,” says Okada.
The team analyzed 2,468 exchanges with a dialogue AI obtained from 26 participants to estimate the level of pleasure felt by the user during the conversation. The users were then asked to rate how enjoyable or boring they found the conversation. The team used the multimodal dialogue dataset called “Hazumi1911”, which uniquely combined voice recognition, voice tone sensors, and facial expression and posture detection with skin potential, a form of physiological response detection.
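The idea of combining these modalities can be illustrated with a minimal fusion sketch. This is not the authors’ actual model: the feature names, values, and the logistic scoring step below are all hypothetical, chosen only to show how per-modality features might be merged into a single sentiment estimate.

```python
import numpy as np

def fuse_modalities(features: dict) -> np.ndarray:
    """Early fusion: concatenate per-modality feature vectors in key order."""
    return np.concatenate([features[k] for k in sorted(features)])

def estimate_sentiment(fused: np.ndarray, weights: np.ndarray, bias: float) -> float:
    """Map the fused feature vector to a score in (0, 1) with a logistic unit."""
    return float(1.0 / (1.0 + np.exp(-(fused @ weights + bias))))

# Toy exchange: each modality contributes a small (invented) feature vector.
features = {
    "language": np.array([0.8, 0.1]),    # e.g. positive-word ratio, negation count
    "audio": np.array([0.5]),            # e.g. pitch variance
    "visual": np.array([0.3, 0.6]),      # e.g. smile intensity, posture lean
    "physiology": np.array([0.4]),       # e.g. normalized skin-potential response
}
fused = fuse_modalities(features)
weights = np.full(fused.shape, 0.5)      # placeholder weights; learned in practice
score = estimate_sentiment(fused, weights, bias=0.0)
print(round(score, 3))                   # a single pleasure/sentiment estimate
```

The key design point the study exploits is visible even here: the physiological channel enters the estimator as just another feature vector, so adding biosignals requires no change to the fusion machinery itself.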
“Comparing the individual data sources, biosignal information was found to be more effective than voice and facial expression,” Okada said. “When we combined language information with biosignal information to estimate the self-assessed internal state while talking to the system, the AI’s performance became comparable to that of a human.”
These results suggest that detecting physiological signals in humans, which normally remain hidden from our view, could pave the way for highly sensitive AI-based dialogue systems, making human-computer interactions more natural and satisfying.
Additionally, emotionally intelligent artificial intelligence systems could help identify and monitor mental disorders by detecting changes in everyday emotional states. They could also be useful in education, to gauge whether a student is interested and excited about a topic of discussion or bored, enabling adjustments to teaching strategy and more effective educational services.