For now, being mad at your computer is pointless, but what if your software could track your mood? Affective computing allows a computer to detect and interpret your emotional state (affect) and use it as a form of input.
Affective artificial intelligence
In 1995, Rosalind Picard published a technical report describing the fundamentals of affective computing, followed in 1997 by a book of the same name. The idea is to endow computers with emotional intelligence (EQ) in addition to the analytical intelligence that makes them so useful.
Affective computing allows a computer system to analyze a human being’s emotional indicators, such as facial expression, tone of voice, body language, and words, to gain insight into their mental state.
Once the computer has estimated what its user is feeling, it reacts in a way that is (hopefully) beneficial to the user. There are many ways computers could use this information.
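The pipeline described above can be sketched as a toy fusion step: each input channel (face, voice, words) produces per-emotion confidence scores, which are averaged so the strongest overall emotion wins. All channel names and score values below are invented for illustration; real systems use trained models, not hand-written numbers.

```python
def fuse_affect(channel_scores):
    """Average each emotion's score across channels and return the strongest."""
    totals, counts = {}, {}
    for scores in channel_scores.values():
        for emotion, score in scores.items():
            totals[emotion] = totals.get(emotion, 0.0) + score
            counts[emotion] = counts.get(emotion, 0) + 1
    # Highest mean score across the channels that reported that emotion.
    return max(totals, key=lambda e: totals[e] / counts[e])

# Invented per-channel readings, for illustration only.
readings = {
    "face":  {"frustrated": 0.7, "neutral": 0.2},
    "voice": {"frustrated": 0.6, "neutral": 0.4},
    "text":  {"frustrated": 0.3, "neutral": 0.6},
}
print(fuse_affect(readings))  # frustrated
```

Averaging is the simplest possible fusion rule; a production system would weight channels by reliability (the face usually carries more signal than typed text, for instance).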
Do you remember Clippy, the Microsoft Office assistant? Imagine Clippy being able to tell when you’re really frustrated and only showing up when you really need help, not when you’re just trying to get your job done.
Affective computing could also prove effective in games, in virtual reality applications, and in natural computer interfaces such as Siri.
Computers are getting good at faces
Humans express their emotions in a variety of ways, but our face is the main canvas on which we paint our feelings for the world to see. Even the best poker face can’t suppress tiny micro-expressions, even if most observers don’t know how to interpret them.
When the first paper on affective computing was written, the challenge of getting a computer to recognize and interpret a human face was truly daunting. Today we have efficient machine learning hardware in our gadgets that can recognize and map a face in fractions of a second.
Of course, it takes more than just the ability to recognize and map a face to derive affective information from it, but at least we can now get the raw facial information with relative ease. This same machine learning technology, combined with masses of facial data, will likely extract the most important emotional information we need for affective computing to work well.
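To make this step concrete, here is a deliberately simplified sketch of how mapped facial geometry might be turned into a coarse expression label. The landmark format, names, and thresholds are all assumptions for illustration; they are not the API of any real face-mapping library, which would feed such features into a trained classifier instead of hand-picked rules.

```python
def classify_expression(lm):
    """lm: dict of named (x, y) landmarks, coordinates in [0, 1], y grows downward."""
    mouth_y = lm["mouth_center"][1]
    corners_y = (lm["mouth_left"][1] + lm["mouth_right"][1]) / 2
    if corners_y < mouth_y - 0.01:   # corners pulled up relative to center: smile
        return "smile"
    if corners_y > mouth_y + 0.01:   # corners pulled down: frown
        return "frown"
    return "neutral"

# Invented landmark positions for a smiling face.
smiling = {"mouth_center": (0.50, 0.80),
           "mouth_left":   (0.42, 0.77),
           "mouth_right":  (0.58, 0.77)}
print(classify_expression(smiling))  # smile
```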
We treat our computers like human beings
Computer interfaces look more and more like us every day. Living things like us take millions of years to change, but our computers evolve and improve at lightning speed.
In the beginning, simple computers required us to adapt to them using punch cards, cryptic computer language, command prompts, and ultimately today’s graphical user interfaces. Touchscreens have helped make computers easier for everyone to pick up and use, as they translate our innate spatial intelligence into a digital format.
Today, computers are powerful enough to understand natural speech. When you ask for help or information, you are increasingly likely to be dealing with a virtual agent, and voice assistants are everywhere.
As computer interfaces become more intuitive and natural, adding emotional information to this interaction could transform how these interfaces work.
Emotions are difficult for humans too
Although we evolved to understand and express emotions, humans get things wrong all the time. While some people seem to have an almost supernatural level of emotional intelligence, for most of us, reading emotions is still a complex task.
So while affective computing sounds like a great idea on paper, in practice it’s not that simple, even with all the amazing new technology we have. It is reasonable to expect that the first systems to reach the general public will focus on a small set of broad emotional expressions.
If your computer knows you’re exhausted, it might suggest you take a break. If it knows that certain images in your wallpaper slideshow make you happier than others, it might put them on high rotation or add other similar images.
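The wallpaper idea above boils down to weighted selection: images with a higher recorded emotional response get shown more often. A minimal sketch, where the file names and affect scores are invented for illustration:

```python
import random

# Hypothetical affect log: how positively the user reacted to each image
# (all file names and scores are invented for illustration).
happiness = {"beach.jpg": 0.9, "city.jpg": 0.2, "forest.jpg": 0.6}

def pick_wallpaper(scores, rng=random):
    """Pick the next wallpaper, favouring images with higher affect scores."""
    images = list(scores)
    return rng.choices(images, weights=[scores[i] for i in images], k=1)[0]

print(pick_wallpaper(happiness))
```

With these scores, `beach.jpg` is shown roughly four to five times as often as `city.jpg`, while the lower-rated images still appear occasionally rather than vanishing outright.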
It’s clear that affective computing can bring us many benefits, but don’t expect it to be perfect from day one!
The Dark Side of Affective Computing
Affective computing represents an important advance in the way people interact with machines, but it also opens the way to new forms of exploitation.
Marketing psychology is already adept at manipulating our emotions to alter our purchasing behavior. This is why an advertisement for a car emphasizes the sensations it provides rather than its power or fuel efficiency.
A large part of our decision-making is driven by emotion. So imagine if social media companies could read your emotional reaction to posts or advertisements. One day, you may need to press an “emotional analysis permissions” button in addition to those that allow the use of your camera or microphone.
Affective computing is changing the future of interacting with computers