Will cars be able to sense and react to your emotions? By Vanessa Bates Ramirez, Singularity Hub
Imagine you are on your daily commute to work, driving along a crowded highway while trying to resist looking at your phone. You are already a little stressed: you did not sleep well, woke up late, and have an important meeting in a couple of hours. You just do not feel like your best self.
Suddenly another car cuts you off, coming way too close to your front bumper as it changes lanes. Your already-simmering emotions leap into overdrive, and you lay on the horn and shout curses no one can hear.
Except someone—or, rather, something—can hear: your car. Hearing your angry words, aggressive tone, and raised voice, and seeing your furrowed brow, the onboard computer goes into ‘soothe’ mode, as it is programmed to do when it detects that you’re angry.
It plays relaxing music at just the right volume, releases a puff of light lavender-scented essential oil, and maybe even recites a few meditative quotes to calm you down.
What do you think—creepy? Helpful? Awesome? Weird? Would you actually calm down, or get even more angry that a car is telling you what to do?
Scenarios like this (maybe without the lavender oil part) may not be imaginary for much longer, especially if companies working to integrate emotion-reading artificial intelligence into new cars have their way.
And it would not just be a matter of your car soothing you when you are upset—depending on what sort of regulations are enacted, the car’s sensors, camera, and microphone could collect all kinds of data about you and sell it to third parties.
Computers And Feelings
Just as AI systems can be trained to tell the difference between a picture of a dog and one of a cat, they can learn to differentiate between an angry tone of voice or facial expression and a happy one.
In fact, there is a whole branch of machine intelligence devoted to creating systems that can recognise and react to human emotions; it is called affective computing.
Emotion-reading AIs learn what different emotions look and sound like from large sets of labeled data; ‘smile = happy,’ ‘tears = sad,’ ‘shouting = angry,’ and so on. The most sophisticated systems can likely even pick up on the micro-expressions that flash across our faces before we consciously have a chance to control them, as detailed by Daniel Goleman in his groundbreaking book Emotional Intelligence.
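To make the idea of learning from labeled examples concrete, here is a deliberately minimal sketch. It is not Affectiva's method or any real affective-computing pipeline: the two "facial features" (mouth curvature, brow furrow) and all the training values are invented stand-ins, and the model is a simple nearest-centroid classifier. The point is only to show the shape of the approach—labeled examples in, a label for new input out.

```python
# Toy illustration of supervised emotion labeling: learn from labeled
# examples ('smile = happy', 'furrowed brow = angry'), then classify
# new input. Features and data are hypothetical, not real facial data.
from collections import defaultdict
import math

# Each sample: (mouth_curvature, brow_furrow) -> emotion label.
TRAINING_DATA = [
    ((0.9, 0.1), "happy"),   # upturned mouth, relaxed brow
    ((0.8, 0.2), "happy"),
    ((0.1, 0.9), "angry"),   # flat mouth, furrowed brow
    ((0.2, 0.8), "angry"),
    ((0.3, 0.3), "sad"),
    ((0.2, 0.4), "sad"),
]

def train_centroids(data):
    """Average the feature vectors for each label (nearest-centroid model)."""
    sums = defaultdict(lambda: [0.0, 0.0, 0])
    for (x, y), label in data:
        s = sums[label]
        s[0] += x
        s[1] += y
        s[2] += 1
    return {label: (s[0] / s[2], s[1] / s[2]) for label, s in sums.items()}

def classify(centroids, features):
    """Return the label whose centroid lies closest to the input features."""
    return min(centroids, key=lambda lab: math.dist(centroids[lab], features))

centroids = train_centroids(TRAINING_DATA)
print(classify(centroids, (0.85, 0.15)))  # near the 'happy' examples
```

Real systems replace the hand-picked features with representations learned by deep networks from millions of face videos, but the underlying recipe—labeled data in, a predicted emotion out—is the same.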
Affective computing company Affectiva, a spinoff from MIT Media Lab, says its algorithms are trained on 9.5 million face videos (videos of people’s faces as they do an activity, have a conversation, or react to stimuli) representing about five billion facial frames.
Fascinatingly, Affectiva claims its software can even account for cultural differences in emotional expression (for example, it is more normalised in Western cultures to be very emotionally expressive, whereas Asian cultures tend to favor stoicism and politeness), as well as gender differences.
As reported in Motherboard, companies like Affectiva, Cerence, Xperi, and Eyeris have plans in the works to partner with automakers and install emotion-reading AI systems in new cars.
Regulations passed last year in Europe and a bill just introduced this month in the US Senate are helping make the idea of ‘driver monitoring’ less weird, mainly by emphasising the safety benefits of preemptive warning systems for tired or distracted drivers (remember that part in the beginning about sneaking glances at your phone? Yeah, that).
Drowsiness and distraction cannot really be called emotions, though—so why are they being lumped under an umbrella that has a lot of other implications, including what many may consider an eerily Big Brother-esque violation of privacy?
Our emotions, in fact, are among the most private things about us, since we are the only ones who know their true nature.
We have developed the ability to hide and disguise our emotions, and this can be a useful skill at work, in relationships, and in scenarios that require negotiation or putting on a game face.
And I do not know about you, but I have had more than one good cry in my car. It is kind of the perfect place for it; private, secluded, soundproof.
Putting systems into cars that can recognise and collect data about our emotions, under the guise of preventing accidents caused by distraction or drowsiness, then seems a bit like a bait and switch.
A Highway To Privacy Invasion?
European regulations will help keep driver data from being used for any purpose other than ensuring a safer ride.
But the US is lagging behind on the privacy front, with car companies largely free from any enforceable laws that would keep them from using driver data as they please.
Affectiva lists the following as use cases for occupant monitoring in cars: personalizing content recommendations, providing alternate route recommendations, adapting environmental conditions like lighting and heating, and understanding user frustration with virtual assistants and designing those assistants to be emotion-aware so that they’re less frustrating.
Our phones already do the first two (though, granted, we are not supposed to look at them while we drive—but most cars now let you use Bluetooth to display your phone’s content on the dashboard), and the third is simply a matter of reaching a hand out to turn a dial or press a button.
The last seems like a solution for a problem that would not exist without said… solution.
Despite how unnecessary and unsettling it may seem, though, emotion-reading AI is not going away, in cars or other products and services where it might provide value.
Besides automotive AI, Affectiva also makes software for clients in the advertising space. With consent, the built-in camera on users’ laptops records them while they watch ads, gauging their emotional response, what kind of marketing is most likely to engage them, and how likely they are to buy a given product.
Emotion-recognition tech is also being used or considered for use in mental health applications, call centres, fraud monitoring, and education, among others.
In a 2015 TED talk, Affectiva co-founder Rana El-Kaliouby told her audience that we’re living in a world increasingly devoid of emotion, and her goal was to bring emotions back into our digital experiences.
Soon they will be in our cars, too; whether the benefits will outweigh the costs remains to be seen.