Cars Will Soon Be Able to Sense and React to Your Emotions – Singularity Hub

Imagine you're on your daily commute to work, driving along a crowded highway while trying to resist looking at your phone. You're already a little stressed out because you didn't sleep well, woke up late, and have an important meeting in a couple of hours, and you just don't feel like your best self.

Suddenly another car cuts you off, coming way too close to your front bumper as it changes lanes. Your already-simmering emotions leap into overdrive, and you lay on the horn and shout curses no one can hear.

Except someone, or rather something, can hear: your car. Hearing your angry words, aggressive tone, and raised voice, and seeing your furrowed brow, the onboard computer goes into soothe mode, as it's been programmed to do when it detects that you're angry. It plays relaxing music at just the right volume, releases a puff of light lavender-scented essential oil, and maybe even says some meditative quotes to calm you down.

What do you think: creepy? Helpful? Awesome? Weird? Would you actually calm down, or get even angrier that a car is telling you what to do?

Scenarios like this (maybe without the lavender oil part) may not be imaginary for much longer, especially if companies working to integrate emotion-reading artificial intelligence into new cars have their way. And it wouldn't just be a matter of your car soothing you when you're upset; depending on what sort of regulations are enacted, the car's sensors, camera, and microphone could collect all kinds of data about you and sell it to third parties.

Just as AI systems can be trained to tell the difference between a picture of a dog and one of a cat, they can learn to differentiate between an angry tone of voice or facial expression and a happy one. In fact, there's a whole branch of machine intelligence devoted to creating systems that can recognize and react to human emotions; it's called affective computing.

Emotion-reading AIs learn what different emotions look and sound like from large sets of labeled data: smile = happy, tears = sad, shouting = angry, and so on. The most sophisticated systems can likely even pick up on the micro-expressions that flash across our faces before we consciously have a chance to control them, as detailed by Daniel Goleman in his groundbreaking book Emotional Intelligence.
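To make the "labeled data" idea concrete, here is a deliberately tiny sketch of supervised emotion labeling. The feature names (mouth curvature, brow furrow) and the nearest-centroid method are illustrative assumptions, not how Affectiva or any production system works; real systems train deep networks on millions of face videos.

```python
# Toy sketch: learn emotion labels from labeled feature vectors,
# then classify a new face by its nearest class centroid.
from collections import defaultdict
import math

# Hypothetical labeled examples: (mouth_curve, brow_furrow) -> emotion
training = [
    ((0.9, 0.1), "happy"),
    ((0.8, 0.2), "happy"),
    ((0.1, 0.9), "angry"),
    ((0.2, 0.8), "angry"),
]

def centroids(data):
    """Average the feature vectors for each emotion label."""
    sums = defaultdict(lambda: [0.0, 0.0, 0])
    for (x, y), label in data:
        s = sums[label]
        s[0] += x; s[1] += y; s[2] += 1
    return {label: (s[0] / s[2], s[1] / s[2]) for label, s in sums.items()}

def classify(features, cents):
    """Return the label whose centroid is closest to the input."""
    return min(cents, key=lambda lab: math.dist(features, cents[lab]))

cents = centroids(training)
print(classify((0.85, 0.15), cents))  # a strong smile -> "happy"
```

The same pattern scales up in principle: more examples, richer features, and a more expressive model in place of the centroid distance.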

Affective computing company Affectiva, a spinoff from MIT Media Lab, says its algorithms are trained on 9.5 million face videos (videos of people's faces as they do an activity, have a conversation, or react to stimuli) representing about 5 billion facial frames. Fascinatingly, Affectiva claims its software can even account for cultural differences in emotional expression (for example, it's more normalized in Western cultures to be very emotionally expressive, whereas Asian cultures tend to favor stoicism and politeness), as well as gender differences.

As reported in Motherboard, companies like Affectiva, Cerence, Xperi, and Eyeris have plans in the works to partner with automakers and install emotion-reading AI systems in new cars. Regulations passed last year in Europe and a bill just introduced this month in the US Senate are helping make the idea of driver monitoring less weird, mainly by emphasizing the safety benefits of preemptive warning systems for tired or distracted drivers (remember that part in the beginning about sneaking glances at your phone? Yeah, that).

Drowsiness and distraction can't really be called emotions, though, so why are they being lumped under an umbrella that has a lot of other implications, including what many may consider an eerily Big Brother-esque violation of privacy?

Our emotions, in fact, are among the most private things about us, since we are the only ones who know their true nature. We've developed the ability to hide and disguise our emotions, and this can be a useful skill at work, in relationships, and in scenarios that require negotiation or putting on a game face.

And I don't know about you, but I've had more than one good cry in my car. It's kind of the perfect place for it: private, secluded, soundproof.

Putting systems into cars that can recognize and collect data about our emotions, then, under the guise of preventing accidents caused by distraction or drowsiness, seems a bit like a bait and switch.

European regulations will help keep driver data from being used for any purpose other than ensuring a safer ride. But the US is lagging behind on the privacy front, with car companies largely free from any enforceable laws that would keep them from using driver data as they please.

Affectiva lists the following as use cases for occupant monitoring in cars: personalizing content recommendations, providing alternate route recommendations, adapting environmental conditions like lighting and heating, and understanding user frustration with virtual assistants and designing those assistants to be emotion-aware so that they're less frustrating.

Our phones already do the first two (though, granted, we're not supposed to look at them while we drive, but most cars now let you use Bluetooth to display your phone's content on the dashboard), and the third is simply a matter of reaching a hand out to turn a dial or press a button. The last seems like a solution for a problem that wouldn't exist without said solution.

Despite how unnecessary and unsettling it may seem, though, emotion-reading AI isn't going away, in cars or other products and services where it might provide value.

Besides automotive AI, Affectiva also makes software for clients in the advertising space. With consent, the built-in camera on users' laptops records them while they watch ads, gauging their emotional response, what kind of marketing is most likely to engage them, and how likely they are to buy a given product. Emotion-recognition tech is also being used or considered for use in mental health applications, call centers, fraud monitoring, and education, among others.

In a 2015 TED talk, Affectiva co-founder Rana El-Kaliouby told her audience that we're living in a world increasingly devoid of emotion, and her goal was to bring emotions back into our digital experiences. Soon they'll be in our cars, too; whether the benefits will outweigh the costs remains to be seen.

Image Credit: Free-Photos from Pixabay
