Artificial and emotional intelligence meet in new autonomous driving tech
Face-reading artificial intelligence technology has shown up in identification and security use cases. What applications would result if such an AI could also tell when people are psyched, depressed or confused?
“There are only six or seven universal emotions, plus neutral,” said Modar Alaoui (pictured), founder and chief executive officer at Eyeris Technologies Inc., a developer of emotion recognition software.
These emotions are akin to the seven primary colors; all variations of both colors and emotions derive from a basic set, Alaoui stated at this year’s When IoT Met AI: The Intelligence of Things conference in San Jose, California. They are hardwired into humans’ brains and show on their faces through microexpressions, he told Jeff Frick (@JeffFrick), host of theCUBE, SiliconANGLE Media’s mobile livestreaming studio. (* Disclosure below.)
Microexpressions often reveal what people can’t or won’t say. “They can generally give up a lot of information as to whether a person has suppressed a certain emotion or not. Or whether they are thinking about something negatively before they can respond positively, etc.,” Alaoui said.
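Eyeris hasn’t published the internals of its classifier, but the idea Alaoui describes — mapping facial signals to a small, fixed set of universal emotions plus neutral — can be sketched in a few lines. In this hypothetical example, a face-analysis model (not shown) has already produced per-emotion confidence scores, and we simply pick the strongest one, falling back to neutral when no emotion is expressed strongly enough. The emotion labels follow Paul Ekman’s widely cited set; the threshold value is an illustrative assumption, not a figure from Eyeris.

```python
# Ekman's seven universal emotions; "neutral" is the fallback label.
UNIVERSAL_EMOTIONS = [
    "anger", "contempt", "disgust", "fear",
    "happiness", "sadness", "surprise",
]

def classify_emotion(scores: dict, neutral_threshold: float = 0.4) -> str:
    """Pick the dominant emotion from per-emotion confidence scores.

    `scores` maps emotion names to confidences in [0, 1], as an upstream
    face-analysis model (hypothetical here) might produce per video frame.
    If nothing clears `neutral_threshold`, the face reads as neutral.
    """
    label, confidence = max(scores.items(), key=lambda kv: kv[1])
    return label if confidence >= neutral_threshold else "neutral"

# A clearly happy frame vs. a frame with no strong expression.
print(classify_emotion({"happiness": 0.91, "surprise": 0.05}))  # happiness
print(classify_emotion({"anger": 0.15, "fear": 0.20}))          # neutral
```

The interesting work, of course, lives upstream in the model that turns pixels into those scores; this sketch only shows the final decision step.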
They can also help autonomous vehicles drive safely and cater to occupants’ likes and dislikes, according to Alaoui.
Backseat driver
Eyeris’ software fuses data from in-cabin sensors to improve occupants’ experiences, Alaoui explained. It can tell if a driver is growing dangerously tired or careless.
Fully autonomous, driverless vehicles are still at least 10 years off, Alaoui predicted. But when they arrive, the focus of the software and services in the car will shift to the occupants. “All of these services will revolve around who is inside the vehicle by age, gender, emotion, activity, etc.,” he said.
Eyeris software will be hitting the road by early 2018. “We made some announcements earlier this year at CES [Consumer Electronics Show] with Toyota and Honda,” he concluded.
Watch the complete video interview below, and be sure to check out more of SiliconANGLE’s and theCUBE’s coverage of When IoT Met AI: The Intelligence of Things. (* Disclosure: TheCUBE is a paid media partner for When IoT Met AI. Neither Western Digital Corp., the event sponsor, nor other sponsors have editorial influence on theCUBE or SiliconANGLE.)
Photo: SiliconANGLE