BOSTON, MA – Affectiva, a Boston-based firm specializing in artificial emotional intelligence, this week announced the launch of Affectiva Automotive AI. Billed as ‘a multi-modal in-cabin AI sensing solution’, Automotive AI uses facial and voice recognition to measure the emotional and cognitive states of a vehicle’s occupants.

The software promises to assess vehicle safety in real time by measuring driver sobriety and fatigue levels, while offering on-the-fly recommendations to give passengers a personalized, comfortable travel experience.

Following the product announcement, we caught up with Gabi Zijderveld, CMO for Affectiva, to find out what other players in the sector – and businesses across industry lines – can learn.

Why should mood and emotional reactions be a consideration for businesses building AI systems?

“Emotions influence all aspects of our lives: how we live, work and play, our decision making, communications and how we interact with technology. Today we live in a world full of hyper-connected devices and advanced AI systems – all with massive cognitive capabilities, lots of IQ but not EQ. We believe that our interactions with technologies that are not emotion-aware are superficial and ineffective at best, and we are bringing this emotional intelligence to technology. This merger of IQ and EQ is inevitable – especially in automotive.”

“OEMs and Tier 1 suppliers have a critical need to understand what is going on with the people in a vehicle. By measuring complex emotional and cognitive states from face and voice, Affectiva Automotive AI provides people analytics that enable car makers to create a personalized and differentiated transportation experience that can help them stand out in a highly competitive market.”




What are the implications of Automotive AI for the autonomous vehicle sector? Could it help improve the safety of AVs? 

“The transformation of the in-cabin experience. In a future of autonomous robo-taxis, it will be important to monitor what’s going on inside the vehicle, given that you are likely getting into a car with strangers. Are passengers safe sharing rides with others? Can providers intervene if passengers get into a conflict? These are all questions that will need to be addressed, and they require an understanding of what is happening with the people inside a vehicle.”

“For example, in ride sharing, it is key to understand if your passengers are enjoying the ride. Consumers will choose a ridesharing service that offers them the best experience. In a future of autonomous vehicles, passengers become a captive audience in an infotainment hub. Affectiva’s AI technology can fuel personalized recommendations for music, environment, or stops along the way, based on an occupant’s mood or reactions to the journey. Vehicles will have to adapt to their customers’ needs: do they want to relax, work, socialize or consume content when they are in the vehicle? The car could be designed to maximize productivity by adjusting the mood or ambiance to create a productive working environment for those on the go.”

How can the insights generated by Automotive AI lead to actionable improvements in the road safety of connected cars?

“Affectiva Automotive AI will improve road safety by monitoring driver cognitive and emotional states to assess, in real time, driver impairment caused by drowsiness or by physical and mental distraction. In semi-autonomous vehicles, our solution can help control the hand-off between a human driver and the vehicle. If a human driver is drowsy, the vehicle might offer to take control. Conversely, the autonomous system will also need to determine whether a driver is awake and paying attention, so that he or she can safely take back control of the vehicle.”
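The two-way hand-off Zijderveld describes can be sketched as a simple decision function. Affectiva has not published an API for this; the signal names and thresholds below are hypothetical assumptions used purely for illustration.

```python
# Illustrative sketch only: names and thresholds are assumptions,
# not Affectiva's actual product interface.
from dataclasses import dataclass

@dataclass
class DriverState:
    drowsiness: float  # 0.0 (alert) .. 1.0 (asleep), e.g. from a face model
    attention: float   # 0.0 (distracted) .. 1.0 (eyes on road)

def handoff_decision(state: DriverState, autonomy_engaged: bool) -> str:
    """Decide who should control a semi-autonomous vehicle."""
    if not autonomy_engaged:
        # Human is driving: offer to take over if they appear impaired.
        if state.drowsiness > 0.7:
            return "suggest_autonomy"
        return "human_drives"
    # Autonomy is driving: only hand back if the human is fit to take over.
    if state.drowsiness < 0.3 and state.attention > 0.8:
        return "handback_allowed"
    return "handback_blocked"

print(handoff_decision(DriverState(drowsiness=0.9, attention=0.5), False))
print(handoff_decision(DriverState(drowsiness=0.1, attention=0.9), True))
```

Note the asymmetry in the quote is preserved: a drowsy human triggers an offer of autonomy, while a hand-back from the autonomous system is gated on both alertness and attention.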

What are the dangers of the Automotive AI solution misreading driver state?

“First and foremost, we are developing our Automotive AI solution for high accuracy. Real-world data is critical in developing our algorithms. We have already analyzed 6.5 million faces in 87 countries, and have now also collected 2,100 hours of driver data, representing about 42,000 miles driven.”

“That being said, we do not expect, nor do we recommend, that automotive safety systems ever be designed using just one type of data. Highly automated vehicles or autonomous systems will rely on a number of data points, with a critical one being Affectiva Automotive AI. Our technology will be used alongside other sensors and data, such as lane drift, speed variability, steering-wheel movement or grip, and traffic patterns.”

“How best to leverage critical data to realize a transportation experience that is both safe and enjoyable for a vehicle’s occupants is an ongoing conversation in the industry, and one we have with all our automotive partners.”



Gabi Zijderveld is chief marketing officer at Affectiva. Gabi also leads Affectiva’s product strategy to deliver on the company’s multi-modal vision. She has over 20 years of product management and international experience at leading tech companies including Dragon Systems and IBM.