Good help is hard to find, but it’s getting easier with the arrival of virtual assistants. Already prevalent on mobile devices, in smart homes and in website chat windows, helpful bots like Apple’s Siri, Amazon’s Alexa and Microsoft’s Cortana have been assisting with all manner of consumer needs. They can adjust the central heating, play music, answer problems reported over a hotline or help select just the right product. With a little adjustment, they are now making their way into vehicle cabins around the world.
There’s plenty to help with in the car, from driving-related tasks like finding the optimal route with the optimal parking, to playing a certain type of music. Studies have flagged particularly strong interest in help with in-car diagnostics, such as reporting issues and helping drivers understand new car features. But with all of these, the effectiveness of the system and its impact on safety can only be realised if it is well designed and optimised for use in the car.
“With an increasing number of features and functions in the car, the automotive assistant can proactively help the driver explore car functions and solve driving-related tasks, thus taking some of the cognitive load from the driver. However, this presumes that the assistant meets several requirements,” explains Fatima Vital, Senior Director, Marketing Automotive at Nuance Communications. Nuance has emerged as one of the pioneers of voice recognition technology in vehicles. Its connected car platform, Dragon Drive, is embedded within a vehicle’s infotainment system and combines natural language understanding (NLU) and text-to-speech functionality.
“Consumers expect their automotive assistants to know them and their preferences and to be aware of their respective context and situation,” states Vital. For instance, the assistant must distinguish between driver and passenger in order to deliver personalised results. Nuance, and many others, believe the most convenient method for this is voice biometrics.
“With an increasing number of features and functions in the car, the automotive assistant can proactively help the driver explore car functions and solve driving-related tasks, thus taking some of the cognitive load from the driver. However, this presumes that the assistant meets several requirements” – Fatima Vital, Nuance Communications
Notably, the assistant has to learn user preferences from past behaviour, and artificial intelligence (AI) is pivotal in this sense. Access to these systems must be seamless, easy and non-distracting. NLU goes a long way in reducing the cognitive load on the driver. For example, the driver could request, ‘Find a good parking spot near my appointment’ and the system would interpret that as an instruction for a covered parking location that accepts credit cards near a specific office address.
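To illustrate the idea, here is a toy sketch of how a spoken request might be mapped to a structured intent with slots. This is not Nuance’s implementation: production NLU systems use trained statistical models, and the keyword rules, intent names and slot names below are invented for the example.

```python
def parse_request(utterance: str) -> dict:
    """Map a free-form request to a hypothetical intent with slots."""
    text = utterance.lower()
    result = {"intent": None, "slots": {}}
    if "parking" in text or "park " in text:
        result["intent"] = "find_parking"
        if "near my appointment" in text:
            # A real assistant would resolve this against the calendar
            result["slots"]["location"] = "next_appointment_address"
        if "good" in text:
            # "good" stands in for learned preferences, e.g. covered,
            # card payment accepted
            result["slots"]["preferences"] = ["covered", "accepts_credit_cards"]
    return result

request = parse_request("Find a good parking spot near my appointment")
print(request["intent"])             # find_parking
print(request["slots"]["location"])  # next_appointment_address
```

The point of the sketch is the shape of the output, not the rules: the driver’s single sentence becomes a machine-actionable instruction combining an intent, a resolved location and inferred preferences.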
Context is everything. An assistant must consider all relevant information before it can provide the best response to the driver. For example, when looking for the parking spot, relevant information includes the time, the duration of the appointment, the weather, supported payment methods, opening hours of the car park, size of the car, etc. “This information has to be part of the contextual reasoning of the automotive assistant and can play a critical role in reducing driver distraction and increasing road safety,” emphasises Vital.
New technology’s trickle-down effect means these systems may have started out in the premium segment, but they are now proliferating across all segments, in the form of both OEM-controlled automotive assistants and access to third-party assistants. Many car brands are keen to promote their own branded automotive assistants, attaching great importance to brand strategy. “These car brands want to be in control of the holistic user experience in the car, tightly integrating their brand and design concepts,” observes Vital. “We are also seeing that many of the largest brands both in the premium as well as in the mass market segments are not willing to give up control over the vehicle experience and their brand to brought-in third party assistants.” In these cases, they work with suppliers like Nuance to create their own branded assistants that can intelligently route to other virtual assistants whenever needed.
“If the car starts to listen to you and talk back, that changes. It alters the relationship between the space and the driver. It becomes less of a personal space and it becomes more of a public space” – Geraint Jones, Siegel+Gale
Alongside the likes of Nuance, Apple and Google, the segment is attracting considerable interest from start-ups. Berlin-based German Autolabs is combining advances in voice and gesture control technology with AI to develop its digital assistant for drivers. The company has promised an interoperable, scalable software platform for cognitive assistance, paired with a retrofit hardware device, making the offering accessible to anyone.
Start-up iNAGO is also building what it refers to as a next-generation conversational assistant for automotive applications. Its Intelligent Driver Assistant is designed to provide a safe way for drivers to access connected content and services in the car, as well as stay informed about their car and control car features. Other players working on these systems and their supporting technology include Baidu, Maluuba, Sensory and VocalZoom.
In-car assistants could play a pivotal role as the industry moves towards greater automation. The transition period from today’s cars to fully autonomous vehicles will force driver and car to cooperate. “Conversational intelligent automotive assistants will play a crucial role in this,” predicts Vital. They could also help with establishing trust in these systems, one of the biggest potential challenges to acceptance. A recent AAA study in the US suggested that less than one-fifth of drivers would trust an autonomous vehicle. Studies from other regions have flagged similar hesitancy.
Nuance has been working with the German Research Center for Artificial Intelligence (DFKI) in this area, looking into the role of automotive assistants in the transfer of control between driver and self-driving vehicle technology. Researchers found that the majority of participants preferred integrated, multimodal user interfaces leveraging voice, touch and visual cues.
“Consumers expect their automotive assistants to know them and their preferences and to be aware of their respective context and situation” – Fatima Vital, Nuance Communications
However, there could be unintended consequences from a proliferation of the technology. With a focus on personalised service and continuous AI refinements, the digital assistant could very quickly become a digital companion. Today, the car has the status of a safe space and people act differently because of that. They sing loudly or they swear at other drivers, neither of which they would likely do in the company of others. “If the car starts to listen to you and talk back, that changes,” observes Geraint Jones, Business Development Director at global branding company Siegel+Gale. “It alters the relationship between the space and the driver. It becomes less of a personal space and it becomes more of a public space.”
While the potential applications are clearly exciting, technical challenges could prove stubborn. “The challenge for the industry will be working out the quality issues, so that people don’t become frustrated,” warns Robert Guest, Vice President, Product Management at connectivity specialist Access. Even today, Amazon Alexa can struggle to understand commands amid the minor echo of a home kitchen. Road noise in a vehicle cabin is harder still to deal with.
Krishna Jayaraman, Program Manager – Connectivity & Telematics in Frost & Sullivan’s Automotive & Transportation practice, also has concerns about the quality of today’s digital assistants, noting: “Trying to bring that functionality inside the car becomes a difficult task.” One of the biggest challenges centres on a system’s gradual learning curve – the more it is used, the more accurate it becomes. But what happens if people don’t use it enough? “Until and unless you feed inputs into that system, it will never learn,” Jayaraman tells Megatrends. There needs to be a willingness to use it often, he continues. “People need to be aware exactly how it is going to learn. A Google Assistant is not a 100% accurate product today, and it’s not accurate because people are not making it accurate. They need to put in that effort to make it more effective.”
Outside the car
Automotive assistants aren’t limited to the inside of the vehicle. Many companies are harnessing AI-backed assistants to help with customer service. “Virtual assistants are set to completely change the way that consumers engage with organisations,” predicts Nuance’s Seb Reeve, Director, Strategic Solutions, EMEA. “A big bugbear of most customers is the amount of time they have to wait to speak to a human agent – for simple questions and queries. Virtual assistants are proving themselves a fantastic investment for many organisations to deal with simpler tasks, freeing up the phone lines for human agents to more quickly respond to more complex or emotionally resonant tasks.”
“A Google Assistant is not a 100% accurate product today, and it’s not accurate because people are not making it accurate. They need to put in that effort to make it more effective” – Krishna Jayaraman, Frost & Sullivan
The trend is playing out across numerous industries, from banking to telecommunications to government agencies. In the automotive industry, Kia recently launched the chatbot Kian via its Facebook Messenger page. Harnessing tools that understand and can respond to natural language, Kian can answer questions about model specifications, pricing, financing, special offers, etc. Notably, he learns as he goes along. “As more shoppers use the system, it learns to anticipate questions, provides more specific answers and gets smarter over time,” explained Martin Schmitt, Chief Executive and Co-Founder of CarLabs, the specialist that worked with Kia in developing the technology. “Having access to that level of shopper data is a powerful tool for understanding and serving customers better than previously possible.”
Should you require assistants…
Digital virtual assistants dominated CES 2018, with voice activation and AI incorporated into the majority of the future technology on display.
Harman demonstrated its Digital Cockpit, which features, amongst other things, Samsung’s relatively new Bixby virtual assistant. Nuance unveiled a new “cognitive arbitrator” that combines conversational and cognitive AI capabilities to connect disparate virtual assistants and related functions and services; this would enable consumers to have different services in the home, on mobile devices and in their cars, and move seamlessly between them. And Nvidia showcased new technology that OEMs can use to develop highly advanced AI virtual assistants, which can use everything from facial recognition, mood analysis and distraction monitoring to location-based individualised content delivery.
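At its simplest, the arbitration idea Nuance describes is a routing problem: a request in one domain should reach whichever assistant handles that domain. The sketch below is a deliberately simplified illustration; the assistant names, domains and fallback rule are invented, and the actual cognitive arbitrator is far more sophisticated, reasoning over conversational context rather than a lookup table.

```python
# Hypothetical registry mapping assistants to the domains they serve
ASSISTANT_DOMAINS = {
    "car_assistant": {"navigation", "climate", "diagnostics"},
    "home_assistant": {"lights", "heating", "music"},
}

def route(domain: str) -> str:
    """Return the assistant responsible for a given request domain."""
    for assistant, domains in ASSISTANT_DOMAINS.items():
        if domain in domains:
            return assistant
    return "car_assistant"  # fall back to the embedded assistant

print(route("heating"))      # home_assistant
print(route("diagnostics"))  # car_assistant
```

The seamless experience described above amounts to making this routing invisible: the driver asks to warm up the house from the car, and the arbitrator quietly hands the request to the home assistant.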
Even at CES, however, the fallibility of the technology was clear, as LG’s VP of marketing discovered when the company’s new CLOi home assistant robot failed during an on-stage demo. Clearly, the technology exists to enable a human driver in control of a vehicle to use virtual assistants to control infotainment and communication. However, it will be some time before a human can reliably converse with a digital assistant in control of a vehicle – just ask CLOi.
This article appeared in the Q1 2018 issue of Automotive Megatrends Magazine.