Could Microsoft’s in-car AI for driverless vehicles make us all safer and more equal?

The age of driverless cars is fast approaching. Autonomous vehicles will soon be carrying passengers of every shape, size and ability, and Microsoft believes that artificial intelligence (AI) will be as important on the inside of every driverless car as it is on the outside…

Keeping up with driverless cars

Autonomous technology is rapidly reaching maturity. Hardly a day goes by without a simple but powerful search on Google’s news-aggregating service turning up a bootful of stories charting the advance of autonomous tech. At the time of writing, there were twenty-six such articles.

[Image: the inside of an automated, futuristic car]

Here is the link to what I see when using the Google News service with ‘autonomous’ as the keyword. Obviously the list will change from day to day and even hour to hour. And of course I have those articles come straight to me, minus ads, using the power of RSS.

The three main trends I see are:

  1. Driverless cars are getting better at coping with complex environments and inhospitable weather conditions.
  2. Authorities at both national and city level are increasingly active in running road-test trials in their localities.
  3. The date that keeps being mentioned for when we’ll see them in numbers on our streets is only three short years away: 2020.

Artificial intelligence – inside and out

Despite the extensive coverage of how driverless cars are learning to cope with a wide range of daily traffic and pedestrian hazards, one aspect of the smarts behind autonomous vehicles has until now been absent from the discussion: how AI can be used on the inside of the vehicle to give passengers the best, and safest, journey possible.

Now Microsoft has begun to explore this next frontier of autonomous functionality, in a recent presentation at the tech event DesignCon 2017.

As a result, a conversation has begun about how the autonomous cars of the future will need as many smarts on the inside, to monitor and assist their passengers, as they do on the outside, to give them the best chance of getting safely to where they need to go.

Adding in-car AI to assist every passenger

In his DesignCon 2017 talk, ‘The Internet of Things That Move - Connected Cars & Beyond’, Doug Seven, a principal program manager in Microsoft’s Azure group, outlined several cases where internal sensors and applied intelligence have the potential to improve the in-car experience and may even save lives.

Using the open-source Microsoft Cognitive Toolkit, which helps developers build deep-learning applications, combined with the Microsoft Cognitive Services APIs, which include an Emotion API capable of reading the emotions on someone’s face, the car could become far more aware of what its passengers are doing and even feeling. The Emotion API can recognise emotions including anger, contempt, disgust, fear, happiness, neutral, sadness and surprise.
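
To make that concrete, here is a minimal sketch of what calling the Emotion API over REST might look like from Python. The region in the endpoint URL, the key placeholder and the image filename are illustrative assumptions rather than details from Seven’s talk; the eight emotion scores in the response are those listed above.

```python
import requests

# Illustrative values: substitute a real Cognitive Services key, and use
# the region your subscription was created in.
SUBSCRIPTION_KEY = "YOUR_COGNITIVE_SERVICES_KEY"
EMOTION_URL = "https://westus.api.cognitive.microsoft.com/emotion/v1.0/recognize"


def recognise_emotions(image_path):
    """Send one image to the Emotion API and return the per-face results."""
    headers = {
        "Ocp-Apim-Subscription-Key": SUBSCRIPTION_KEY,
        "Content-Type": "application/octet-stream",  # raw image bytes
    }
    with open(image_path, "rb") as f:
        response = requests.post(EMOTION_URL, headers=headers, data=f.read())
    response.raise_for_status()
    return response.json()  # a list of faces, each with a 'scores' dict


if __name__ == "__main__":
    # 'cabin_camera_frame.jpg' stands in for a frame from an in-car camera.
    for face in recognise_emotions("cabin_camera_frame.jpg"):
        scores = face["scores"]  # anger, contempt, ... surprise, as above
        dominant = max(scores, key=scores.get)
        print(f"Dominant emotion: {dominant} ({scores[dominant]:.2f})")
```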

“If we could detect things like road rage or stress, we could [have a vehicle] do things to alter the environment for the driver or passengers,” Seven said.

Cars in the cloud?

At present the Cognitive Services APIs and the Cognitive Toolkit are cloud-based, and that alone won't suffice for self-driving cars.

“What we can do in the Cloud with our AI capabilities is to build algorithms and models to let cars become intelligent and make decisions,” Seven told the DesignCon audience. “But we can’t rely on the Cloud, because we might lose connectivity or there might be [latency] issues.”

[Image: a woman in a car looking like she has road rage]

Microsoft plans to create localised AI within the cars themselves that will be more aware of drivers, passengers, and the car's environment. Seven said, “We need hardware in the car capable of processing that data in real time.”
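
As a rough sketch of that split between cloud and car, the snippet below imagines a model that was trained in the cloud and then shipped to the vehicle, where an onboard loop scores each camera frame locally, so that no network round trip sits on the critical path. Everything here (the stub model, the frame source, the anger threshold) is hypothetical and included only to illustrate the architecture.

```python
import random
import time


class LocalEmotionModel:
    """Stand-in for a cloud-trained model exported to the vehicle.

    A real deployment might load a Cognitive Toolkit model file here;
    this stub returns random scores so the sketch runs anywhere.
    """

    EMOTIONS = ("anger", "contempt", "disgust", "fear",
                "happiness", "neutral", "sadness", "surprise")

    def score(self, frame):
        weights = [random.random() for _ in self.EMOTIONS]
        total = sum(weights)
        return {e: w / total for e, w in zip(self.EMOTIONS, weights)}


def monitor_cabin(model, frames, anger_threshold=0.3):
    """Score every camera frame on board: no cloud round trip, no dropout."""
    for frame in frames:
        scores = model.score(frame)
        if scores["anger"] > anger_threshold:
            # e.g. soften the lighting, lower the music, suggest a calmer route
            print("Possible road rage detected; adjusting cabin environment.")
        time.sleep(0.1)  # pretend frames arrive at roughly 10 fps


if __name__ == "__main__":
    monitor_cabin(LocalEmotionModel(), frames=range(50))
```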

The benefits of AI from within a driverless car

Seven gave a specific example of how such smarts could be put to use, explaining how instances of road rage could be avoided when the car, through its built-in cameras and emotion-recognition algorithms, is aware of the emotions and motions of its passengers. But it’s possible to think of many more scenarios where an empathetic and vigilant car could help make journeys more pleasant and potentially even save lives.

Many of us love the convenience of talking to Siri or Amazon’s Echo, and having the ear of an in-car AI has equally obvious applications. Being a passenger in a fully autonomous vehicle means you could be eating lunch, getting on with work or doing any other activity you choose. If your hands are full and you need to change your route or destination (or just the song or video playing on the entertainment system), then voice is an easy way of doing it.

Feeling ill? With a word, the car can pull over or take you straight to the nearest pharmacy. And of course you would also have all the information on the internet accessible through a natural, conversational interface.

How far can a smart car go?

However, a truly smart car could go a lot further than that. Once the AI has access to data from other sensors, such as heart-rate monitors built into the armrests or connected fitness trackers worn by passengers, someone experiencing an irregular cardiac rhythm could be reminded to take their medication, and a passenger left unconscious by a heart attack or seizure could be taken directly to hospital.
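
As a toy illustration of the kind of check an in-car system might run on that sensor data, the sketch below screens a stream of beat-to-beat (RR) intervals from a hypothetical armrest monitor for a racing or erratic rhythm. Real arrhythmia detection is far more sophisticated; the thresholds and function names here are invented for the example.

```python
from statistics import pstdev


def flag_irregular_rhythm(rr_intervals_ms, high_hr_bpm=120, variability_ms=150):
    """Very crude screen over beat-to-beat (RR) intervals, in milliseconds.

    Flags a sustained high heart rate or large beat-to-beat variability.
    A hypothetical in-car system would hand a flag like this to the AI,
    which could then prompt the passenger or reroute towards a hospital.
    """
    mean_rr = sum(rr_intervals_ms) / len(rr_intervals_ms)
    heart_rate = 60_000 / mean_rr  # beats per minute
    if heart_rate > high_hr_bpm:
        return f"High heart rate: {heart_rate:.0f} bpm"
    if pstdev(rr_intervals_ms) > variability_ms:
        return "Irregular beat-to-beat intervals"
    return None


# A fairly steady rhythm, then an erratic one.
print(flag_irregular_rhythm([800, 790, 810, 805, 795]))    # None
print(flag_irregular_rhythm([600, 1200, 500, 1100, 550]))  # Irregular ...
```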

Advanced sound-detection algorithms could also be employed to listen for sounds of distress or alarm from passengers, as well as for external sounds such as beeping horns or emergency-vehicle sirens that might require the car to make a sharp evasive manoeuvre. The AI could then warn or reassure the passenger. For travellers with a hearing impairment who cannot hear the sirens, or for blind passengers who’d appreciate an explanation for the drastic swerving, this level of assurance would be invaluable.
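
A production siren detector would use a trained audio classifier, but a crude sketch conveys the general idea: check whether a short microphone window concentrates its energy in the rough frequency band a siren sweeps through. The band limits, threshold and sample rate below are illustrative guesses, not parameters from any real system.

```python
import numpy as np

SAMPLE_RATE = 16_000  # Hz; assumed microphone sample rate


def siren_band_energy_ratio(window, low_hz=500, high_hz=1800):
    """Fraction of spectral energy inside the band a typical siren sweeps."""
    spectrum = np.abs(np.fft.rfft(window)) ** 2
    freqs = np.fft.rfftfreq(len(window), d=1 / SAMPLE_RATE)
    band = (freqs >= low_hz) & (freqs <= high_hz)
    return spectrum[band].sum() / spectrum.sum()


def looks_like_siren(window, threshold=0.6):
    return siren_band_energy_ratio(window) > threshold


# A synthetic 1 kHz tone concentrates its energy in-band,
# so this crude test treats it as siren-like.
t = np.arange(SAMPLE_RATE) / SAMPLE_RATE  # one second of samples
print(looks_like_siren(np.sin(2 * np.pi * 1000 * t)))  # True
```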

In a driverless future disability fades from view

In a future where autonomous vehicles must not only navigate their way through complex and ever-changing streets, but also protect and interact with occupants whatever those occupants happen to be doing throughout the journey, the car will need to be aware of their needs and able to choose the best way to communicate with its precious cargo at any given moment.

If passengers don’t respond to a verbal prompt, it might be because they are listening to loud music on almost-invisible earbuds, or it might be that they have hearing loss. If their eyes are closed, they may be relaxing yet still alert, fast asleep, or they may simply be blind (or blinded by the sun).

The car will have the sensors and the smarts to deal with such cases, along with numerous others. When in-car AI reaches this level of awareness, there will be little distinction between disability, difference and distraction in a driverless world.
