When someone mentions Artificial Intelligence and cars together, the first thing that pops into mind is self-driving cars. With giants like Google, Uber and now Apple showing their interest in self-driving cars and telling the public that these ‘cars of the future’ are coming, everyone is eagerly waiting for them.
Well, they are here, but one might wonder: are they ready to take on the roads, especially Indian roads? And when are they rolling out? Next year? A year after that? Try several years. While we wait for cars without drivers, technology is exploring the idea of integrating A.I. inside the car: a car that does not drive itself but is aware of the passengers sitting inside.
‘A.I. inside the car’ means using A.I. to understand what is happening with the person driving the car. As of now, the algorithms for self-driving cars can perform only limited tasks, because the open road is too unpredictable an environment; the same algorithms perform far better within the confined space of the vehicle’s interior. And as important as it is for A.I. to understand what is happening inside a driverless car, its services inside a car with a driver are needed too.
A.I. To Identify The Rightful Owner
In-car cameras, through their complex vision algorithms, can perform complex tasks — one such task is detecting the rightful owners of the car. Chooch, an artificial intelligence training platform and API for facial recognition, is developing a facial recognition system to identify the rightful owners of the vehicles.
The system can match a face with an identity card inside the vehicle, a feature that could prove vital for car security in the near future. Facial recognition of this kind could help car rental companies as well as private owners. For example, if you are renting a car, you hold up your I.D. as a security check and the facial recognition system matches it against your face to give you access to the car. The match ensures that the correct person is sitting behind the wheel.
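Chooch has not published the internals of its system, but face-to-I.D. matching is typically done by comparing embedding vectors produced by a face-encoder model. A minimal Python sketch of that idea, using toy embeddings and an illustrative similarity threshold (both are assumptions, not Chooch’s actual values):

```python
import math

def cosine_similarity(a, b):
    # Cosine similarity between two embedding vectors.
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

def is_same_person(face_embedding, id_embedding, threshold=0.8):
    # Grant access only when the camera face and the I.D. photo
    # map to sufficiently similar embeddings.
    return cosine_similarity(face_embedding, id_embedding) >= threshold

# Toy embeddings standing in for the output of a real face-encoder model.
camera_face = [0.21, 0.80, 0.55, 0.10]
id_photo    = [0.20, 0.78, 0.57, 0.12]
stranger    = [0.90, 0.05, 0.10, 0.70]

print(is_same_person(camera_face, id_photo))  # similar faces -> True
print(is_same_person(camera_face, stranger))  # mismatch -> False
```

In a real deployment the embeddings would come from a trained face-recognition network and the threshold would be tuned against false-accept and false-reject rates.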
Being Aware Of The Car’s Environment
An A.I. system inside the car is essentially ‘aware’ of the objects within it. The car’s A.I. algorithms regulate the cabin environment: auto-adjusting the interior lights, locking the doors, and changing the music volume during a phone call or in dangerous driving conditions.
The rear doors auto-lock when the system detects kids in the backseat, and the radio can switch to a kids’ programme. The car can also slow down, or alert the driver, if it picks up loud threatening voices or cursing. The algorithms can even monitor the backseat after you leave the car and notify you if you have left something important behind.
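Cabin automation of this kind can be sketched as a simple mapping from sensed conditions to actions. The state keys and action names below are hypothetical stand-ins for a real sensor suite, not any vendor’s actual API:

```python
def cabin_actions(state):
    """Map sensed cabin conditions to actions.

    Illustrative rules only: a production system would weigh
    confidence scores from perception models, not raw booleans.
    """
    actions = []
    if state.get("kids_in_backseat"):
        actions.append("lock_rear_doors")
        actions.append("switch_to_kids_programme")
    if state.get("phone_call_active"):
        actions.append("lower_music_volume")
    if state.get("loud_threatening_voices"):
        actions.append("slow_down_and_alert")
    if state.get("driver_left_car") and state.get("object_in_backseat"):
        actions.append("notify_item_left_behind")
    return actions

print(cabin_actions({"kids_in_backseat": True, "phone_call_active": True}))
# ['lock_rear_doors', 'switch_to_kids_programme', 'lower_music_volume']
```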
R.E.A.D. (Real-time Emotion Adaptive Driving System)
During the 2019 Consumer Electronics Show, Kia introduced the Real-time Emotion Adaptive Driving System (R.E.A.D.) technology. Kia stated that the AI-based system could tailor vehicle interiors to a passenger’s emotional state. It does so by using sensors to monitor the facial expressions, heart rate and E.D.A. (Electrodermal Activity) of the passengers.
The R.E.A.D. system also includes seat vibrations, and these are not just ordinary vibrations: they match their frequency to whatever music is playing inside the car (so steer clear of heavy metal).
“Emotion A.I. can provide an understanding of people’s preferences and optimise the in-cabin environment to offer a personalised experience,” says Rana el Kaliouby, co-founder and C.E.O. of Affectiva.
Object And Emotion Detection Technology
Affectiva, an emotion measurement technology company spun out of the M.I.T. Media Lab, has developed an A.I. system that can detect the emotions and expressions of the person driving the car. The technology is expected to roll out in the coming two to three years. It uses a camera mounted near the steering wheel to monitor the driver’s behavioural patterns, such as the frequency and length of eye blinks, to determine whether the driver is feeling drowsy.
If the behavioural patterns suggest a problem, the system notifies the driver by changing the temperature, playing music or even pulling over. The A.I. is also being used to detect distractions, for example when the driver is eating or looking at a phone.
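Affectiva has not published its drowsiness logic, but the blink-based idea described above can be sketched in a few lines. The thresholds here are purely illustrative assumptions:

```python
def looks_drowsy(blink_durations_ms, window_s=60,
                 max_blinks_per_min=25, long_blink_ms=400):
    """Flag possible drowsiness from blink frequency and blink length,
    the two behavioural cues the camera tracks. Thresholds are made up
    for illustration; a real system would learn them from data."""
    blinks_per_min = len(blink_durations_ms) * 60 / window_s
    long_blinks = sum(1 for d in blink_durations_ms if d >= long_blink_ms)
    # Either unusually frequent blinking or several long eye closures.
    return blinks_per_min > max_blinks_per_min or long_blinks >= 3

alert  = [120, 150, 130, 110, 140]       # short, infrequent blinks
drowsy = [450, 500, 130, 480, 150, 120]  # several long eye closures

print(looks_drowsy(alert))   # False
print(looks_drowsy(drowsy))  # True
```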
That Feeling Of Insecurity
Whenever A.I. is mentioned, fears about privacy follow. But with facial recognition there arises another challenge: algorithm bias. If a deep learning algorithm is trained predominantly on faces with one skin tone, it will be less accurate at recognising people with other skin tones. For the technology to work globally, this bias has to be addressed, and it can only be reduced by training the algorithm on more diverse data. Companies like Affectiva have analysed around 8.5 million faces across 87 countries to counter algorithm bias.
Building an A.I. system that monitors and interprets activity inside the vehicle requires an ample amount of user data: the algorithms have to be fed massive quantities of it in order to learn. Controversies such as Amazon’s Alexa accidentally recording a private conversation make people anxious about the technology.
One solution to emerge from concerns like these is edge A.I., which removes the need to send data to the cloud: the algorithms run locally on the device, without a link to the cloud.
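A minimal sketch of the edge idea: inference runs on-device, and at most a small, non-identifying summary is shared. The model and data shapes here are hypothetical:

```python
def run_on_edge(frames, local_model):
    """Run inference on-device: raw camera frames never leave the car.
    Only an aggregate, non-identifying summary is returned for any
    optional upload; the frames themselves stay local."""
    drowsy_count = sum(1 for frame in frames if local_model(frame))
    return {"frames_seen": len(frames), "drowsy_frames": drowsy_count}

# Hypothetical stand-in for an on-device model: treats each frame as
# a dict carrying a precomputed eye-openness score.
toy_model = lambda frame: frame["eye_openness"] < 0.2

frames = [{"eye_openness": 0.9}, {"eye_openness": 0.1}, {"eye_openness": 0.8}]
print(run_on_edge(frames, toy_model))  # {'frames_seen': 3, 'drowsy_frames': 1}
```

The design choice is the point: whatever leaves the device is a coarse aggregate, so a cloud breach or an accidental recording cannot expose raw cabin footage.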
With India making rapid advances in A.I., the idea of self-driving cars directs people’s attention to one crucial thing: job security. India will account for around 23% of the jobs lost if there is complete automation globally by 2021, according to research by Human Resources (H.R.) solutions firm PeopleStrong. But we need not worry yet, because most driverless cars around the world are only partially self-driving and are far from Level 5 automation, which implies full automation in all conditions as defined by SAE International (an association that develops global standards for the mobility industry).
With practical self-driving cars still far away and the reach of the algorithms limited, perhaps our best shot at automation, for now, is making use of whatever A.I. technology is already available. But moving forward, if A.I. is going to help us automate something as personal as our cars, then even with the added cushion of safety, transparency is going to play a pivotal role, as it always has in matters of A.I.