Humanity's push towards a fully automated world is an inspiring story to tell, but it is equally important to understand the hurdles along the way and tackle them honestly. It is unwise, and unfair to customers, to launch a product under the banner of automation when it is designed to operate only partially supervised and, at times, under manual control by humans.
Tesla is a case in point: its Autopilot-equipped vehicles have been in the limelight in recent years, mostly for the wrong reasons. In this article, we evaluate a few accidents from the last few years, such as the crash involving an Apple engineer, and try to understand whether Tesla's Autopilot system was solely at fault or whether the drivers also made mistakes that contributed to the crashes. Most importantly, is Autopilot mature enough to be used on real streets?
Accidents To Look At
There are two accidents worth examining, both of which took place in 2018. The first involved an Apple engineer named Walter Huang, who was driving a Tesla Model X SUV when it crashed into a concrete barrier at 114 km/h, as reported by the NTSB. The NTSB further reports that one of the major causes of the accident was a failure of Tesla's system, which did not correctly recognise a freeway exit. The forward collision warning system also failed to alert Huang, and the automatic emergency braking did not activate.
The second accident involved a 2014 Tesla Model S P85, which collided with a firetruck in Culver City, California. Autopilot was engaged at the time, but it failed to react in time to avoid the collision. The car had identified a pick-up truck ahead and was trailing it; however, when the pick-up steered right to avoid the parked firetruck, the P85 failed to identify the next object in its path (the firetruck) and rammed straight into it. Luckily, the driver escaped with minor injuries, as stated in a detailed NTSB report.
Taking these failures of Tesla's system into account, the question remains whether the system alone is responsible.
Understanding Tesla’s Advanced Driver Assistance System (ADAS)
The ADAS technology inside a Tesla includes a number of features such as Autopilot, which in turn consists of traffic-aware cruise control (TACC), Autosteer and auto lane change. The car houses a forward-facing camera, radar and a number of ultrasonic sensors that evaluate the vehicle's surroundings.
Through extensive research, Tesla has trained deep neural networks that analyse raw camera images to perform semantic segmentation, object detection and monocular depth estimation. To turn the Autopilot dream into reality, Tesla builds its training data by aggregating each car's data over time, so the system can learn from complicated scenarios encountered by a fleet of more than one million vehicles. The ADAS also comes with other features such as forward collision warning, automatic emergency braking, lane assist, speed assist and auto parking.
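To make that pipeline more concrete, here is a minimal, purely illustrative sketch of a multi-task vision network that shares one backbone across semantic segmentation, depth and detection heads. Every class name, layer size and output format below is an assumption made for the example; this is not Tesla's actual architecture.

```python
# Toy multi-task perception model: one shared backbone, three heads.
# Illustrative only; all sizes and names are assumptions, not Tesla's network.
import torch
import torch.nn as nn

class ToyPerceptionNet(nn.Module):
    def __init__(self, num_classes: int = 10, num_det_outputs: int = 6):
        super().__init__()
        # Shared convolutional backbone over the raw camera image.
        self.backbone = nn.Sequential(
            nn.Conv2d(3, 32, kernel_size=3, stride=2, padding=1), nn.ReLU(),
            nn.Conv2d(32, 64, kernel_size=3, stride=2, padding=1), nn.ReLU(),
        )
        # Per-pixel class scores (semantic segmentation).
        self.seg_head = nn.Conv2d(64, num_classes, kernel_size=1)
        # Per-pixel distance estimate (monocular depth).
        self.depth_head = nn.Conv2d(64, 1, kernel_size=1)
        # Coarse per-cell box/objectness predictions (object detection).
        self.det_head = nn.Conv2d(64, num_det_outputs, kernel_size=1)

    def forward(self, image: torch.Tensor):
        features = self.backbone(image)
        return {
            "segmentation": self.seg_head(features),
            "depth": self.depth_head(features),
            "detection": self.det_head(features),
        }

# Example: one 128x128 RGB frame through the toy network.
outputs = ToyPerceptionNet()(torch.randn(1, 3, 128, 128))
print({name: tuple(t.shape) for name, t in outputs.items()})
```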
Along with equipping cars with these systems, Tesla has set out a number of guidelines on how to use them and on the scenarios in which they will engage.
TACC allows the vehicle to maintain a set speed based on information from the camera and detects whether there is a vehicle ahead. If a slower vehicle is detected, the car automatically adjusts its speed and maintains a distance from the vehicle in front. The owner's manual makes clear that TACC is a driver-assistance feature and does not replace an attentive driver.
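The following-distance behaviour described above can be sketched in a few lines of control logic. This is a toy illustration under assumed parameters (a 2-second time gap and a simple proportional gain), not Tesla's actual TACC implementation.

```python
# Toy traffic-aware cruise control: hold the set speed unless a slower lead
# vehicle is detected, then slow down to keep a time gap. Gains and gap
# values here are assumptions for illustration only.
def tacc_target_speed(set_speed_mps, lead_distance_m=None, lead_speed_mps=None,
                      time_gap_s=2.0, gain=0.5):
    """Return the speed the car should aim for on this control step."""
    if lead_distance_m is None or lead_speed_mps is None:
        # No vehicle detected ahead: cruise at the driver's set speed.
        return set_speed_mps
    # Distance we want to keep at the lead vehicle's current speed.
    desired_gap_m = time_gap_s * lead_speed_mps
    # Nudge speed up or down to close or open the gap,
    # but never exceed the driver's set speed.
    follow_speed = lead_speed_mps + gain * (lead_distance_m - desired_gap_m)
    return max(0.0, min(set_speed_mps, follow_speed))

# Lead car doing 20 m/s, 30 m ahead, driver's set speed 31 m/s (~112 km/h):
print(tacc_target_speed(31.0, lead_distance_m=30.0, lead_speed_mps=20.0))
```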
Moving on to Autosteer, the system is meant to keep the vehicle inside its lane and cannot operate without TACC being active; it uses the camera, radar and ultrasonic sensors to identify lane markings. For Autosteer too, Tesla provides warnings that the driver must keep their hands on the wheel and be ready to take over at any moment.
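In the same spirit, lane keeping of the kind Autosteer performs can be caricatured as steering proportionally towards the lane centre estimated from the detected lane lines. The function, gains and sign conventions below are assumptions made for illustration, not Tesla's control law.

```python
# Toy lane-centring controller: steer towards the middle of the lane and
# cancel heading error. Purely illustrative; gains are assumptions.
def autosteer_angle(left_gap_m, right_gap_m, heading_error_rad,
                    k_offset=0.2, k_heading=0.8):
    """Return a steering angle in radians (positive = steer right, by convention)."""
    # Positive when the car sits left of the lane centre, so it should steer right.
    centre_offset_m = (right_gap_m - left_gap_m) / 2.0
    # Steer towards the centre while cancelling any heading error.
    return k_offset * centre_offset_m - k_heading * heading_error_rad

# Car 0.4 m left of centre, pointing straight along the lane:
print(autosteer_angle(left_gap_m=1.4, right_gap_m=2.2, heading_error_rad=0.0))
```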
What Caused The Accidents
In the second accident, the one involving the firetruck, the NTSB found that the driver's hands were not on the steering wheel as advised by Tesla, which could have prevented the crash. On the technology side, the Autopilot system failed to react quickly to a changing situation. The Autosteer system, which works together with TACC, could only detect the pick-up truck ahead changing lanes, and it increased the car's speed because the distance between that vehicle and the Tesla had grown. Within a split second, it failed to recognise the next object in its path and crashed into it.
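To see why such a cut-out scenario is hard for a gap-keeping controller, the toy timeline below replays the sequence just described: while a lead vehicle is tracked, the car follows it; once that vehicle changes lanes and no new target is reported, the controller simply resumes the set speed. It mirrors the simplified TACC sketch shown earlier and is an illustration of the failure mode, not a model of Tesla's software.

```python
# Toy replay of the cut-out scenario. All numbers are assumptions.
def target_speed(set_speed, lead=None, time_gap_s=2.0, gain=0.5):
    """Gap-keeping target speed; cruise at set_speed when no lead is tracked."""
    if lead is None:
        return set_speed
    distance_m, speed_mps = lead
    follow = speed_mps + gain * (distance_m - time_gap_s * speed_mps)
    return max(0.0, min(set_speed, follow))

timeline = [
    ("trailing the pick-up",                 (40.0, 20.0)),
    ("pick-up pulling away",                 (50.0, 20.0)),
    ("pick-up changed lanes, no target",     None),
    ("firetruck ahead, never a target",      None),
]
for t, (event, lead) in enumerate(timeline):
    print(f"t={t}s  {event:35s} target speed = {target_speed(31.0, lead):.1f} m/s")
```

In this toy run the commanded speed climbs from 20 m/s while following to the full 31 m/s set speed once the lead vehicle disappears, even though a stationary obstacle sits directly ahead; nothing in the gap logic itself ever asks for braking.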
In the first accident, involving Huang's Model X, the system failed to identify the freeway exit lanes and rammed into the concrete barrier. This failure could be the result of the neural networks not being trained well enough to distinguish between diverging routes at an intersection point.
Furthermore, the car sped up and rammed the barrier because the system could not identify the barrier itself; for the same reason, the automatic emergency brakes were never applied. However, in a report covered by a media firm, the NTSB made a shocking revelation: Huang was playing a video game on his smartphone at the time. Data retrieved from the vehicle also showed that Huang applied no manual braking or steering input in the final moments before the crash. In short, Tesla's guidelines regarding the Autopilot system were ignored.
Final Word
At the beginning of this article, we questioned how any company can call a product automated when it still has to be partially controlled by humans. NTSB Chairman Robert Sumwalt made a similar point in his opening statement: “If you own a car with partial automation, you do not own a self-driving car. This means that when driving in the supposed ‘self-driving’ mode, you cannot read a book, you cannot watch a movie or TV show, you cannot text and you cannot play video games.”
Last but not least, autopilot systems that cannot operate fully autonomously are not ready to be deployed on the streets, because a human can become dependent on them without fully understanding the vehicle's dangers and limitations, as happened with these two drivers. A report published by Fortune on February 26, 2020, likewise stated that the real reason behind the accidents is not just the vehicle but driver distraction. Hence, it is best either to avoid Autopilot-equipped vehicles until they are fully autonomous, or to drive them with the same caution one would apply to a conventional vehicle.