“We are in bad shape when it comes to transportation. We have these metallic objects travelling really quickly with really high kinetic energy. We are putting meat in the control system; it is quite undesirable. It fundamentally comes down to people not being too good at driving. They get into a lot of trouble,” said Andrej Karpathy, senior director of AI at Tesla, at the CVPR 2021 event.
He made a case against the human driver, referring to them as ‘meat computers’. Karpathy said humans drive in a tight loop with 1-ton objects at 80 mph, have a 250 ms reaction latency, need mirrors for situational awareness, and have to dodge distractions. Automation in transportation, on the other hand, offers a tight control loop, lower reaction latency (<100 ms), and 360-degree situational awareness. Such features could drastically reduce accidents and transportation costs.
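Those latency figures translate directly into distance covered before any reaction begins. A quick back-of-the-envelope calculation using the numbers quoted above (the arithmetic is ours, not from the talk):

```python
# Distance travelled before a reaction even starts, at the quoted speed and
# latencies (braking distance itself is not modelled here).
mph_to_ms = 0.44704            # metres per second in one mph
speed = 80 * mph_to_ms         # ~35.8 m/s

human_latency = 0.250          # 250 ms human reaction latency
machine_latency = 0.100        # <100 ms, taking the upper bound

human_gap = speed * human_latency      # ~8.9 m before a human reacts
machine_gap = speed * machine_latency  # ~3.6 m for the automated system
```

At 80 mph, the human covers roughly 9 metres before reacting, more than twice what the automated system would at its worst-case latency.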
Tesla’s approach
Talking about Tesla’s approach to autonomy, Karpathy said the team is working towards full self-driving (FSD) capability. Tesla founder Elon Musk has spoken about his grand ambitions in this direction in the past. In fact, in 2020, Musk claimed the company was close to achieving the basic requirements of level five autonomy, which requires no human driver input (Tesla, however, later walked this back, saying the company is still at level 2).
Autopilot is Tesla’s suite of advanced driver-assistance technology offering features including lane centring, traffic-aware cruise control, self-parking, semi-autonomous navigation of roads, automatic lane changes, etc.
Hardware: The team is building the silicon chips that power the self-driving software from the ground up. The goal is to optimise architectural and micro-architectural aspects to squeeze maximum performance per watt out of the silicon. Other functions include floor-planning, power analysis, writing robust tests and scoreboards to verify functionality, implementing compilers and drivers to communicate with the chip, and productionising the design.
Neural networks: Tesla applies deep neural network-based solutions to problems ranging from perception to control. The full build of the Autopilot neural network involves 48 networks trained over 70,000 GPU hours. Together, these networks output 1,000 distinct predictions at each timestep. Tesla Vision is one application of these deep neural networks; it deconstructs the car’s environment at a far greater level of detail than classical vision processing techniques.
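One way a single system can emit many distinct predictions per timestep is a shared backbone feeding multiple task heads, so expensive features are computed once and reused. A minimal NumPy sketch of that idea (the layer sizes, head names, and output counts are hypothetical, not Tesla’s actual architecture):

```python
import numpy as np

rng = np.random.default_rng(0)

def relu(x):
    return np.maximum(x, 0)

class MultiHeadNet:
    """A shared backbone whose features feed several per-task heads."""
    def __init__(self, in_dim, hidden_dim, head_dims):
        self.W_backbone = rng.standard_normal((in_dim, hidden_dim)) * 0.1
        self.heads = {name: rng.standard_normal((hidden_dim, d)) * 0.1
                      for name, d in head_dims.items()}

    def forward(self, x):
        features = relu(x @ self.W_backbone)  # shared features, computed once
        return {name: features @ W for name, W in self.heads.items()}

net = MultiHeadNet(in_dim=64, hidden_dim=32,
                   head_dims={"lane_lines": 8, "traffic_lights": 4, "depth": 16})
outputs = net.forward(rng.standard_normal(64))
total_predictions = sum(v.size for v in outputs.values())  # 8 + 4 + 16 = 28
```

Adding a new prediction task then means adding a head, not retraining a whole separate network, which is one plausible reason an ensemble of 48 networks can serve so many outputs.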
The Hardware 3 onboard computer runs the Tesla-developed neural network and can process more than 40 times the data of previous-generation systems. This provides a view the human driver alone cannot access: seeing in different directions simultaneously and on wavelengths beyond human senses.
Last year, Musk announced the company was developing a neural network training computer, Dojo, to process vast amounts of video data. At the CVPR 2021 event, Karpathy introduced a precursor to Dojo that he touted as one of the world’s fastest supercomputers. He said the unnamed supercomputer has ‘720 nodes, each powered by eight of Nvidia’s A100 GPUs (the 80GB model), for a whopping 5,760 A100s throughout the system’.
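The headline figures can be sanity-checked with simple arithmetic (the aggregate-memory total is our own derivation from the quoted specs, not a number from the talk):

```python
nodes = 720
gpus_per_node = 8
gpu_memory_gb = 80  # the 80GB A100 model quoted above

total_gpus = nodes * gpus_per_node                      # 720 * 8 = 5,760
total_gpu_memory_tb = total_gpus * gpu_memory_gb / 1000  # ~460.8 TB of GPU memory
```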
Navigation: The Navigate feature on Autopilot helps in optimising the route, making adjustments so that the vehicle is automatically steered towards highway interchanges and exits based on the destination. The Autosteer+ feature helps in navigating complex roads using advanced cameras, sensors, and computing power. Smart Summon helps in manoeuvring the vehicle around obstacles to find the right parking spot.
Vision system: Karpathy said Tesla’s vision system is one of the best in the business. The cameras already do most of the work in terms of perception, and the company is now planning to ditch its radar sensors and move to a vision-only approach.
Evaluation infrastructure: Tesla builds open- and closed-loop, hardware-in-the-loop evaluation tools and infrastructure at scale to drive innovation, track performance improvements and avoid regressions. Tesla’s systems leverage anonymised characteristic clips from the fleet and integrate them into large suites of test cases.
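The core of such a regression-avoidance loop is simple: a candidate model must not score worse than the current baseline on a fixed suite of logged clips. A toy sketch of that gate, with made-up models and integer “clips” standing in for real fleet data:

```python
# Hypothetical regression gate: each "clip" pairs recorded inputs with an
# expected label, and a candidate model is only accepted if its accuracy
# does not fall below the baseline's by more than a tolerance.

def accuracy(model, clips):
    correct = sum(1 for inputs, label in clips if model(inputs) == label)
    return correct / len(clips)

def regression_gate(candidate, baseline, clips, tolerance=0.0):
    return accuracy(candidate, clips) >= accuracy(baseline, clips) - tolerance

clips = [(x, x % 2) for x in range(100)]       # stand-in for fleet test clips
baseline = lambda x: x % 2                     # current model: perfect here
candidate = lambda x: 1 if x == 0 else x % 2   # new model: wrong on one clip
```

With a zero tolerance the candidate above is rejected (it regresses on one clip); widening the tolerance lets it through, which is the knob such infrastructure exposes.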