Tesla is all set to launch its Full Self Driving (FSD) Beta 10.69 on August 20. A few days ago, Elon Musk, CEO and chief architect at Tesla, announced some major updates to FSD. He said that FSD Beta 10.13 would be released as version 10.69 and would carry several improvements. In the shareholders’ meeting, Musk assured that FSD would be widely available this year.
With Tesla’s AI Day due on September 30, expectations are mounting for the company to demonstrate the progress it has made on FSD. Tesla seems to have plenty of work at hand. After all, a product release is not a trivial affair, let alone one involving a cutting-edge product like FSD. However, a report last week on the failure of Tesla’s FSD technology started doing the rounds and has put a question mark over it.
Tesla’s latest FSD model put to test
The test was conducted in June 2022 at Willow Springs International Raceway in Rosamond, California, by a professional test driver on a professional driving course. Cruising at an average speed of about 40 kmph (25 mph), a 2019 Tesla Model 3 running FSD Beta 10.12.2 (the most up-to-date software version available at the time) failed to detect a child-sized mannequin on the road and repeatedly struck it at a speed that would likely have been fatal to a real child.
The results contradict Tesla’s long-standing claims that its FSD technology is safe. In 2021, Musk publicly claimed on multiple occasions that Tesla’s FSD technology is among the best in terms of safety standards.
In fact, in an interview with the Financial Times last year, Musk claimed to be the CEO who cared about the safety of the planet the most.
A dangerous software
Calling the FSD software “the most dangerous commercial software ever released onto public roads”, the Dawn Project has demanded that the product be taken off the market. It is asking people to sign a petition that may eventually push the US Congress to pull the software off the roads.
Tesla in a mess
The safety test result seems to have put Tesla in a difficult spot shortly before its big day.
Moreover, the National Highway Traffic Safety Administration (NHTSA) is undertaking two investigations into Tesla cars. In June, the NHTSA upgraded an ongoing preliminary investigation (PI) into Tesla’s Autopilot system to an engineering analysis (EA), covering about 830,000 Tesla cars across four models – the Model Y, Model X, Model S and Model 3.
The initial investigation was opened in August 2021, prompted by several crashes in which Tesla vehicles operating with Autopilot engaged struck stationary in-road or roadside first-responder vehicles tending to pre-existing collision scenes. The EA now seeks to explore the degree to which Autopilot and associated Tesla systems may exacerbate human-factor or behavioural safety risks by undermining the effectiveness of the driver’s supervision.
The NHTSA is undertaking another investigation into the “phantom braking” problem in Tesla cars. The problem cropped up after Tesla decided to do away with the forward-looking radar sensor in its Model 3 and Model Y, leaving those models to rely entirely on cameras for their Advanced Driver Assistance Systems. In the newer radar-less models, the automatic emergency braking function is triggered frequently by false positives, such as the shadow of a passing vehicle. The NHTSA received several complaints about the issue, which drove it to open the investigation.
The NHTSA also undertook a preliminary evaluation of Tesla’s Passenger Play feature, which allowed passengers to play games on the infotainment display while the car was in motion. Following the opening of the PI, however, Tesla issued a software update disabling the feature while the vehicle is in motion.
With several investigations underway, a test result of this sort seems to have left the Tesla house in disorder.
The “staged” test
A report of this sort has certainly given opponents of autonomous vehicles an opportunity to strengthen their voices. However, given Musk’s popularity and the anxiety surrounding FSD technology, the test has also been dismissed as staged propaganda.
Shortly after the Dawn Project released the safety test result, Fred Lambert, editor-in-chief of Electrek, claimed in a blog post that the driver had actually failed to activate FSD Beta during the test. In a footnote to the post, Lambert notes that since the controversy, the Dawn Project has released additional footage that does not appear in the ad and is inconsistent with the published test results.
In fact, a Twitter user even conducted a similar test and uploaded a video showing the Tesla detecting a cardboard cutout of a child and avoiding it every time.
The accusation that the ad campaign was “staged” gains weight given that the Dawn Project explicitly names Tesla’s FSD program as its primary campaign target. Its website states, “The first danger we are tackling is Elon Musk’s reckless deployment of unsafe Full Self-Driving cars on our roads.”