Tesla rolls back its latest ‘Full Self Driving’ beta ‘temporarily’

Tesla’s decision to test its advanced “Full Self Driving” driver assistance software with untrained vehicle owners on public roads drew close scrutiny and criticism, and that was before this latest release.

Version 10.3 started rolling out to testers Saturday night / Sunday morning with a long list of release notes. The list mentions changes beginning with the introduction of driver profiles that can swap between different characteristics for following distance, rolling stops or exiting passing lanes. It is supposed to better detect the brake lights, turn signals and hazard lights of other vehicles, as well as reduce false slowdowns and improve how it offsets for pedestrians.

However, on Sunday afternoon, Elon Musk tweeted that Tesla was “seeing some issues with 10.3,” so it is “rolling back to 10.2 temporarily.”

(As always, to be clear: this software doesn’t make Tesla’s cars fully autonomous. Tesla CEO Elon Musk has even said he believes the “full-feature” version of the software his company calls “Full Self-Driving” will, at best, only be “likely” to drive someone from their home to work without human intervention, and will still require supervision. That does not describe a fully autonomous car.)

While several drivers have already shared videos and impressions of their experience with the release – whether or not that’s what Tesla wants participants posting on social media – testers say the rollback update completely removes the FSD beta capabilities from their cars.

While several posters indicated that update 10.3 introduced phantom forward collision warnings (FCW), other issues noted included a disappearing Autosteer option, problems with Traffic-Aware Cruise Control (TACC) and occasional Autopilot panic. It’s unclear how common these issues are and which, if any, caused the rollback, but Musk responded to a tweet about the Autosteer and TACC issues, saying that the company is working on them.

If this is a common problem within the test group, phantom FCW events alone would be serious enough to warrant a rollback. In 2019, Mazda recalled the Mazda3 to fix issues with its smart braking system falsely detecting objects in the car’s path. If another car is following close behind, a vehicle suddenly braking for no reason – as several social media posts describe – could easily cause an accident. Another problem for testers: several said the bogus FCW incidents lowered their Tesla-calculated “safety score” enough that they may not be able to stay in the beta.

For anyone concerned about becoming an unwilling participant in the test simply by existing near a Tesla running work-in-progress software, this could be a sign that the company fixes issues quickly, or an example of how dangerous the experiment is. On the other hand, for Tesla owners hoping the test will expand to people with lower safety scores, hacker @greentheonly tweeted: “To those of you with low scores who are waiting for FSD: don’t. Imagine the car drives the way the app requires you to drive – that would be awful, wouldn’t it? But the car drives even worse! It’s seriously unusable in any kind of traffic and on narrow roads; the videos don’t do it justice.”