Tesla Pulls Full Self-Driving Beta Over "Issues"

Tesla pulled its Full Self-Driving (FSD) Beta off the table over the weekend, with CEO Elon Musk saying testers were "seeing some issues with [version] 10.3."

To resolve the problem, the company temporarily reverted to FSD 10.2, Musk announced on social media Sunday morning. By the next day, he was already promising that version 10.3.1 would be released to address the issues that surfaced during the very short public testing window.

"Note that this is to be expected with beta software," the CEO added. "It's a public beta because it's not possible to test every hardware configuration under all conditions with internal QA."

Before getting into what actually happened, let's get a few things out of the way. Tesla has been promising vehicular autonomy for years via its expensive Full Self-Driving suite, yet FSD may never truly live up to its name. Elon Musk has stated that even the feature-complete version will probably always require some form of supervision. At best, that would leave the finished suite at conditional automation (SAE Level 3), well short of its promise of fully autonomous driving (SAE Level 5).

If it had been teleported here directly from 1991, even the worst version of FSD would be a technical marvel. But we are living in the years after the automotive industry promised that self-driving cars would be commonplace by 2020. It is also becoming clear that the trade-offs of deploying unfinished versions of these systems may not be worth it. Manufacturers are ramping up driver-monitoring protocols, including cabin-facing cameras that track eye and facial movements, in what was supposed to represent the flowering of the luxury car experience.

There are also numerous legal questions about who is responsible when a self-driving car is involved in a collision. Despite reports highlighting the shortcomings of advanced driving aids, the industry is keen to hold drivers accountable as a way of shielding itself from legal action against its business. That has likewise encouraged the influx of monitoring hardware and made it essential for autonomous driving systems to be nearly flawless before release. The latter has proven very difficult to achieve.

Given the limited amount of time the FSD beta was active, it's hard to pin down exactly what went wrong. However, the many videos of citizens testing the system show some recurring issues. Beta vehicles clearly still had trouble handling construction zones and poorly marked lanes. Users also noted that the cars had become more timid when using familiar features, requiring the driver to retake control under conditions that previously posed no problem. There were also posts on social media claiming that cars had decided to disable certain safety settings without any human input.

Reuters reported that drivers were experiencing forward-collision warnings when there was no imminent danger, and that Teslas may have braked automatically to avoid phantom obstacles. Ironically, automatic braking was one of the systems other users claimed their vehicles had mysteriously disabled.

This may explain why Tesla required beta testers to earn a safety score to qualify. But I'm not convinced this is the best way for an automaker to run a software development program. Being denied access to a system you paid thousands extra for (and that doesn't work) because you failed to drive in a ridiculously conservative manner somehow doesn't seem fair.

[Image: Tesla]

Become a TTAC Insider. Get the latest news, features, TTAC takes, and everything else that gets to the truth about cars first by subscribing to our newsletter.
