Tesla's Full Self-Driving Software Is A Mess. Should It Be Legal?
Whether that’s achievable remains to be seen, but an assessment by Forbes of the latest version of FSD found that it remains error-prone. During a 90-minute test drive in Los Angeles, on residential streets and freeways, the 2024 Model Y with Tesla’s latest hardware and software (Hardware 4, FSD version 13.2.9) ignored some standard traffic signs and posted speed limits; didn’t slow at a pedestrian crossing with a flashing sign and people present; made pointless lane changes; and accelerated at odd times, such as while exiting a crowded freeway with a red light at the end of the ramp. There’s also no indication the company has fixed a worrisome glitch identified two years ago: failing to stop for a flashing school bus sign indicating that children may be about to cross a street.
In fact, there are so many easy-to-find problems with the feature, recently redubbed “Full Self-Driving (Supervised),” that it raises a question: Why is the $8,000 feature, or a $99-a-month subscription, even legal in its current form?
Turns out, there’s a simple answer: “Driving-assist systems are unregulated, so there are no concerns about legality,” said Missy Cummings, a George Mason University professor and AI expert who has advised the National Highway Traffic Safety Administration on autonomous vehicles. “NHTSA has the authority to step in, but up to now they’ve only stepped in for poor driver monitoring.”
Road Closed and Do Not Enter Signs
Nine years after the first Tesla owner in the U.S. was killed in an accident while using the company’s Autopilot driver-assist software – Joshua Brown in May 2016 – U.S. regulators have yet to set clear rules for so-called Level 2 driver-assist technology – the category FSD falls into – beyond requiring in-cabin monitoring to ensure human drivers pay attention to road conditions. This loophole creates an opportunity for Musk to promote FSD, first rolled out in 2020, as a feature that provides virtually autonomous driving with “minimal” human intervention, according to the in-car display. And because broader adoption of FSD is a metric in his future compensation package, its success would be massively lucrative for him.
NHTSA, which opened an investigation last month into Tesla’s failure to report FSD and Autopilot accidents in a timely manner, said it “does not pre-approve new technologies or vehicle systems.” Instead, it’s up to carmakers to certify that vehicles and technologies meet federal safety standards. If an investigation finds a system to be unsafe, “NHTSA will take any necessary actions to protect road safety,” a spokesperson said.
Musk’s comments about FSD’s capabilities aren’t nuanced: “Tesla self-driving massively improves your quality of life and safety for the thousands of life hours you’re in a car.” The company also promotes the system with videos showing seemingly flawless performance on road trips. Yet Tesla is more circumspect in what it tells NHTSA. “Tesla describes its systems as Level 2 partial automation (including ‘FSD Supervised’), requiring a fully attentive driver who is engaged in the driving task at all times,” the agency said.
Scrutiny of Tesla’s technology is increasing after legal setbacks, including the launch of a federal class-action lawsuit by Tesla owners over Musk’s exaggerated FSD and Autopilot claims and efforts by California’s DMV to bar the company from using those names for the features in that state. The jury in a federal trial in Florida also determined last month that Tesla was partially responsible for a fatal 2019 crash that occurred while its Autopilot feature was engaged, ordering it to pay $243 million in damages. The company is appealing the verdict; it settled two other lawsuits last week over crashes in California linked to Autopilot, which has more limited driving-assist features than FSD.
The Austin-based company is also operating a pilot robotaxi project in its home city for a limited number of Tesla owners, who are charged a nominal fee to use it. The version of FSD it uses is slightly modified from the one offered to customers, though Tesla hasn’t explained the differences in detail. Since launching the service in June, Tesla has run into a number of problems, including reports of three crashes in one day in July.
Unlike Waymo, which operates its robotaxis in full autonomous mode in multiple U.S. cities, Tesla currently keeps a safety driver in the front seat to monitor FSD’s performance.
“This is an alpha-level product. It should never be in the customer's hands. It's just a prototype. It's not a product.”
There have been 59 fatalities from crashes involving the use of Autopilot and FSD, according to a running tally by TeslaDeaths.com compiled from news reports.
Musk and Tesla didn’t respond to a request for comment on FSD’s safety issues, but an updated version of the software is expected soon.
Flashing Pedestrian Sign
Forbes recently did a 90-minute assessment of Tesla’s most recent FSD, updated on Aug. 28, in a 2023 Model Y with Tesla’s latest hardware. The vehicle was owned by the Dawn Project, an organization created by software developer Dan O’Dowd, who in recent years has become a vocal critic of FSD, even spending his own money on Super Bowl commercials to raise awareness of its flaws.
In its current form, “this is not even a beta system. This is an alpha-level product. It should never be in the customer's hands,” O’Dowd told Forbes. “It's just a prototype. It's not a product.”
FSD, at times, can feel like an autonomous driving system. Punch in an address, and it takes off with no problems in hands-free driving mode. It signals when making a turn, stops at stop signs and traffic lights, and is generally observant of its surroundings, based on virtual images displayed on the center console. Yet the frequency of errors, even in light traffic and optimal weather conditions, particularly on urban streets, means a human behind the wheel must be highly attentive and ready to take control. Because of that, a mindful driver won’t find using it any simpler or more relaxing than driving themselves.
Two years ago, O’Dowd’s Dawn Project conducted a simple test of FSD involving a parked school bus. As the Tesla approached, a lighted stop sign on the side of the bus flipped out, alerting drivers to stop so children could safely pass. In the original test, the Tesla failed to stop in every run, not even slowing when a child-sized mannequin crossed its path at the front of the bus. This month Forbes replicated the test and the Tesla failed again; it didn’t stop for the warning sign and it once again ran down “Timmy,” the Dawn Project’s mannequin.
Shortly afterward, we repeated the test with a Waymo summoned directly from the app. The car stopped for the sign, didn’t move until the sign was retracted, and did not run over Timmy.
School Bus Stop Sign And Timmy
“We reported the Tesla school bus thing over two years ago and they didn't do anything about it,” O’Dowd said. “We put it in the New York Times in a full-page ad. Then we did a Super Bowl commercial showing it. … But [Tesla] didn't do anything about it.”
And it’s not just flashing school bus signs. FSD also doesn’t appear to stop at train tracks when the crossing gate comes down and lights are flashing, according to some owners. Reports of disturbing glitches are not uncommon. One owner said his Tesla Model 3 with FSD version 13.2.9 suddenly stopped midway through executing an unprotected left turn, with an oncoming vehicle headed for him.
“I have this car bearing down on me at about 45mph, and I'm stopped, hanging out halfway in its lane,” according to a Tesla Motors Club post. “I fairly quickly pressed the accelerator to override FSD when I got my next surprise. The car quickly moved... in reverse!” Fortunately, that allowed the car to avoid the oncoming car and there was no one behind the owner, who was rattled by the experience.
In its evaluation, auto reviewer and researcher Edmunds found improvements in FSD’s software, but also “annoying” problems. “It won't make slight adjustments within the lane to avoid objects in the road, such as roadkill, blown-out tire debris or potholes,” Edmunds said. “That is not great, but it's made worse because FSD doesn't like you taking control to make those corrections either. Turn the steering wheel just a bit too much to avoid an object in the road and the entire system abruptly turns off.”
Edmunds doesn’t recommend that customers pay $8,000 for FSD in its current form.
FSD’s issues avoiding road debris were more than annoying for two Tesla influencers trying to complete Musk’s promise of having a vehicle drive autonomously across the U.S., first made in 2016. About 60 miles after setting out from San Diego, their new Model Y failed to avoid a large metal object in the middle of their highway lane, causing severe damage to the vehicle’s underbody, according to a video posted by one of the influencers, Bearded Tesla Guy.
Neither Consumer Reports nor the Insurance Institute for Highway Safety, whose assessments are heavily influential on the auto industry and regulators, has published detailed evaluations of FSD, though a Consumer Reports editor said on a recent podcast that his experience with FSD was reasonably good. IIHS, however, rates Tesla’s driver-alert features for FSD as poor.
NHTSA has the authority to issue a “stop sale” notice to automakers in the event it finds significant safety problems for specific models or features. In 2016, it sent a “special order” letter to startup Comma.ai over an after-market product the company was selling to partially automate driving for certain auto models. That led Comma to halt plans to sell the device at the time.
NHTSA should do something similar with FSD, according to O’Dowd, though it’s unclear whether it will. For now, the Dawn Project is providing public officials, such as California Attorney General Rob Bonta and U.S. Representative Salud Carbajal, with driving evaluations like the one conducted for Forbes to demonstrate the feature’s shortcomings. O’Dowd’s company, Green Hills Software, a long-time supplier of software to the Defense Department and aircraft manufacturers, funds the Dawn Project. He declined to detail its current budget, but said it’s “substantial.”
“A drug company wouldn’t call something a universal, full cancer cure when it didn't actually cure cancer. No one would do that. You would be sued into the ground. You'd be thrown in jail and they'd take everything away from you,” O’Dowd said. “But [Musk] does it every day because no one in the government will take action. No regulators will take action at this point. That's kind of what we're here for.”
Mark Rosekind, former chief safety officer for robotaxi developer Zoox and the NHTSA administrator in 2016 when the first fatal Tesla Autopilot crash occurred, thinks a combination of new regulations for technology like FSD and validation by expert outside entities is needed.
“If you really want safety, transparency and trust in autonomous vehicle opportunities, you're going to need to strengthen and enhance the federal structures with really innovative approaches,” he told Forbes. That should include “complementary third-party independent, neutral programs …certain requirements that they go through to demonstrate the safety. Transparency that will build trust.”
For now, autonomous vehicle researcher Cummings sees a factor that may mitigate the shortcomings of Tesla’s tech. “The one really good thing about how bad FSD is that most people understand it is terrible and watch it very closely,” she said.