Traffic surveillance footage shows a Tesla Model S changing lanes and then braking hard in the left-most lane of the San Francisco Bay Bridge, causing an eight-vehicle rear-end collision in November. The driver told police he was using Tesla’s new “Full Self-Driving” (FSD) feature, and that claim has now been confirmed by data released by the federal government on Tuesday.
Tesla fans claim that the Autopilot and Full Self-Driving driver assistance systems are better than human drivers. Although it is still in beta, Full Self-Driving can cope with a wide range of traffic situations and sometimes behaves flawlessly for hours at a time. This leads people to believe the system is good enough for everyday driving. They become careless, their attention slips, and at the worst possible moment they get a stark reminder that Full Self-Driving is in beta for a reason: it is still under development.
When this happens, the result can be tragic, because the human driver often does not have enough time to react to a dangerous situation that the FSD Beta software cannot handle.
That may have been the case with the November pile-up on eastbound I-80 on the San Francisco Bay Bridge. The driver told police he was using Tesla’s new “Full Self-Driving” (FSD) feature, the report noted, adding that the car’s left turn signal came on, its brakes engaged, and it moved into the left lane, slowing to a stop directly in the path of the second vehicle. The Tesla had pulled into the fast lane before braking abruptly, and the unexpected deceleration set off an eight-vehicle rear-end collision.
The child injured in the crash was a 2-year-old boy who suffered a scratch to the left back of his head and a bruise, according to the detailed crash report. In a photograph from the accident, a stroller is parked in front of the car in which the child was injured.
A flurry of reports has surfaced in recent months in which Tesla drivers have complained of sudden phantom braking when the vehicle is at high speeds, nearly resulting in crashes in many cases. More than 100 such complaints were filed with NHTSA in three months, according to the Washington Post.
Drivers tend to blame FSD for their own mistakes after an accident, but this time the driver’s account was borne out by data released by the federal government on Tuesday. According to the investigation report cited by CNN, the controversial driver assistance software was activated about 30 seconds before the crash. The data also shows that the car abruptly slowed to 11 km/h (about 7 mph), a dangerous maneuver in fast-moving traffic.
It is unclear what causes the phantom braking, and Tesla has yet to diagnose and fix it. Tesla has stripped its cars of all sensors other than cameras, which could be the root cause. After all, even humans experience optical illusions, however rare. Certain conditions, such as a fast-moving shadow passing over the camera, can lead the system to believe there is an object in front of the car and initiate braking.
NHTSA is already investigating hundreds of complaints from Tesla drivers, some describing near-misses and fears for their safety. The agency has yet to take action against Tesla, however, and the investigation is dragging on. Either way, analysts expect the findings from the San Francisco pile-up to push NHTSA toward a resolution.
The term “Full Self-Driving” criticized by other car manufacturers
The term “Full Self-Driving” has been criticized by other manufacturers and industry groups as misleading and even dangerous. Last year, Waymo, the self-driving technology company owned by Google’s parent Alphabet, announced it would no longer use the term “self-driving” to describe its technology.
“Unfortunately, we find that some automakers use the term ‘self-driving’ inaccurately, giving consumers and the general public a false impression of the capabilities of (not fully autonomous) driver assistance technology,” Waymo wrote in a blog post. “That false impression can lead someone to unknowingly take risks (such as taking their hands off the steering wheel) that could endanger not only their own safety but also that of the people around them.”
While Waymo doesn’t name any names, the statement appears to be clearly aimed at Musk’s controversial decision to use the term Full Self-Driving.
Similarly, the leading self-driving car lobby group recently changed its name from the Self-Driving Coalition for Safer Streets to the Autonomous Vehicle Industry Association. The change, the industry group said, reflects its commitment to “accuracy and consistency in how industry, policymakers, journalists and the public talk about self-driving technology.”
Transportation Secretary Pete Buttigieg has also criticized emerging driver assistance technologies, saying they have not replaced the need for an alert human driver: “I keep saying this: all you can buy on the market today is driver assistance technology, not driver replacement technology,” Buttigieg said. “I don’t care what it’s called. We need to make sure we’re crystal clear on this, even if companies aren’t.”
While the language may evolve, there are still no federal restrictions on testing self-driving vehicles on public roads, although some states have imposed limits. Tesla has not announced any changes to the program or its branding, and the Bay Bridge crash was one of several that month. Just days earlier, on November 18 in Ohio, a Tesla Model 3 crashed into a stationary Ohio State Highway Patrol SUV that had its hazard lights flashing. That Tesla is also believed to have been in Full Self-Driving mode, and the crash is likewise under investigation by NHTSA.
Source: CNN
And you?
How do you read this? What are the potential implications for Tesla?