On April 7, Elon Musk, Tesla's CEO, wrote on X: "Tesla Autopilot has saved many lives — the statistics are undeniable. Of course, that doesn't mean it's perfect." The statement comes as Tesla faces mounting scrutiny and a string of lawsuits over its self-driving technology.
However, Tesla has never released data that substantiates this claim. Critics argue that Musk is using the narrative to preemptively frame the lawsuits Tesla faces over accidents allegedly caused by its Full Self-Driving (FSD) software, portraying them as an unavoidable cost of progress.
Musk elaborated: "Even if we improve safety tenfold, potentially saving 90% of the one million people who die in car accidents annually, Tesla will still be sued for the 10% who died. The 90% who live will mostly not even know Tesla saved their lives. Still, it is the right thing to do." In other words, he sees legal challenges over the remaining fatalities as inevitable, no matter how many lives the technology saves.
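For readers checking the math: a tenfold improvement in safety is the same as avoiding nine in ten fatalities. A minimal sketch using only the figures from the quote (these are Musk's hypothetical numbers, not real crash data):

```python
# Sanity check of the arithmetic in Musk's quote; the only inputs are
# the figures he cites, not any real Tesla or NHTSA data.
annual_fatalities = 1_000_000   # "one million people who die in car accidents annually"
improvement = 10                # "improve safety tenfold"

remaining = annual_fatalities // improvement   # 100,000 still die
saved = annual_fatalities - remaining          # 900,000 survive

print(f"saved:     {saved:>9,} ({saved / annual_fatalities:.0%})")          # 900,000 (90%)
print(f"remaining: {remaining:>9,} ({remaining / annual_fatalities:.0%})")  # 100,000 (10%)
```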
Alongside his post, Musk reshared a viral story about a Model 3 using FSD mode that avoided a pedestrian who suddenly ran onto the highway, illustrating a perceived success of the system.
The only safety data Tesla publicly releases about FSD is its quarterly vehicle safety report, which compares the miles driven per accident with Autopilot or FSD engaged against the U.S. national average miles per accident published by the National Highway Traffic Safety Administration (NHTSA).
Independent researchers have criticized this comparison for years, citing at least four ways it biases the numbers in Tesla's favor. These discrepancies raise questions about the validity of Tesla's self-reported safety statistics.
The first discrepancy is road type. Autopilot and FSD are used primarily on highways, which are statistically the safest roads per mile traveled. The NHTSA baseline, by contrast, mixes highway driving with city streets, rural roads, and parking lots, where accidents per mile are far more frequent, skewing the comparison in Tesla's favor.
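To see why road mix matters, consider a toy calculation with entirely invented crash rates (nothing here comes from Tesla or NHTSA). Both hypothetical fleets are exactly as safe as each other on any given road type; only their share of highway driving differs:

```python
# Toy model with invented numbers; both fleets have IDENTICAL safety on
# each road type, only the mix of road types differs.
CRASHES_PER_MILE = {
    "highway": 1 / 2_000_000,  # hypothetical: highways are safer per mile
    "city":    1 / 400_000,    # hypothetical: city driving crashes 5x as often
}

def miles_per_accident(road_mix: dict[str, float], total_miles: float = 1e9) -> float:
    """Aggregate miles per accident for a fleet with the given road-type mix."""
    crashes = sum(total_miles * share * CRASHES_PER_MILE[road]
                  for road, share in road_mix.items())
    return total_miles / crashes

highway_heavy = {"highway": 0.9, "city": 0.1}  # hypothetical Autopilot-style usage
mixed_driving = {"highway": 0.5, "city": 0.5}  # hypothetical national-average mix

print(f"Highway-heavy fleet: 1 accident per {miles_per_accident(highway_heavy):,.0f} miles")
print(f"Mixed fleet:         1 accident per {miles_per_accident(mixed_driving):,.0f} miles")
# Output: ~1,428,571 vs ~666,667 miles per accident.
```

In this setup the highway-heavy fleet appears more than twice as safe purely because of where it drives, before any difference in vehicles or drivers is considered.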
The second discrepancy is vehicle age. Teslas are, on average, among the newest cars on U.S. roads. Newer models with modern passive safety, automatic emergency braking (AEB), and lane-keeping systems inherently have lower accident rates than the roughly 12-year-old average vehicle on U.S. roads, regardless of whether FSD is ever activated.
The third discrepancy is driver demographics. Tesla owners tend to be older, wealthier, and more urban than the general driving population in the U.S. This demographic group already exhibits a lower-than-average accident rate, further biasing the comparative safety data.
A driver takes both hands off the steering wheel in a Tesla with the automated driver-assistance system activated. *Photo: Tesla*
The final discrepancy is the definition of an accident. Tesla counts an accident only when an airbag or another pyrotechnic restraint deploys. The NHTSA standard, by contrast, is based on police-reported incidents, which include a large number of minor collisions that never trigger an airbag.
In reality, Tesla is comparing "accidents severe enough to trigger airbags" with "any police-recorded accident": two vastly different metrics. Safety researcher Phil Koopman and others have highlighted this as one of the largest discrepancies in Tesla's data and a significant methodological flaw.
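To make the size of this definitional gap concrete, here is another toy calculation with invented numbers (again, none of them come from Tesla or NHTSA; they only illustrate the direction of the bias). A single fleet drives the same miles, counted under two different accident definitions:

```python
# Toy model with invented numbers; one fleet, one set of miles, two
# accident definitions. Illustrates the direction of the bias, not its true size.
MILES = 1_000_000_000
SEVERE_PER_MILE = 1 / 3_000_000   # hypothetical airbag-deployment crashes
MINOR_PER_MILE  = 4 / 3_000_000   # hypothetical minor crashes (4x as common)

airbag_only   = MILES * SEVERE_PER_MILE                      # Tesla-style count
police_report = MILES * (SEVERE_PER_MILE + MINOR_PER_MILE)   # NHTSA-style count

print(f"Airbag-only definition:   1 accident per {MILES / airbag_only:,.0f} miles")
print(f"Police-report definition: 1 accident per {MILES / police_report:,.0f} miles")
# Output: 3,000,000 vs 600,000 miles per accident.
```

Identical driving looks five times safer under the stricter definition, and the road-type, vehicle-age, and demographic gaps all push the comparison in the same direction.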
Tesla does not publish data on disengagements, accident severity, miles driven by road type, or the specific methodology used to calculate its own accident rates. In contrast, Waymo publishes expert-reviewed safety comparisons with equivalent baselines for both human and autonomous driving on the same routes, demonstrating enough transparency for insurance companies like Swiss Re to conduct their own analyses of Waymo's fleet.
A more revealing aspect of Musk's post is his perspective on lawsuits. He frames them as inevitable: Tesla saves 90% of lives, but is sued by the families of the remaining 10%, viewing this as simply the cost of doing the right thing and advancing technology.
Tesla currently faces lawsuits—and has lost or settled some—for accidents where plaintiffs allege Autopilot or FSD actively contributed to the incident. The issue is not merely that the system failed to save the driver, but that it allegedly made errors a normal driver would not, or that Tesla's marketing strategy led drivers to over-rely on the system in situations beyond its capabilities.
My Anh (according to Electrek)
