Tesla North America's official X account re-shared a video interview with a Cybertruck owner on March 29, promoting the company's Full Self-Driving (FSD) feature.
In the video, a man named Ricky explained that his vision was declining, prompting him to visit an ophthalmologist to discuss his ability to continue driving. The doctor, who also owns two Teslas, told Ricky he needed a car with Tesla's full self-driving software. The doctor then arranged a test drive for Ricky and even met him on a weekend to instruct him on how to use the system.
Ricky (right) purchased a Tesla Cybertruck electric pickup truck on his ophthalmologist's advice. *Screenshot*.
During the test drive, Ricky said, the car drove itself for 1.5 hours on three different routes, and he "did not touch the steering wheel at all". That experience convinced him to buy the Cybertruck.
However, Electrek called Tesla North America's promotion of this testimonial reckless. Tesla itself classifies Full Self-Driving as a Level 2 driver assistance system, which requires drivers to monitor it and remain responsible for the vehicle at all times.
Tesla's support page for Full Self-Driving (Supervised) mode explicitly states that the system is Level 2 partial automation under the Society of Automotive Engineers (SAE) classification. This means the driver must remain focused and engaged at all times and is responsible for the vehicle whether or not the mode is active. According to Tesla's own documentation, the system "does not make the car fully autonomous".
Automated driving modes can breed complacency, even among experienced drivers. Raffi Krikorian, Mozilla's chief technology officer and former head of Uber's self-driving car division, who has built self-driving cars himself and received safe-driving training, crashed his Tesla Model X while using Full Self-Driving mode; the system's near-perfect performance had made him overconfident. Research indicates drivers need 5-8 seconds to regain focus after an automated system hands back control, but emergencies unfold faster than that.
A Tesla driver sleeps while the car self-drives on a US highway on the afternoon of March 8. *Video: KTLA 5*.
The National Highway Traffic Safety Administration (NHTSA) has expanded its investigation into Full Self-Driving to cover 3.2 million Tesla vehicles. The agency is also examining more than 80 traffic violations linked to the feature, and Tesla has struggled to provide accident data. One Cybertruck owner has filed a lawsuit alleging that Full Self-Driving caused a crash. Meanwhile, Tesla's CEO has said drivers using Full Self-Driving should be allowed to text while driving, even as the company keeps the system at Level 2 to limit its legal liability.
Not all users fully recognize the risks of relying on the technology, entrusting it with their own safety and that of others on the road. There have been numerous instances of drivers letting their cars self-drive while they sleep, or even moving to the back seat to record videos, and some of these cases have ended in accidents.
My Anh
