Tesla’s self-driving tech apparently has trouble not hitting little kids
A Tesla ran over a test dummy every chance it got in a new test.
Teslas are among the most technologically advanced vehicles on the road, with their electric powertrains and self-driving features. But it turns out that Tesla’s Full Self-Driving (FSD) tech has trouble not running over kids in its path.
A recent video from Dan O’Dowd, founder of The Dawn Project, highlights an apparent flaw in Tesla’s FSD technology. The video shows a Tesla in FSD mode consistently failing to avoid a toddler-sized mannequin in its path.
It is important to note that this was a small number of tests, so the results should be taken with a grain of salt.
“The FSD software may be the most dangerous commercial software ever released onto public roads into the hands of over 100,000 untrained ‘beta’ test drivers,” reads the official document (PDF) from The Dawn Project’s test of Tesla’s FSD.
Running over fake kids is certainly not the only criticism FSD has drawn. In fact, California’s Department of Motor Vehicles recently filed a complaint alleging that Tesla has misrepresented the capabilities of FSD and Autopilot.
An earlier report from June found that Tesla vehicles accounted for the majority of reported self-driving-related crashes in 2021.
It will be interesting to see how much traction The Dawn Project’s ad campaign gains in the coming weeks. Clearly, there is still a lot of work to be done to make FSD safer and more reliable.