News
Tesla calling its driver-assist feature “self-driving” is misleading, says new report
Please don’t put full trust in your Tesla.
So far, 2020 has been a great year for Tesla, with its stock continuing to reach new heights. The company even managed to make it through the coronavirus pandemic relatively unscathed. Then it rolled out the latest build of its Autopilot mode, and suddenly its road to international stardom doesn't look so straightforward.
The first sign came in July of this year, when a German court ruled that Tesla's use of the term “Autopilot” for its driver-assist feature is misleading. The Munich Regional Court opened the case after a complaint from Germany's Central Office for Combating Unfair Competition. After reviewing the evidence, the court concluded that Tesla's models couldn't complete a journey on their own without the assistance of a human driver. That was the case even with the latest upgrade and with “Full Self-Driving Capability” included in the test runs.
Fast forward to September, and Tesla is again accused of misusing a term that implies complete car autonomy but fails to deliver it. First came Consumer Reports' testing, which found a series of inconsistencies in Tesla's Autopilot mode. The outlet ran extensive tests in various conditions, carried out by experienced car testers.
Then another study came to light, conducted by the American Automobile Association (AAA), which also scrutinized Tesla for misleading consumers with the term Autopilot. The study concluded that what we name the systems and programs in our cars matters a great deal.
The study was conducted in Washington, D.C. with 90 participants. At the start, they were told they could choose between two driver-assistance systems: one named DriveAssist, the other AutonoDrive. Both names were, of course, invented for the study. The researchers used a disguised Cadillac equipped with GM's Super Cruise driver-assist system, which lets drivers take their feet off the pedals and their hands off the steering wheel.
More than 40% of the participants who used the system under the AutonoDrive name thought the vehicle would act on its own to avoid a crash. By contrast, just 4% of the participants who drove the car under the DriveAssist name thought it would act on its own to avoid crashing into another vehicle.
So far, the public has witnessed three fatalities in which Tesla drivers appear to have overestimated their vehicles' capabilities. The first happened in 2016, but after investigating the case, federal agencies cleared Tesla. The other two are still under investigation.
What do you think? How do you feel about self-driving systems in vehicles? Let us know down below in the comments or carry the discussion over to our Twitter or Facebook.