Tesla Autopilot might seem like it is putting the future in the hands of its drivers… but could these advancements be making the roads more dangerous instead?

A new report by the National Highway Traffic Safety Administration (NHTSA) found that Tesla’s iconic self-driving systems…

Are not as reliable as the company would have us believe.

Tesla Autopilot Investigations

After extensive investigations into the company…

The NHTSA discovered that Tesla’s Autopilot system had been involved in at least 13 fatal crashes.

Crashes that resulted in one or more deaths and serious injuries…

In which “foreseeable driver misuse of the system played an apparent role.”

Tesla’s Autopilot system gives the vehicle partial self-driving capability. It allows the car to steer and brake within its own lane, and the more advanced version can change lanes on the highway. However, under this system, drivers still need to stay engaged with the driving…

AND THE AUTOPILOT DOES NOT MAKE THE VEHICLES FULLY AUTONOMOUS. 

In 2021, the NHTSA launched an investigation into the Autopilot system…

And subsequently opened a follow-up investigation when Tesla issued its largest-ever recall, calling back 2.03 million vehicles to update the system.

That follow-up investigation found that the recall did not do enough to fix the issues within the self-driving system.

TESLA ITSELF STATED IN DECEMBER THAT THE AUTOPILOT SYSTEM “MAY NOT BE SUFFICIENT TO PREVENT DRIVER MISUSE”.

And this is not the first time Tesla has caught flak for the dangers of its self-driving software. During the Super Bowl, ads aired calling for a full Tesla boycott over this technology.

The Future Of Self-Driving

Since 2016, the NHTSA has opened more than 40 special crash investigations into Tesla vehicles over their Autopilot features.

Another issue the agency takes with the company is that the name “Autopilot” is incredibly misleading…

As it can lead people to believe the system is fully autonomous.

That concern is borne out by the NHTSA’s finding that drivers are not as careful as they should be with the software:

“ONE OF THE THINGS WE DETERMINED IS THAT DRIVERS ARE NOT ALWAYS PAYING ATTENTION WHEN THAT SYSTEM IS ON,”

Tesla has since enacted new protocols to prevent driver misuse of the software, but critics argue they’re not enough.

Regardless, the endless probes into Tesla’s Autopilot system don’t inspire confidence among new buyers…

Especially since the company is recalling vehicles left and right. The latest Tesla release, the Cybertruck, has already been recalled for an issue with its accelerator pedal.

Hopefully, the investigations and seemingly endless recalls will finally make Tesla open its eyes…

And understand that automation is fun, but when it comes to driving…

The safety of its customers comes first.

Be Great, 

GCTV Staff

Disclaimer: This content is intended to be used for educational and informational purposes only. Individual results may vary. You should perform your own due diligence and seek the advice of a professional to verify any information on our website or materials that you are relying upon if you choose to make an investment or business decision. Investment, real estate, and business involve great risk and there is no guarantee of performance or results. We are not attorneys, investment advisers, accountants, tax professionals or financial advisers and any of the content presented should not be taken as professional advice. We recommend seeking the advice of a financial professional before you invest, and we accept no liability whatsoever for any loss or damage you may incur.