Tesla Autopilot, dangerous or a life saver?

Tesla Autopilot has been making headlines for some time, and recalls are even on the table as a potential outcome.

According to the media, Autopilot mode on Tesla cars is not safe to trust while driving. A fatal accident occurred on May 7th involving a Tesla Model S that was in Autopilot mode at the time of the crash. When the information was publicly shared on June 30th, it drew mixed reactions about the technology used by Tesla and its safety.

Image: Tesla Model S, a zero-emissions electric car.

Background on Tesla Autopilot

To give you some background, Tesla Motors has grossed over $2 billion from sales of its latest Tesla Model S and Model X cars. These vehicles are equipped with assisted driving and Autopilot, although the company clearly states that the driver should remain aware of the road and fully ready to take over should a discrepancy arise. This fact is often overlooked by the mass media, many of whom seem to have a deep dislike of the company’s entrepreneurial leader, Elon Musk, a man who often challenges the establishment.

Was the accident reported in a timely fashion?

While there have been complaints about the late disclosure of the crash, Tesla Motors explained that the news was to be shared once a federal investigation had been launched. This supposed delay in reporting the incident, just prior to a multi-billion-dollar fundraising for the company, has seen rumours and blatant untruths peddled as fact. US regulators, led by the NHTSA, are now assessing the Model S, with many voicing concern that Tesla’s technology was not tested across a wide enough range of roads and driving behaviours.

Tesla’s response

Tesla on the other hand stated that “This is the first known fatality in just over 130 million miles where Autopilot was activated…What we know is that the vehicle was on a divided highway with Autopilot engaged when a tractor trailer drove across the highway perpendicular to the Model S. Neither Autopilot nor the driver noticed the white side of the tractor trailer against a brightly lit sky, so the brake was not applied. The high ride height of the trailer combined with its positioning across the road and the extremely rare circumstances of the impact caused the Model S to pass under the trailer, with the bottom of the trailer impacting the windshield of the Model S. Had the Model S impacted the front or rear of the trailer, even at high speed, its advanced crash safety system would likely have prevented serious injury as it has in numerous other similar incidents.” On the matter, Elon Musk declared: “Fatal car crashes will decrease as autopilot replaces the role of error-prone humans on the road.”

When is autopilot not autopilot?

While opinions clash between calling the incident a mishap, condemning the technology, or treating it as just another traffic accident that can happen to anyone anywhere, common ground is emerging around the description of the technology. Should it really be called Autopilot when the driver’s attention is still demanded and intervention may be required at any moment? Tesla has always clearly advised drivers to keep their hands on the wheel, and since Autopilot can only be activated by the driver, the definitive decision rests with the driver. It is worth mentioning that even airplanes on autopilot require attention and monitoring. The difference is that an airplane is far less likely to meet another airplane in its path than a car is to meet a truck on a highway.

The future

We haven’t yet reached the level of technological advancement required to let a car drive you home after a night on the town. While Tesla has made great improvements and is continually furthering our use of technology, we shouldn’t be lulled into a false sense of security; rather, we should stay aware of the fickle environment and circumstances we live in today.

Note: Tesla has since flagged an issue with the automatic braking system in connection with the fatal accident and does not believe the separate Autopilot system was to blame. More details will emerge when ongoing investigations have been completed.

About: Khadija Ouajjani

Mechanical Design Engineer in the aeronautics industry since 2012, mainly dealing with CAD, FEA, simulation and analysis for turbo-engines. Writing for EC since 2014. Garlic, Color Pencils, Open Systems, Coffee, Herbert, Final Fantasy VII, Writing, Tolkien, Mechanics, Deutsch, Nihongo, Herbs, Aïkido, Tea, Cinnamon, Motion, Friends.

One Response to Tesla Autopilot, dangerous or a life saver?

  1. Archie Quinn says:

    cars will never be automated legally in Australia, as we have kangaroos that sit on the sides of the roads for hundreds of kilometres and jump at the last second, only a seasoned Aussie knows how to respond, ya run the fuckers over that is in fact the road law here for all cars, you may not swerve, you may only attempt to slow down, in semi trailers the law is you must maintain speed, this is the same in all countries to avoid jackknifing, a loose load visible to a dive on the back of a truck or trailer is not nor can ever be recognized by an auto car nor a skyward falling branch or hillside rockslide coming down a hill, nor the difference between a puddle or washed out road, nor a car jacker with a rock or gun, nor tumble weeds nor flocks of birds nor a policeman waving you down nor can it see smoke coming from over the crest from another accident or bushfire and fallen trees the list is endless, its a fucking stupid idea, and it is no better than cars on tracks, because essentially to remove all the hazards that is all you have. additionally i would never own any car that has a program to run it, as eventually they will be hacked and cars will be used as assassination tools to kill those they do not want, and if Wikileaks can hack the governments top files, then i assure you there is no crap car programs safe from the very best hackers. Eventually they will all be banned.
