Tesla Autopilot has been making headlines for some time, and even a recall is on the table as a potential outcome.
According to the media, Autopilot mode on Tesla cars is not safe to trust while driving. A fatal accident occurred on May 7th involving a Tesla Model S that was in Autopilot mode at the time of the crash. When the information was publicly shared on June 30th, it drew mixed reactions about the technology used by Tesla and its safety.
Background on Tesla Autopilot
To give you some background, Tesla Motors has grossed over two billion dollars from sales of its latest Model S and Model X cars. These vehicles are equipped with assisted driving and Autopilot, although the company clearly states that the driver should remain aware of the road and fully ready to take over should anything go wrong. This fact is often overlooked by the mass media, many of whom seem to have a deep dislike of the company’s entrepreneurial leader Elon Musk, a man who often challenges the establishment.
Was the accident reported in a timely fashion?
While there have been complaints about the late disclosure of the crash, Tesla Motors explained that the news was to be shared once a federal investigation was launched. This supposed delay in reporting the incident, which came just prior to a multi-billion-dollar fundraising round for the company, has seen rumours and blatant untruths peddled as fact. US regulators, led by the NHTSA, are now assessing the Model S, with many voicing concern that Tesla’s technology was not tested across a wide range of roads and driving behaviours.
Tesla on the other hand stated that “This is the first known fatality in just over 130 million miles where Autopilot was activated…What we know is that the vehicle was on a divided highway with Autopilot engaged when a tractor trailer drove across the highway perpendicular to the Model S. Neither Autopilot nor the driver noticed the white side of the tractor trailer against a brightly lit sky, so the brake was not applied. The high ride height of the trailer combined with its positioning across the road and the extremely rare circumstances of the impact caused the Model S to pass under the trailer, with the bottom of the trailer impacting the windshield of the Model S. Had the Model S impacted the front or rear of the trailer, even at high speed, its advanced crash safety system would likely have prevented serious injury as it has in numerous other similar incidents.” On the matter, Elon Musk declared: “Fatal car crashes will decrease as autopilot replaces the role of error-prone humans on the road.”
When is autopilot not autopilot?
While opinions clash between calling the incident a mishap, condemning the technology, or treating it as just another traffic accident that could happen to anyone anywhere, common ground is emerging around the description of the technology. Should it really be called Autopilot when the driver’s attention is still required and they may need to take over at any moment? Tesla has always clearly advised drivers to keep their hands on the wheel, and since Autopilot can only be activated by the driver, the final decision rests with the driver. It is worth mentioning that even airplanes on autopilot require attention and monitoring. It’s just that an airplane is far less likely to encounter crossing traffic than a car sharing a highway with trucks.
We haven’t yet reached the level of technological advancement required to let a car drive you home after a night on the town. While Tesla has made great improvements and continues to push our use of technology forward, we shouldn’t be lulled into a false sense of comfort; instead, we should remain aware of the fickle environment and circumstances we live in today.
Note: Tesla has since flagged an issue with the automatic braking system in connection with the fatal accident and does not believe the separate Autopilot system was to blame. More details will emerge when the ongoing investigations are completed.