Latest Viral Tesla Crash Video Confirms What We Should All Already Know | Carscoops
Autonomous driving features in production vehicles made today require constant supervision
5 hours ago
by Stephen Rivers
A crash that took place back in March of 2022 is making new waves on the web. Posted to Twitter by noted Tesla hacker Green, the video shows a car that had evidently been on Autopilot smashing into a disabled vehicle in the middle of the road. The ensuing storm of comments from both supporters and detractors of the software seems to be missing a few key points.
Tesla has come under an enormous amount of fire for its self-assuredly named Autopilot and Full Self-Driving features. Both provide automated driving assistance, but both also require constant attention from the driver. Many argue that each one falls far short of what its name promises.
The crash you see below only makes that case more understandable. According to Green, who claims to have the actual computer from the Tesla in the crash, Autopilot was active until about two seconds before the impact. At that point, the Tesla recognized the obstacle and applied its automatic emergency braking feature.
More: Police Arrest Driver Of Tesla That Plunged Off Devil’s Slide Cliff For Attempted Murder
Clearly, those two seconds of braking didn't do much to reduce the severity of the impact. Green tells Carscoops that there's no way to be totally sure of what the driver was doing just prior to the crash. Initially, he thought that the driver hadn't touched the wheel for about seven seconds before the accident occurred, but he now believes that window was shorter.
Regardless of that, it was the driver behind the wheel, not Autopilot, who was at fault for this accident. Were they lulled into a false sense of security by Autopilot's name or capability? Possibly, but that's pure conjecture, and it says nothing about the numerous other driver-assistance software packages available across the automotive market. Could Autopilot do a better job of recognizing a threat and avoiding an accident? Surely, but again, any Level 2 autonomous software requires that the driver be alert and ready to take over.
Studies indicate that about 40 injuries per day involve disabled or stopped vehicles on the roadway. As Green also points out in his comments on this crash, driving distracted is a bad idea no matter what car you buy. The relevant vehicle logs are below.