Why were they driving so fast? Why isn’t the recording protected? What was Autopilot’s role in this?
GreenTheOnly shared an impressive video on Twitter on November 12. It showed a crash recorded by TeslaCam. The first thing that comes to anyone’s mind is: “How can someone drive that fast?” It is irresponsible. I contacted Mr. Green to ask him for a YouTube link to that crash. I wanted to embed it in an article about traffic safety, but I learned from him that Autopilot was likely involved. He asked me to wait until he had more data – so here we are.
If you are wondering how Mr. Green got this footage, remember he is always after more information on Tesla’s software and systems. That is how he discovered ICEs and MCUs were ending up on eBay disclosing their former owners' data in a potentially dangerous data leak. InsideEVs revealed this story on May 3.
In this case, GreenTheOnly retrieved the information from an HW 2.5 unit a junkyard sent him, as he states below.
This brings up another issue involving Tesla: the fact that it installed the HW 2.5 computer in many vehicles after announcing that all of them would have HW 3.0 from April 23, 2019, on. Affected customers are still trying to sort this out with the company, and Tesla never told us why it was doing that.
Data protection is also a concern around this crash. Like the computers for sale on eBay, all Teslas that end up in junkyards carry footage or data that exposes their former owners. Some, such as the Model 3 owner from Seattle who saw his car end up in Ukraine, made sure to erase the data, but how many of these owners take the same precaution?
All of them should: all it takes is somebody with the right programming skills to retrieve it. Luckily, GreenTheOnly is a white-hat hacker. Sadly, not all hackers out there share his good intentions.
Beyond wanting to warn owners about the data leak – and Tesla, so it could prevent it from happening again – the hacker felt this crash had something fishy about it. Why would anyone accelerate so much in a single lane and rear-end another car so violently? He had a hunch.
“It seems that Autopilot might have been steering and the user sleeping and just pressing the accelerator in their sleep.”
Some tweets implied that it was impossible because Autopilot was not engaged, but they’re wrong. And the video below explains why.
When Autopilot disengages while you are driving, it does not simply stop working. It waits for torque on the steering wheel to hand over control, and it will only engage again after the driver stops and selects P. However, if the steering wheel is not touched and the accelerator pedal is still pressed, the system will keep steering the car within the same lane, asking the driver to take control until that happens – or until a crash puts an end to the journey.
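The behavior described above can be sketched as a simple state machine. This is purely illustrative – every name and input below is hypothetical, based only on GreenTheOnly's account of the logs, and none of it reflects Tesla's actual code:

```python
# Illustrative sketch only -- NOT Tesla's software. It models the behavior
# described in this article: after a disengagement, lane-keeping and
# take-over warnings continue as long as the accelerator stays pressed
# and the driver never applies torque to the steering wheel.

def autopilot_step(inputs):
    """One control-loop tick; `inputs` is a dict of hypothetical signals."""
    if inputs["steering_torque_detected"]:
        # Driver grabbed the wheel: hand full control back to them.
        return "driver_in_control"
    if inputs["autopilot_disengaged"] and inputs["accelerator_pressed"]:
        # Disengaged, but the pedal keeps the car moving: keep steering
        # within the lane and keep flashing take-over warnings.
        return "lane_keep_and_warn"
    if inputs["autopilot_disengaged"]:
        # No pedal input either: wind the car down (simplified here).
        return "slow_to_stop"
    return "autopilot_active"

# The crash scenario described in the article: Autopilot disengaged,
# accelerator still pressed, no hands on the wheel.
print(autopilot_step({
    "steering_torque_detected": False,
    "autopilot_disengaged": True,
    "accelerator_pressed": True,
}))  # prints lane_keep_and_warn
```

In this sketch, the "lane_keep_and_warn" branch is the dangerous one Mr. Green identified: the pedal override keeps the car accelerating while the system only warns, rather than braking for obstacles it can see.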
The beta software assumes the driver is still in control if they are pressing the pedal, but they may have fallen asleep or be experiencing a medical emergency. Unfortunately, we have no idea what the Tesla driver's condition was. We only know two people got seriously injured, according to GreenTheOnly.
Whatever triggered the FCW (Forward Collision Warning) that theoretically disengaged Autopilot 40 seconds before the crash, that event indicates the driver had not been controlling the car for quite some time.
With the help of the logs, Mr. Green is decoding the events. He created the video above with overlaid CANbus data. The red flashing represents the Autopilot warnings for the driver to take control, but that didn't happen for reasons unknown. The car kept on accelerating until the crash, as the data below reveals.
GreenTheOnly had the help of an engineer to make these graphs, and he is still checking the logs, but he has no doubt that Autopilot's way of handling this disengagement has to be improved.
“It is concerning that Tesla allows you to do this. You would think that when the car is steering it would also limit acceleration and avoid running into stuff it sees – same as it avoids getting outside a lane. I understand that just stopping might be unsafe, but following a car in your lane without running into it is a lot safer than just stopping – or hitting it, obviously.”
Tesla apparently does not have a press department anymore. Regardless, we sent the company questions about why Autopilot works like this, why it does not brake when it detects something ahead despite the accelerator being pressed, and whether there are plans to change those parameters through an OTA update.
Legally speaking, the company is covered by its disclaimer that Autopilot is in beta testing and that anyone who decides to use it takes full responsibility for whatever happens. From an ethical perspective, Mr. Green asks Tesla the following question:
We'll get back to whatever Mr. Green has to add to this.