This is one of the many reasons Tesla is wise to completely rewrite the technology, but will it help?
UK-based YouTube channel Tech Tekk was able to fool the Tesla Model 3's traffic sign recognition system with fake road signs. Using homemade signs, the channel changed the car's displayed speed limit, created fake stop points, and exposed some glitches in the system.
It's really not hard to "trick" a car's safety systems. However, in the real world, this type of situation would be rare. If someone went out of their way to trick cars by modifying signs, they'd likely be subject to a criminal investigation.
Sure, a leaf could fall and stick onto a wet sign and block its numbers, or snow could cover a traffic sign, not to mention glare from the sun hitting it at just the wrong angle or a sign damaged by bad weather or an accident. The point is, if a traffic sign is altered or obscured, there's a really good chance the car's technology will read it differently.
This can be seen as good or bad. If the technology is really "reading" the sign, and you change the sign, it will change how it reads it, plain and simple. So, how do companies make systems that are 100-percent foolproof? That's yet to be determined. However, if Tesla is really going to release cars that can completely drive themselves with zero intervention, this is a necessity.
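One common-sense defense against altered or misread signs is to sanity-check the camera's reading against other sources, such as map data. The sketch below illustrates that idea in Python. To be clear, this is purely a hypothetical illustration: the function name, the plausible-value list, and the deviation threshold are all assumptions for the example, not anything Tesla has described about its actual system.

```python
from typing import Optional

# Standard posted speed-limit values in the US (mph). A camera reading
# outside this set (say, a sign doctored to read "38") is suspect.
PLAUSIBLE_LIMITS_MPH = {5, 10, 15, 20, 25, 30, 35, 40, 45,
                        50, 55, 60, 65, 70, 75, 80, 85}

def accept_detected_limit(detected_mph: int,
                          map_mph: Optional[int],
                          max_deviation_mph: int = 15) -> bool:
    """Return True only if a camera-detected speed limit looks plausible.

    Rejects the reading if it isn't a standard posted value, or if map
    data is available and the reading deviates from it by more than
    max_deviation_mph (e.g. a sign altered from 35 to 85).
    """
    if detected_mph not in PLAUSIBLE_LIMITS_MPH:
        return False
    if map_mph is not None and abs(detected_mph - map_mph) > max_deviation_mph:
        return False
    return True
```

Under this scheme, a reading of 85 mph on a road the map says is 35 mph would be rejected rather than trusted, while a plausible reading that merely disagrees slightly with stale map data would still pass. No cross-check is foolproof either, but layering independent signals makes a single spoofed sign much less dangerous.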
Remember, Tesla encourages people to hack its cars and then provide the company with information to help fix the issues. However, we doubt Tesla had fake traffic signs in mind. Elon Musk has been touting the complete software rewrite of Autopilot and Full Self-Driving Capability. He calls it a "Quantum Leap."
Next week, the software rewrite will be pushed out to select Tesla owners in beta form. Musk insists the cars can now drive themselves without intervention in areas they've never visited before. If this is the case, what is Tesla doing about those pesky traffic signs, and all the folks out there altering them to cause grief? Jokes aside, we're eager to learn how good Tesla's new software really is.