Fooling Tesla’s advanced Autopilot driver-assistance system is certainly not easy, but are there tricks that can bypass it? One of Autopilot’s basic safety requirements is that a human must be present in the driver’s seat.
However, according to recent evidence, the system can be fooled, and this can lead to dangerous situations for the car’s occupants. Many accidents have involved Teslas with Autopilot engaged, including one of the most serious, which occurred recently in Texas, USA, where it was found that no one was in the driver’s seat.
Some experts have also managed to fool the autopilot of a Tesla Model Y, which then drove routes normally without anyone in the driver’s seat. For its part, Tesla clearly states that a human must remain in the driver’s seat and touch the steering wheel at regular intervals.
Tesla also makes clear that its current Autopilot is a semi-autonomous driving system, while Full Self-Driving features are still in beta testing and are expected to be fully rolled out in late 2021 or early next year. Even so, experts point out that Tesla’s method of checking whether someone is sitting in the seat is inadequate.
Discussions are already under way on how to improve the situation, and several manufacturers already equip their cars with driver-monitoring systems even though their vehicles do not offer fully autonomous driving.