Tesla’s Autopilot system can easily be tricked into driving a car with no one in the driver’s seat. According to experts at the non-profit organization Consumer Reports, the investigation was prompted by a fatal crash in Texas in which, reportedly, no one was behind the wheel of the Tesla involved.
Using a weighted chain attached to the steering wheel to simulate the pressure of a driver’s hands, two Consumer Reports safety researchers were able to engage Autopilot on a Tesla Model Y and raise the vehicle’s set speed. The car drove several laps on the test track while the researcher sat in the passenger seat.
Experts warned others against trying to trick Tesla’s Autopilot this way, stressing that the experiment should not be attempted by anyone other than trained professionals.
The experiment does not explain what actually happened in the Texas crash, but safety advocates and Consumer Reports researchers say it does show that driver monitoring systems need to do more to prevent drivers from using such systems in predictably dangerous ways.
BMW, Ford, GM, Subaru, and others use camera-based systems that track eye movements or the position of the driver’s head to make sure the driver is looking at the road.
Certain vehicles, including those equipped with GM’s Super Cruise system, can automatically decelerate to a stop if they detect that the driver has ignored repeated warnings to look at the road.
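To make the contrast concrete: such camera-based systems pair a gaze or head-pose estimate with escalation logic that issues increasingly intrusive warnings and, as a last resort, slows the car. The Python sketch below is purely illustrative; the `AttentionMonitor` class, the thresholds, and the alert levels are assumptions made for this example, not how Super Cruise or any other production system is actually implemented.

```python
from dataclasses import dataclass, field
from enum import Enum, auto


class AlertLevel(Enum):
    NONE = auto()
    VISUAL_WARNING = auto()
    AUDIBLE_WARNING = auto()
    SLOW_TO_STOP = auto()


@dataclass
class AttentionMonitor:
    """Hypothetical escalation logic: the longer the driver looks away,
    the stronger the response, ending in a controlled stop."""
    # Thresholds in seconds of continuous inattention -- illustrative values only.
    visual_after: float = 3.0
    audible_after: float = 6.0
    stop_after: float = 10.0
    _inattentive_for: float = field(default=0.0, init=False)

    def update(self, eyes_on_road: bool, dt: float) -> AlertLevel:
        """Feed one sample from a (hypothetical) camera-based gaze estimator."""
        if eyes_on_road:
            self._inattentive_for = 0.0           # attention restored, reset timer
            return AlertLevel.NONE
        self._inattentive_for += dt
        if self._inattentive_for >= self.stop_after:
            return AlertLevel.SLOW_TO_STOP        # warnings ignored: bring the car to a stop
        if self._inattentive_for >= self.audible_after:
            return AlertLevel.AUDIBLE_WARNING
        if self._inattentive_for >= self.visual_after:
            return AlertLevel.VISUAL_WARNING
        return AlertLevel.NONE


if __name__ == "__main__":
    monitor = AttentionMonitor()
    # Simulate 12 seconds of the driver looking away, sampled at 1 Hz.
    for second in range(1, 13):
        level = monitor.update(eyes_on_road=False, dt=1.0)
        print(f"t={second:2d}s  ->  {level.name}")
```

The point of the sketch is simply that monitoring based on where the driver is looking cannot be fooled by weight on the steering wheel, whereas torque-based hand detection can.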
Let me remind you that we also wrote about an information security researcher who found that the Tesla Model 3 interface is vulnerable to DoS attacks.