Keen Labs, a cybersecurity research firm, says it managed to trick Tesla’s Autopilot into veering off course into oncoming traffic by placing three small stickers on the road pavement.

The two-time honoree of Tesla’s bug bounty hall of fame program said in an Autopilot research paper published Saturday that it discovered two ways to confuse Autopilot’s lane recognition by altering the physical road surface itself.

The first attempt to trick Autopilot used blurring patches on the left lane line, an approach the team says is both too difficult for an attacker to deploy in the real world and easy for Tesla’s computer to handle. “It is difficult for an attacker to deploy some unobtrusive markings in the physical world to disable the lane recognition function of a moving Tesla vehicle,” Keen said.

The researchers believe that Tesla handles this situation well because the company has already added many “abnormal lanes” to the training set behind Autopilot. This gives Tesla vehicles a good sense of lane direction even in inclement weather or poor lighting.

Keen further attempted to trick Tesla’s Autopilot by making it think a traffic lane was present when one wasn’t. The researchers painted three small squares in the traffic lane that mimicked merge striping, which caused the car to veer into oncoming traffic in the left lane.

“Misleading the autopilot vehicle to the wrong direction [of traffic] with some patches made by a malicious attacker is sometimes more dangerous than making it fail to recognize the lane,” Keen said. “If the vehicle knows that the fake lane is pointing to the reverse lane, it should ignore this fake lane and then it could avoid a traffic accident.”

According to Tesla, the issues Keen found don’t represent real-world problems, and no driver has ever encountered any of the problems identified in the report.

“In this demonstration, the researchers adjusted the physical environment (e.g. placing tape on the road or altering lane lines) around the vehicle to make the car behave differently when Autopilot is in use,” the company said. “This is not a real-world concern given that a driver can easily override Autopilot at any time by using the steering wheel or brakes and should be prepared to do so at all times.”

While Keen’s findings weren’t eligible for the company’s “bug bounty” program, a Tesla spokesperson told Business Insider that the company holds the researchers’ insights in high regard.

“We know it took an extraordinary amount of time, effort, and skill, and we look forward to reviewing future reports from this group,” a spokesperson said.
