
Researchers hacked a Tesla's Autopilot using three stickers on the road


WHY THIS MATTERS IN BRIEF

Neural networks see the world differently to humans, which is why they can be fooled using relatively simple trickery.

 


A prolific cyber security research firm has announced that it managed to make Tesla's self-driving car feature veer off course, and into theoretical oncoming traffic, just by sticking three small stickers onto the road surface. It's not the first time researchers have tricked self-driving vehicles into driving dangerously, either: elsewhere, researchers stuck small stickers onto road signs that tricked cars into mis-reading speed signs saying 30 mph as 100 mph. All of this comes down to the way the Artificial Intelligence (AI) neural networks that control the cars see the world, and as researchers cotton on to the fact that this is a potentially major security concern, if for no other reason than that it's cheap and low-tech, such attacks are already becoming a worrisome occurrence.
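To give a feel for what "seeing the world differently" means in practice, here is a minimal sketch of the classic digital adversarial example trick (the Fast Gradient Sign Method of Goodfellow et al.), which belongs to the same broad class of attack, although Keen Lab's work used physical road markings rather than pixel-level perturbations. The toy network, the random input frame and the epsilon budget below are all invented purely for illustration.

```python
# Sketch of a digital adversarial perturbation (FGSM). Everything here is a
# stand-in: an untrained toy classifier, a random "camera frame", and an
# arbitrary perturbation budget. It only illustrates the general idea that a
# tiny, carefully chosen change to the input can move a network's decision.
import torch
import torch.nn as nn
import torch.nn.functional as F

torch.manual_seed(0)

# Toy classifier standing in for a perception network (hypothetical, untrained).
model = nn.Sequential(
    nn.Conv2d(3, 8, 3, padding=1), nn.ReLU(),
    nn.AdaptiveAvgPool2d(1), nn.Flatten(),
    nn.Linear(8, 2),                      # e.g. "lane marking" vs "no marking"
)

x = torch.rand(1, 3, 64, 64, requires_grad=True)   # stand-in camera frame
logits = model(x)
orig_class = logits.argmax(dim=1)                  # the network's current answer

# Gradient of the loss with respect to the *input*, not the weights.
loss = F.cross_entropy(logits, orig_class)
loss.backward()

epsilon = 0.1                                      # perturbation budget
x_adv = (x + epsilon * x.grad.sign()).clamp(0, 1)  # one FGSM step

print("prediction on the original frame: ", orig_class.item())
print("prediction on the perturbed frame:", model(x_adv).argmax(dim=1).item())
```

With an untrained toy network the prediction may or may not flip, but the same gradient-following recipe is what researchers use against real, trained models to find perturbations that humans barely notice.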

Tencent's Keen Lab, a two-time honoree of Tesla's Bug Bounty hall of fame program, said in a research paper that it found two ways to trick Autopilot's lane recognition by changing the physical road surface in front of it.


The company's first attempt to confuse Autopilot used blurring patches on the left-lane line, an approach the team said was too difficult for someone to actually deploy in the real world and too easy for Tesla's computer to recognise.

 

The explainer (in Chinese)
 

“It is difficult for an attacker to deploy some unobtrusive markings in the physical world to disable the lane recognition function of a moving Tesla vehicle,” Keen said.

 

The white dot is clearly visible

 

The researchers said they suspected Tesla also handled this situation well because it has already added many "abnormal lanes" to Autopilot's training set of driving miles. This gives Tesla vehicles a good sense of lane direction even in poor lighting or inclement weather, they said.

Not deterred by the low plausibility of the first idea, Keen then set out to make Tesla’s Autopilot mistakenly think there was a traffic lane when one wasn’t actually present.

The researchers painted three tiny squares, which you can see in the picture, in the traffic lane to mimic merge striping and cause the car to veer into oncoming traffic in the left lane.
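Keen Lab hasn't published Autopilot's internals, but the failure mode is easy to caricature: if a lane-finding step trusts every mark it detects on the road, a handful of well-placed patches can drag the predicted lane sideways. The sketch below is a deliberately over-simplified, hypothetical model of that idea, with made-up geometry; it is not Tesla's actual pipeline.

```python
# Toy illustration of the fake-lane idea: the real markings run straight, but a
# few small marks placed just beyond them angle off to the left, and a naive
# least-squares lane fit follows them. All numbers are invented for the example.
import numpy as np

# Genuine lane-marking detections: straight ahead (lateral offset x = 0)
# for the first 20 metres in front of the car.
y_real = np.linspace(0.0, 20.0, 20)          # distance ahead, metres
x_real = np.zeros_like(y_real)               # lateral offset, metres

# Three small fake patches just past the real markings, angled to the left
# to mimic merge striping.
y_fake = np.array([22.0, 25.0, 28.0])
x_fake = np.array([-0.5, -1.2, -2.0])

def predicted_offset(y_pts, x_pts, lookahead=35.0):
    """Fit a straight lane line x = a*y + b and extrapolate to `lookahead` metres."""
    a, b = np.polyfit(y_pts, x_pts, 1)
    return a * lookahead + b

clean = predicted_offset(y_real, x_real)
spoofed = predicted_offset(np.concatenate([y_real, y_fake]),
                           np.concatenate([x_real, x_fake]))

print(f"lane offset at 35 m, real markings only: {clean:+.2f} m")
print(f"lane offset at 35 m, with fake patches:  {spoofed:+.2f} m (pulled left)")
```

In this toy, three marks angled off to the left pull the extrapolated lane roughly a metre sideways at a 35-metre look-ahead, which is the same flavour of behaviour the researchers describe, just stripped of all the real system's complexity.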


“Misleading the autopilot vehicle to the wrong direction [of traffic] with some patches made by a malicious attacker is sometimes more dangerous than making it fail to recognise the lane,” Keen said.

“If the vehicle knows that the fake lane is pointing to the reverse lane, it should ignore this fake lane and then it could avoid a traffic accident,” they said.

In response to Keen’s findings, Tesla said the issues “didn’t represent real world problems and no drivers had encountered any of the report’s identified problems.”

“In this demonstration the researchers adjusted the physical environment, by placing tape on the road and altering lane lines, around the vehicle to make the car behave differently when Autopilot is in use,” the company said.

“This is not a real-world concern given that a driver can easily override Autopilot at any time by using the steering wheel or brakes and should be prepared to do so at all times.”


A Tesla representative told Business Insider that while Keen's findings weren't eligible for the company's bug bounty program, the company held the researchers' insights in high regard.

“We know it took an extraordinary amount of time, effort, and skill, and we look forward to reviewing future reports from this group,” a representative said.

However, while today, in a pre-fully-autonomous world, Tesla mandates that drivers keep an eye on the road and "be ready in an instant to take over from the car's Autopilot system in the event of an emergency," tomorrow, when that driver is in the back of the car having a massage in a massage seat, something that's being touted by Tesla competitor Toyota, this kind of hack could quickly become a very real, real-world concern. So whether Tesla recognises it as a problem or not, I for one suggest they look into ways of fixing it now, before someone doesn't come home for dinner…

Source: Business Insider
