Security researchers have shown how hacked or fake road signs could be used to confuse autonomous cars, forcing them to crash or brake sharply.
Autonomous driving systems have advanced rapidly in recent years, but not without errors, confusion and accidents.
Vehicle intelligence has a long way to go before it can be considered fully autonomous and safe to use without the supervision of a human driver. As companies continue to improve their platforms, their focus tends to be on weather, mapping and how cars must respond to dangerous objects, such as pedestrians or other vehicles.
However, according to Wired, there may be other, invisible dangers that humans cannot detect with the naked eye.
New research by academics at Ben-Gurion University in Israel suggests that so-called "ghost" images - such as a stop sign flashed on an online billboard - could confuse AI systems and trigger specific actions or movements.
This could not only cause traffic jams but also more serious accidents, with hackers leaving little trace of their activities - and drivers wondering why their smart vehicle suddenly changed its behavior.
Fake signs appearing on an online billboard could cause cars to "brake or crash," said security researcher Yisroel Mirsky.
The tests were performed on a vehicle running the latest version of Tesla Autopilot and on a Mobileye system. According to Wired, a phantom stop sign displayed for 0.42 seconds fooled the Tesla, while just an eighth of a second was enough to deceive Mobileye.