Although autonomous vehicle safety concerns often center on what a vehicle's systems cannot see, new research exposes a potential Tesla vulnerability involving the opposite problem: "phantom" objects that the Autopilot feature perceives but that are not really there. Researchers at Ben-Gurion University have been carefully documenting and experimenting with how phantom images can trick semi-autonomous driving systems, using split-second light projections on roads to force Tesla vehicles to stop automatically.
The technique causes Tesla's Autopilot to register spoofed images of road signs or pedestrians, bringing the car to a screeching halt. According to the Ben-Gurion researchers, the same trick can also be pulled off by injecting a few frames depicting a road sign into the video playing on a digital billboard. Security professionals now warn that if threat actors hijacked an internet-connected billboard, they could use it to cause road accidents among Teslas driving on Autopilot while leaving little evidence of interference.
Read More: Split-Second ‘Phantom’ Images Can Fool Tesla’s Autopilot