New research from a security expert has shown that laser devices can be used to confuse and even completely disable self-driving cars, opening up the possibility that people could maliciously attack the vehicles when they are on the road.
Jonathan Petit, a principal scientist at security software company Security Innovation, carried out a number of tests using simple and cheap equipment. He was able to cause self-driving cars to slow down or stop completely by using off-the-shelf lasers in combination with a small computer device, with the overall set-up costing around $60.
“I can spoof thousands of objects and basically carry out a denial of service attack on the tracking system so it’s not able to track real objects,” Mr Petit told IEEE Spectrum. “I can take echoes of a fake car and put them at any location I want.” He later added that he was concerned that manufacturers producing self-driving cars had not considered this form of security risk and that they would have to do so in the future. “I don’t think any of the lidar manufacturers have thought about this or tried this,” he said.
The exploit works because self-driving cars, such as the ones used by Google, rely heavily on a technology known as lidar. This works like radar but with light: it fires tiny laser pulses in every direction and then measures how long they take to return to the sensor, allowing the on-board computer to map out objects around the car. The attacker's laser pulses disrupt this process and fool the sensors into believing there are objects that are not really there.
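The arithmetic behind this is straightforward. A minimal sketch (not from the article; all function names are hypothetical) of how a lidar converts an echo delay into a distance, and why replaying a pulse with a chosen delay plants a "phantom" object:

```python
# Sketch of lidar time-of-flight ranging and echo spoofing.
# Assumption: the sensor naively trusts any returning pulse.

SPEED_OF_LIGHT = 299_792_458.0  # metres per second

def echo_delay_to_distance(round_trip_seconds: float) -> float:
    """Distance to a reflecting object, from the pulse's round-trip time."""
    # The pulse travels out and back, so halve the round trip.
    return SPEED_OF_LIGHT * round_trip_seconds / 2.0

def spoof_delay_for_distance(fake_distance_m: float) -> float:
    """Delay an attacker would add when replaying a pulse so the sensor
    reads an object at fake_distance_m (hypothetical helper)."""
    return 2.0 * fake_distance_m / SPEED_OF_LIGHT

# A genuine echo arriving after ~200 nanoseconds maps to roughly 30 m.
print(round(echo_delay_to_distance(200e-9), 1))  # → 30.0

# To fake a car 10 m ahead, the attacker replays the pulse after ~67 ns.
print(f"{spoof_delay_for_distance(10.0) * 1e9:.1f} ns")  # → 66.7 ns
```

Because the spoofed delays can be generated far faster than real echoes arrive, an attacker can flood the sensor with thousands of phantom returns, which is the denial-of-service effect Petit describes.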