You Can’t See Me: Physical Removal Attacks on LiDAR-based Autonomous Vehicles Driving Frameworks

New research by Sara Rampazzi and her team reveals that expertly timed lasers shined at an approaching lidar system can create a blind spot in front of the vehicle large enough to completely hide moving pedestrians and other obstacles. The deleted data makes the car believe the road ahead is clear and safe to travel, endangering anything in the attack's blind spot.

This is the first time that lidar sensors have been tricked into deleting data about obstacles.

The vulnerability was uncovered by researchers from the University of Florida, the University of Michigan, and the University of Electro-Communications in Japan. The researchers also propose upgrades that could eliminate this weakness and protect people from malicious attacks.

The scientists demonstrated the attack on moving vehicles and robots with the attacker placed about 15 feet away on the side of the road, though in theory it could be carried out from farther away with upgraded equipment. The required equipment is fairly basic, but the laser must be perfectly synchronized with the lidar sensor, and a moving vehicle must be tracked carefully to keep the laser pointed in the right direction.

“It’s primarily a matter of synchronization of the laser with the lidar device. The information you need is usually publicly available from the manufacturer,” said S. Hrushikesh Bhupathiraj, a UF doctoral student in Rampazzi’s lab and one of the lead authors of the study.
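To make that synchronization requirement concrete, here is a minimal, illustrative timing sketch in Python. It is not the authors' tooling: the firing period, the 0.5-meter fake range, and the idea of recovering a reference firing instant by listening to the sensor's own pulses with a photodiode are all assumptions chosen for illustration.

```python
import math

C = 299_792_458.0            # speed of light, m/s
FIRING_PERIOD_S = 55.296e-6  # assumed per-cycle firing period of the target lidar

def spoof_delay(fake_range_m: float) -> float:
    """Round-trip time of flight that makes an injected pulse register
    as a return at fake_range_m from the sensor."""
    return 2.0 * fake_range_m / C

def next_attack_time(t_now: float, t_ref: float, fake_range_m: float = 0.5) -> float:
    """Schedule the attacker's next pulse.

    t_ref is a reference firing instant, assumed here to be recovered by
    observing the sensor's own outgoing pulses. The attack pulse is fired
    one time-of-flight after the next firing cycle, so the sensor logs a
    return at fake_range_m instead of the genuine echo.
    """
    n = math.floor((t_now - t_ref) / FIRING_PERIOD_S) + 1
    return t_ref + n * FIRING_PERIOD_S + spoof_delay(fake_range_m)
```

The point is that once the sensor's firing schedule is known, only nanosecond-scale delays are needed to place a fake return at an arbitrary range; one plausible reading of the attack is that returns injected very close to the sensor get filtered out by its own firmware, taking the genuine echo for that cycle with them.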

Using this technique, the scientists were able to delete data for both static obstacles and moving pedestrians. They also demonstrated in real-world experiments that the attack could follow a slow-moving vehicle using basic camera-tracking equipment. In simulations of autonomous-vehicle decision making, the deleted data caused a car to keep accelerating toward a pedestrian it could no longer see, rather than stopping as it should.
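A toy version of that decision logic shows why the deletion is so dangerous. The sketch below is not the simulator used in the study; the lane width, deceleration rate, and safety margin are hypothetical numbers picked for illustration.

```python
import numpy as np

def plan(points_xy: np.ndarray, speed_mps: float,
         decel_mps2: float = 6.0, lane_half_width_m: float = 1.5) -> str:
    """Toy longitudinal planner: brake if any lidar return lies in the
    ego lane within the current stopping distance, else accelerate.
    points_xy is an (N, 2) array of returns in vehicle coordinates,
    x forward and y left."""
    stopping_dist = speed_mps ** 2 / (2.0 * decel_mps2) + 5.0  # 5 m margin
    in_lane = points_xy[np.abs(points_xy[:, 1]) < lane_half_width_m]
    ahead = in_lane[in_lane[:, 0] > 0.0]
    if ahead.size and float(ahead[:, 0].min()) < stopping_dist:
        return "brake"
    return "accelerate"

pedestrian = np.array([[12.0, 0.3], [12.1, 0.4]])  # returns from a pedestrian
print(plan(pedestrian, speed_mps=10.0))            # -> "brake"
print(plan(np.empty((0, 2)), speed_mps=10.0))      # points removed -> "accelerate"
```

With the pedestrian's points present, the nearest in-lane return falls inside the stopping distance and the planner brakes; delete those points and the minimum distance is unbounded, so the very same logic accelerates.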

Updates to the lidar sensors or the software that interprets the raw data could address this vulnerability. For example, manufacturers could teach the software to look for the telltale signatures of the spoofed reflections added by the laser attack.
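One plausible flavor of such a check, sketched below, compares return counts per azimuth sector across consecutive frames: a sector whose points vanish almost entirely within a single frame is physically unusual and worth flagging. The binning and thresholds here are assumptions for illustration, not the specific defense proposed in the paper.

```python
import numpy as np

def suspicious_sectors(prev_pts: np.ndarray, curr_pts: np.ndarray,
                       n_bins: int = 360, min_count: int = 20,
                       drop_ratio: float = 0.2) -> np.ndarray:
    """Flag azimuth sectors whose return count collapses between two
    consecutive frames, a pattern consistent with point removal.
    prev_pts and curr_pts are (N, 2) arrays of x, y returns."""
    def az_hist(pts: np.ndarray) -> np.ndarray:
        az = np.degrees(np.arctan2(pts[:, 1], pts[:, 0])) % 360.0
        return np.histogram(az, bins=n_bins, range=(0.0, 360.0))[0]
    h_prev, h_curr = az_hist(prev_pts), az_hist(curr_pts)
    mask = (h_prev >= min_count) & (h_curr < drop_ratio * h_prev)
    return np.nonzero(mask)[0]  # indices of suspect 1-degree sectors
```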

This research and accompanying videos are available online, and the work will be presented at the 2023 USENIX Security Symposium.

Originally published by UF News, November 2, 2022.