Schematic of an attack that can remove lidar data from the area in front of a vehicle, resulting in unsafe vehicle movement. The pedestrian in front of the vehicle is visible in the lidar data at lower left but has been deleted in the view at right. Credit: Sara Rampazzi / University of Florida
Self-driving cars, like human drivers before them, need to see what’s around them to avoid obstacles and drive safely.
The most advanced autonomous vehicles typically use lidar, a spinning, radar-like device that acts as the car's eyes. Lidar provides continuous information about the distance to surrounding objects so the car can decide which actions are safe to take.
But these eyes can be fooled.
New research reveals that expertly timed lasers shined at an approaching lidar system can create a blind spot in front of the vehicle large enough to completely hide moving pedestrians and other obstacles. The deleted data makes the car think the road is safe to continue along, endangering anyone or anything in the blind spot.
This is the first time that lidar sensors have been tricked into removing information about obstacles.
The vulnerability was discovered by researchers from the University of Florida, the University of Michigan and Japan's University of Electro-Communications. The researchers also offer updates that could eliminate this weakness and protect people from malicious attacks.
The findings will be presented at the 2023 USENIX Security Symposium and are currently available on arXiv.
Lidar works by emitting laser light and capturing the reflections to calculate distances, much as a bat's echolocation uses sound echoes. The attack creates fake reflections to confuse the sensor.
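To make the ranging principle concrete, here is a minimal illustration (our own sketch, not code from the study) of how a time-of-flight distance is computed from a laser pulse's round-trip time:

```python
# Illustrative time-of-flight ranging: a lidar measures the delay
# between emitting a laser pulse and receiving its reflection.
C = 299_792_458.0  # speed of light, in m/s

def distance_from_echo(round_trip_seconds: float) -> float:
    """Distance to the reflecting surface, in meters.

    The pulse travels to the object and back, so divide by two.
    """
    return C * round_trip_seconds / 2.0

# A reflection arriving about 66.7 nanoseconds after emission
# corresponds to an object roughly 10 meters away.
print(distance_from_echo(66.7e-9))  # ≈ 10.0
```

An attacker who can fire pulses timed to land in this measurement window can therefore make the sensor compute distances for objects that are not there.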
“We mimic lidar reflections with our laser so the sensor discounts other reflections that are coming in from genuine obstacles,” said Sara Rampazzi, a UF professor of computer and information science and engineering who led the research. “The lidar still receives genuine data from the obstacle, but the data is automatically discarded because the sensor detects our fake reflections.”
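One simplified way to picture the discarding behavior Rampazzi describes (an illustration of the general idea, not any vendor's actual firmware logic): if a receiver keeps only the earliest echo it sees for each laser firing, a spoofed reflection timed to arrive before the genuine one displaces it.

```python
# Simplified sketch of why an earlier, spoofed echo can displace a
# genuine one. Many receivers report a single return per firing;
# here we assume the earliest echo wins. Illustration only.

def select_return(echo_times_ns: list[float]) -> float:
    """Keep only the earliest echo for this laser firing."""
    return min(echo_times_ns)

genuine_echo = 66.7   # pedestrian ~10 m away
spoofed_echo = 20.0   # attacker's pulse, timed to arrive first

kept = select_return([genuine_echo, spoofed_echo])
print(kept)  # 20.0 -- the real obstacle's echo is discarded
```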
The researchers demonstrated the attack on moving vehicles and robots with the attack device placed about 15 meters away at the side of the road, but in theory it could be carried out from farther away with upgraded equipment. The technology required is fairly basic, but the laser must be perfectly timed to the lidar sensor, and moving vehicles must be carefully tracked to keep the laser pointed in the right direction.

Animation showing how the attack uses a laser to inject spoofed data points into a lidar sensor, causing it to reject genuine data from an obstacle in front of the sensor. Credit: Sara Rampazzi / University of Florida
“It’s primarily a matter of synchronizing the laser with the lidar device. The information you need is usually publicly available from the manufacturer,” said S. Hrushikesh Bhupathiraj, a UF doctoral student in Rampazzi’s lab and one of the study’s lead authors.
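As a rough sketch of what that synchronization involves (hypothetical numbers, not the team's attack code): given a spinning lidar's rotation rate from its datasheet, an attacker can predict when the beam will sweep past a chosen direction and schedule pulses accordingly.

```python
# Rough sketch of the timing problem (hypothetical values, not the
# researchers' attack code): knowing a lidar's rotation rate from its
# datasheet, predict when the beam next faces a chosen azimuth.

SCAN_RATE_HZ = 10.0             # e.g., a 10 Hz spinning lidar
PERIOD_S = 1.0 / SCAN_RATE_HZ   # one full rotation: 100 ms

def next_firing_time(now_s: float, phase_s: float,
                     target_azimuth_deg: float) -> float:
    """Next time the beam crosses target_azimuth_deg.

    phase_s is when the beam last crossed 0 degrees; in practice an
    attacker would estimate it by observing the sensor's own pulses.
    """
    t = phase_s + (target_azimuth_deg / 360.0) * PERIOD_S
    while t <= now_s:
        t += PERIOD_S
    return t

print(next_firing_time(now_s=0.05, phase_s=0.0,
                       target_azimuth_deg=90.0))  # 0.125 s
```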
Using this technique, the researchers were able to delete data for both static obstacles and moving pedestrians. They also demonstrated in real-world experiments that the attack could keep pace with a slow-moving vehicle using basic camera tracking equipment. In simulations of autonomous vehicle decision-making, this deletion of data caused a car to continue accelerating toward a pedestrian it could no longer see, rather than stopping as it should.
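A toy version of that simulated failure mode (our own illustration, not the study's simulator) shows why: a planner that brakes only when an obstacle appears in the point cloud will keep accelerating once the attack has emptied the region ahead.

```python
# Toy planner illustrating the failure mode: the controller brakes
# only if the point cloud contains an obstacle ahead, so a cloud
# emptied by the attack lets the car keep accelerating.

def plan_speed(current_speed: float, obstacles_ahead_m: list[float]) -> float:
    """Next speed: brake if anything is within 20 m, else accelerate."""
    if any(d < 20.0 for d in obstacles_ahead_m):
        return max(0.0, current_speed - 5.0)  # brake
    return current_speed + 1.0                # accelerate

honest_cloud   = [12.0]  # pedestrian 12 m ahead -> planner brakes
attacked_cloud = []      # same scene after the removal attack

print(plan_speed(10.0, honest_cloud))    # 5.0: slowing down
print(plan_speed(10.0, attacked_cloud))  # 11.0: accelerating toward the pedestrian
```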

The attack removes information from the cone-shaped region in front of the vehicle, making the moving pedestrian in that area invisible to the lidar system. Credit: Sara Rampazzi / University of Florida
Fixing this vulnerability would require updates to lidar sensors or to the software that interprets their raw data. For example, manufacturers could teach the software to look for the telltale signatures of the decoy reflections added by the laser attack.
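One hypothetical example of such a check (our illustration, not the specific defense proposed in the paper): perception software could flag scans in which a region of the point cloud suddenly loses far more returns than sensor noise would explain.

```python
# Hypothetical consistency check (not the paper's specific defense):
# flag a scan when a monitored region loses an implausible share of
# returns compared with the previous scan.

def suspicious_dropout(prev_count: int, curr_count: int,
                       max_loss_frac: float = 0.5) -> bool:
    """True if the region lost an implausibly large share of returns."""
    if prev_count == 0:
        return False
    return (prev_count - curr_count) / prev_count > max_loss_frac

print(suspicious_dropout(prev_count=400, curr_count=30))   # True: likely tampering
print(suspicious_dropout(prev_count=400, curr_count=380))  # False: normal variation
```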
“Revealing this vulnerability allows us to build a more reliable system,” said Yulong Cao, a University of Michigan doctoral student and lead author of the study. “In our paper, we demonstrate that previous defense strategies are insufficient, and we propose modifications that should address this weakness.”
More information: Yulong Cao et al, You Can't See Me: Physical Removal Attacks on LiDAR-based Autonomous Vehicles Driving Frameworks, arXiv (2022). DOI: 10.48550/arxiv.2210.09482, arxiv.org/abs/2210.09482
Provided by the University of Florida
Citation: Laser attack blinds autonomous vehicles, deleting pedestrians and confusing cars (2022, October 31) Retrieved November 1, 2022, from https://techxplore.com/news/2022-10-laser-autonomous-vehicles-deleting-pedestrians.html
This document is subject to copyright. Apart from any fair dealing for the purpose of private study or research, no part may be reproduced without written permission. The content is provided for information purposes only.