Transient Adversarial 3D Projection Attacks on Object Detection in Autonomous Driving
Abstract
Object detection is a crucial task in autonomous driving. While existing research has proposed various attacks on object detection, such as those using adversarial patches or stickers, projection attacks on 3D surfaces remain largely unexplored. Compared to adversarial patches or stickers, which have fixed adversarial patterns, projection attacks allow for transient modifications to these patterns, enabling a more flexible attack. In this paper, we introduce an adversarial 3D projection attack specifically targeting object detection in autonomous driving scenarios. We frame the attack as an optimization problem, utilizing a combination of color mapping and geometric transformation models. Our results demonstrate the effectiveness of the proposed attack in deceiving YOLOv3 and Mask R-CNN in physical settings. Evaluations conducted in an indoor environment show an attack success rate of up to 100% under low ambient light conditions, highlighting the potential damage of our attack in real-world driving scenarios.
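The optimization framing described in the abstract can be illustrated with a toy sketch. This is not the authors' code: the affine `color_map` (stand-in for the paper's color mapping model), the template-matching `confidence` function (stand-in for a real detector such as YOLOv3), and all parameter values are illustrative assumptions. The sketch ascends the deviation between the camera-observed scene and the clean appearance by optimizing the projected pattern `delta`.

```python
import numpy as np

rng = np.random.default_rng(0)

def color_map(p, gain=0.6, bias=0.2):
    # Assumed affine projector-to-camera color response (a stand-in
    # for the paper's learned color mapping model).
    return gain * p + bias

def confidence(img, template):
    # Toy stand-in for detector confidence on the true object:
    # high when the observed image matches the clean template.
    return float(np.exp(-10.0 * np.mean((img - template) ** 2)))

H, W = 8, 8
scene = rng.random((H, W))     # clean object appearance as seen by the camera
template = scene.copy()        # what the toy "detector" recognizes
delta = np.zeros((H, W))       # projected adversarial pattern (optimized)
lr = 0.1

for _ in range(200):
    # Camera observation: scene plus the projected light, clipped to [0, 1]
    img = np.clip(scene + color_map(delta), 0.0, 1.0)
    # Analytic gradient of the squared deviation w.r.t. delta, chained
    # through the affine color map (clipping ignored for simplicity)
    grad = 2.0 * (img - template) * 0.6
    delta = np.clip(delta + lr * grad, 0.0, 1.0)  # ascend the deviation

img = np.clip(scene + color_map(delta), 0.0, 1.0)
print(confidence(scene, template), confidence(img, template))
```

A real attack would replace the template loss with the detector's objectness/class loss and compose the color map with a geometric warp onto the 3D surface; the structure of the optimization loop is the same.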
Recommended Citation
C. Zhou et al., "Transient Adversarial 3D Projection Attacks on Object Detection in Autonomous Driving," Lecture Notes of the Institute for Computer Sciences, Social Informatics and Telecommunications Engineering (LNICST), vol. 622, pp. 259-278, Springer, Jan. 2025.
The definitive version is available at https://doi.org/10.1007/978-3-031-93354-7_12
Department(s)
Computer Science
Keywords and Phrases
3D projection attack; Adversarial patch; Autonomous driving; Object detection
International Standard Book Number (ISBN)
978-3-031-93353-0
International Standard Serial Number (ISSN)
1867-822X; 1867-8211
Document Type
Article - Conference proceedings
Document Version
Citation
File Type
text
Language(s)
English
Rights
© 2025 Springer. All rights reserved.
Publication Date
01 Jan 2025

Comments
National Science Foundation, Grant CNS-2235231