Feb. 29, 2024, 5:46 a.m. | Yitong Sun, Yao Huang, Xingxing Wei

cs.CV updates on arXiv.org

arXiv:2312.09554v2 Announce Type: replace
Abstract: As physical adversarial attacks are increasingly used to uncover potential risks in security-critical scenarios, especially autonomous driving, their vulnerability to environmental changes has also come to light. Consequently, the non-robust nature of physical adversarial attack methods leads to unstable performance. To enhance the robustness of physical adversarial attacks in the real world, instead of statically optimizing a robust adversarial example in an offline training manner like existing methods, this paper proposes a …
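For context, the "offline" baseline the abstract contrasts against typically optimizes a single static adversarial patch under randomly sampled environmental transformations (EOT-style) and then deploys it unchanged. The sketch below illustrates that baseline only, not the paper's proposed method; the victim model, patch size, placement region, and transformation ranges are illustrative assumptions.

```python
# Minimal sketch of offline, static adversarial patch optimization under
# random environmental transformations (an EOT-style baseline).
# All concrete choices (model, patch size, ranges) are assumptions for illustration.
import torch
import torchvision.transforms.functional as TF
from torchvision.models import resnet18

model = resnet18(weights=None).eval()                   # stand-in victim classifier
patch = torch.rand(1, 3, 64, 64, requires_grad=True)    # static patch to optimize
optimizer = torch.optim.Adam([patch], lr=0.01)
target_class = torch.tensor([0])                        # hypothetical target label

def apply_patch(image, patch):
    """Paste the patch onto a fixed region of the image (top-left corner here)."""
    patched = image.clone()
    patched[:, :, :64, :64] = patch.clamp(0, 1)
    return patched

for step in range(100):
    image = torch.rand(1, 3, 224, 224)                  # placeholder scene; real data in practice
    # Sample a random environmental transformation per step so the single static
    # patch is averaged over many conditions during offline training.
    brightness = 0.7 + 0.6 * torch.rand(1).item()
    angle = float(torch.empty(1).uniform_(-15.0, 15.0))
    transformed = TF.rotate(TF.adjust_brightness(apply_patch(image, patch), brightness), angle)
    loss = torch.nn.functional.cross_entropy(model(transformed), target_class)
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
```

Because the patch is fixed after this optimization, any environmental condition outside the sampled transformation ranges can degrade its effectiveness, which is the instability the abstract refers to.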

