State-of-the-art optical-based physical adversarial attacks for deep learning computer vision systems
[1] Zheng Wang, et al. Physical Adversarial Attack meets Computer Vision: A Decade Survey, 2022, arXiv.
[2] Chen-Hao Hu, et al. Adversarial Zoom Lens: A Novel Physical-World Attack to DNNs, 2022, arXiv.
[3] Hao Li, et al. Adversarial Attack and Defense: A Survey, 2022, Electronics.
[4] Deming Zhai, et al. Shadows can be Dangerous: Stealthy and Effective Physical-world Adversarial Attack by Natural Phenomenon, 2022, 2022 IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR).
[5] Xiaolin Hu, et al. Adversarial Texture for Fooling Person Detectors in the Physical World, 2022, 2022 IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR).
[6] C. Joo, et al. Engineering pupil function for optical adversarial attacks, 2022, Optics Express.
[7] Wenji Mao, et al. Adversarial Perturbation Defense on Deep Neural Networks, 2021, ACM Comput. Surv.
[8] Stanley H. Chan, et al. Optical Adversarial Attack, 2021, 2021 IEEE/CVF International Conference on Computer Vision Workshops (ICCVW).
[9] Ajmal Mian, et al. Advances in adversarial attacks and defenses in computer vision: A survey, 2021, IEEE Access.
[10] Tianyou Chai, et al. Intelligent Manufacturing for the Process Industry Driven by Industrial Artificial Intelligence, 2021, Engineering.
[11] Jilin Li, et al. Adv-Makeup: A New Imperceptible and Transferable Attack on Face Recognition, 2021, IJCAI.
[12] Yujie Li, et al. Adaptive Square Attack: Fooling Autonomous Cars With Adversarial Traffic Signs, 2021, IEEE Internet of Things Journal.
[13] Xingxing Wei, et al. Adversarial Sticker: A Stealthy Attack Method in the Physical World, 2021, IEEE Transactions on Pattern Analysis and Machine Intelligence.
[14] Yuan He, et al. Adversarial Laser Beam: Effective Physical-World Attack to DNNs in a Blink, 2021, 2021 IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR).
[15] Debdeep Mukhopadhyay, et al. A survey on adversarial attacks and defences, 2021, CAAI Trans. Intell. Technol.
[16] Felix Heide, et al. Adversarial Imaging Pipelines, 2021, 2021 IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR).
[17] Jianmin Li, et al. Fooling thermal infrared pedestrian detectors in real world using small bulbs, 2021, AAAI.
[18] Asaf Shabtai, et al. The Translucent Patch: A Physical and Universal Attack on Object Detectors, 2020, 2021 IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR).
[19] Z. L. Jiang, et al. An Illumination Modulation-Based Adversarial Attack Against Automated Face Recognition System, 2020, Inscrypt.
[20] Haibin Ling, et al. SPAA: Stealthy Projector-based Adversarial Attacks on Deep Image Classifiers, 2020, 2022 IEEE Conference on Virtual Reality and 3D User Interfaces (VR).
[21] Earlence Fernandes, et al. Invisible Perturbations: Physical Adversarial Examples Exploiting the Rolling Shutter Effect, 2020, 2021 IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR).
[22] Ivan Martinovic, et al. SLAP: Improving Physical Adversarial Examples with Short-Lived Adversarial Perturbations, 2020, USENIX Security Symposium.
[23] Qi Alfred Chen, et al. Towards Robust LiDAR-based Perception in Autonomous Driving: General Black-box Adversarial Sensor Attack and Countermeasures, 2020, USENIX Security Symposium.
[24] Tommaso Di Noia, et al. A survey on Adversarial Recommender Systems: from Attack/Defense strategies to Generative Adversarial Networks, 2020.
[25] Xianglong Liu, et al. Bias-Based Universal Adversarial Patch Attack for Automatic Check-Out, 2020, ECCV.
[26] Hao Yang, et al. Adversarial Light Projection Attacks on Face Recognition Systems: A Feasibility Study, 2020, 2020 IEEE/CVF Conference on Computer Vision and Pattern Recognition Workshops (CVPRW).
[27] Shie Mannor, et al. Over-the-Air Adversarial Flickering Attacks against Video Recognition Networks, 2020, 2021 IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR).
[28] Cihang Xie, et al. Universal Physical Camouflage Attacks on Object Detectors, 2019, 2020 IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR).
[29] Xiaojiang Du, et al. VLA: A Practical Visible Light-based Attack on Face Recognition Systems in Physical World, 2019, Proc. ACM Interact. Mob. Wearable Ubiquitous Technol.
[30] Sergio Casas, et al. End-To-End Interpretable Neural Motion Planner, 2019, 2019 IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR).
[31] J. Zico Kolter, et al. Adversarial camera stickers: A physical camera-based attack on deep learning systems, 2019, ICML.
[32] Nicole Nichols, et al. Projecting Trouble: Light based Adversarial Attacks on Deep Learning Classifiers, 2018, AAAI Fall Symposium: ALEC.
[33] Jiliang Zhang, et al. Adversarial Examples: Opportunities and Challenges, 2018, IEEE Transactions on Neural Networks and Learning Systems.
[34] Nicholas Paul, et al. Application of HDR algorithms to solve direct sunlight problems when autonomous vehicles using machine vision systems are driving into sun, 2018, Comput. Ind.
[35] Atul Prakash, et al. Robust Physical-World Attacks on Deep Learning Visual Classification, 2018, 2018 IEEE/CVF Conference on Computer Vision and Pattern Recognition.
[36] Xiaofeng Wang, et al. Invisible Mask: Practical Attacks on Face Recognition with Infrared, 2018, arXiv.
[37] David A. Wagner, et al. Obfuscated Gradients Give a False Sense of Security: Circumventing Defenses to Adversarial Examples, 2018, ICML.
[38] Ajmal Mian, et al. Threat of Adversarial Attacks on Deep Learning in Computer Vision: A Survey, 2018, IEEE Access.
[39] Quoc V. Le, et al. Intriguing Properties of Adversarial Examples, 2017, ICLR.
[40] Kouichi Sakurai, et al. One Pixel Attack for Fooling Deep Neural Networks, 2017, IEEE Transactions on Evolutionary Computation.
[41] David A. Wagner, et al. Towards Evaluating the Robustness of Neural Networks, 2016, 2017 IEEE Symposium on Security and Privacy (SP).
[42] Samy Bengio, et al. Adversarial examples in the physical world, 2016, ICLR.
[43] Ananthram Swami, et al. The Limitations of Deep Learning in Adversarial Settings, 2015, 2016 IEEE European Symposium on Security and Privacy (EuroS&P).
[44] Thomas Brox, et al. U-Net: Convolutional Networks for Biomedical Image Segmentation, 2015, MICCAI.
[45] Jonathon Shlens, et al. Explaining and Harnessing Adversarial Examples, 2014, ICLR.
[46] Ming Yang, et al. DeepFace: Closing the Gap to Human-Level Performance in Face Verification, 2014, 2014 IEEE Conference on Computer Vision and Pattern Recognition.
[47] Joan Bruna, et al. Intriguing properties of neural networks, 2013, ICLR.
[48] Yanjie Li, et al. A Physical-World Adversarial Attack Against 3D Face Recognition, 2022, arXiv.
[49] Chen-Hao Hu, et al. Adversarial Neon Beam: Robust Physical-World Adversarial Attack to DNNs, 2022, arXiv.
[50] Liehuang Zhu, et al. Effective and Robust Physical-World Attacks on Deep Learning Face Recognition Systems, 2021, IEEE Transactions on Information Forensics and Security.
[51] Yanmao Man, et al. Poster: Perceived Adversarial Examples, 2019.