Background
Type: Article

An Optimized Non-deep Learning Defense Against Adversarial Attacks for Pedestrian Detection

Journal: Journal of Signal Processing Systems (ISSN 1939-8115)
Year: December 2024
Volume: 96
Pages: 763 - 777
Etehadi-Abari M., Naghsh-Nilchi A.R., Hoseinnezhad R.
DOI: 10.1007/s11265-024-01941-8
Language: English

Abstract

Deep learning models often lack robustness against adversarial attacks, which can deceive classifiers and limit their use in safety-critical applications such as pedestrian detection. The robustness of pedestrian detection methods remains underexplored, and current defenses often prove unsuccessful because they rely on deep learning architectures. This paper introduces a pedestrian detection approach specifically designed to enhance robustness against adversarial attacks. Our method’s resilience stems from three key elements. First, it employs a novel hand-crafted feature extraction method that yields features less susceptible to minor perturbations than the irrelevant and vulnerable features extracted by deep learning models. Second, our non-deep model has no gradients, rendering many gradient-based adversarial attacks, such as FGSM and PGD, ineffective. Third, it uses a novel ant colony optimization technique with a tailored evaluation function to select resilient feature subsets. Extensive experiments demonstrate that our approach maintains detection accuracy comparable to state-of-the-art methods on clean data while exhibiting robustness against adversarial attacks. © The Author(s), under exclusive licence to Springer Science+Business Media, LLC, part of Springer Nature 2025.
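For context on the second element: FGSM (Goodfellow et al., 2015) constructs an adversarial example from the gradient of the training loss J with respect to the input x,

$$ x_{\mathrm{adv}} = x + \epsilon \cdot \operatorname{sign}\!\big(\nabla_x J(\theta, x, y)\big), $$

and PGD iterates a projected variant of this step. A classifier with no differentiable computation graph exposes no $\nabla_x J$ to the attacker, so these updates cannot be formed against it directly; this is the sense in which gradient-based attacks become inapplicable (gradient-free and transfer-based attacks are a separate question, which the "many" above leaves open).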
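The abstract only names the third element; it does not give the algorithm. As a rough, generic illustration of ant-colony feature-subset selection (not the authors' method, whose tailored evaluation function targets adversarial robustness), a minimal Python sketch might look like the following. Every name and parameter here is hypothetical: aco_feature_selection, the plug-in evaluate function, the pheromone update rule, and the nearest-centroid stand-in scorer in the demo are illustrative choices, not details from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

def aco_feature_selection(evaluate, n_features, n_ants=20, n_iters=30,
                          subset_size=10, rho=0.1, alpha=1.0):
    """Generic ant-colony feature-subset selection (illustrative sketch).

    evaluate(subset) -> fitness (higher is better). Any scoring function
    can be plugged in; the paper's robustness-aware evaluation function
    is not reproduced here.
    """
    pheromone = np.ones(n_features)            # pheromone level per feature
    best_subset, best_score = None, -np.inf
    for _ in range(n_iters):
        subsets, scores = [], []
        for _ in range(n_ants):
            # Each ant samples a subset; features with more pheromone
            # are proportionally more likely to be chosen.
            p = pheromone ** alpha
            subset = rng.choice(n_features, size=subset_size,
                                replace=False, p=p / p.sum())
            score = evaluate(subset)
            subsets.append(subset)
            scores.append(score)
            if score > best_score:
                best_subset, best_score = subset, score
        # Evaporate, then deposit pheromone proportional to fitness.
        pheromone *= (1.0 - rho)
        for subset, score in zip(subsets, scores):
            pheromone[subset] += score
    return best_subset, best_score

# Toy demo: 50 features, only the first 10 carry class signal.
X = rng.normal(size=(200, 50))
y = rng.integers(0, 2, size=200)
X[y == 1, :10] += 1.5

def evaluate(subset):
    # Nearest-centroid training accuracy on the chosen features
    # (a stand-in for a real evaluation function).
    Xs = X[:, subset]
    c0, c1 = Xs[y == 0].mean(0), Xs[y == 1].mean(0)
    pred = (np.linalg.norm(Xs - c1, axis=1)
            < np.linalg.norm(Xs - c0, axis=1)).astype(int)
    return (pred == y).mean()

subset, score = aco_feature_selection(evaluate, n_features=50)
print(sorted(subset), round(score, 3))
```

In this toy run the pheromone mass concentrates on the informative features because ants that include them score higher and deposit more; a robustness-aware evaluation function would instead steer the colony toward features that remain discriminative under perturbation.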