Using traditional image kernels and image processing techniques to harden convolutional neural networks against adversarial attacks
Kiggins, Andrew and Edu, Jide; Manulis, Mark, ed. (2025) Using traditional image kernels and image processing techniques to harden convolutional neural networks against adversarial attacks. In: Applied Cryptography and Network Security Workshops. Lecture Notes in Computer Science. Springer, DEU, pp. 185-203. ISBN 9783032017994 (https://doi.org/10.1007/978-3-032-01799-4_11)
Text: Kiggins-Edu-Springer-ACNS-2025-Using-traditional-image-kernels-and-image-processing-techniques-to-harden-convolutional-neural-networks.pdf
Accepted Author Manuscript (2MB)
Abstract
Convolutional Neural Networks (CNNs) are the primary image classification method, especially in safety-critical physical environments such as autonomous driving and industrial automation. However, CNNs are vulnerable to adversarial attacks in which noise can be added to an image to deceive the classifier. This can lead CNNs to make incorrect predictions with high confidence, posing significant threats to physical systems. Although there are various defense mechanisms against adversarial attacks, many are unsuitable for safety-critical applications due to possible image degradation or high computational costs. In this research, we investigate the use of traditional image denoising techniques as a defense against adversarial attacks in environments with limited computational resources. We evaluated three denoising methods, median filtering, Gaussian filtering, and the Markov chain Monte Carlo (MCMC) method, under real-world conditions. The results demonstrate that these three methods not only reduce the impact of adversarial attacks but also surpass the state-of-the-art defense technique, APE-GAN, in speed while preserving prediction accuracy. Our findings show that traditional denoising techniques could provide a practical and efficient defense against adversarial attacks in low-power, safety-critical systems.
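To illustrate the idea behind the defenses evaluated in the chapter, the sketch below shows a median-filter preprocessing step: each pixel is replaced by the median of its neighbourhood, which suppresses the kind of high-frequency perturbations adversarial noise introduces. This is a minimal pure-Python illustration, not the authors' implementation; a real low-power deployment would use an optimised routine (e.g. `scipy.ndimage.median_filter`) before passing the image to the CNN.

```python
from statistics import median

def median_filter(image, k=3):
    """Apply a k x k median filter to a 2D greyscale image (list of lists)."""
    h, w = len(image), len(image[0])
    r = k // 2
    out = [[0] * w for _ in range(h)]
    for y in range(h):
        for x in range(w):
            # Gather the k x k neighbourhood, clamping at the image border.
            window = [
                image[min(max(y + dy, 0), h - 1)][min(max(x + dx, 0), w - 1)]
                for dy in range(-r, r + 1)
                for dx in range(-r, r + 1)
            ]
            out[y][x] = median(window)
    return out

# A flat patch with a single adversarial "spike" pixel:
noisy = [[10, 10, 10],
         [10, 99, 10],
         [10, 10, 10]]
print(median_filter(noisy)[1][1])  # spike is removed: prints 10
```

Gaussian filtering works analogously but replaces the median with a distance-weighted average, trading sharper edge preservation for smoother noise suppression.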
ORCID iDs
Kiggins, Andrew and Edu, Jide (ORCID: https://orcid.org/0000-0003-1325-8740); Manulis, Mark
Item type: Book Section
ID code: 94546
Dates: 23 October 2025 (Published); 22 October 2025 (Published Online); 17 March 2025 (Accepted)
Subjects: Science > Mathematics > Electronic computers. Computer science > QA76-890
Department: Faculty of Science > Computer and Information Sciences
Depositing user: Pure Administrator
Date deposited: 28 Oct 2025 10:21
Last modified: 06 Jan 2026 09:11
URI: https://strathprints.strath.ac.uk/id/eprint/94546