Exploring spiking neural networks (SNN) for low Size, Weight, and Power (SWaP) benefits

Bihl, Trevor J., Farr, Patrick, Di Caterina, Gaetano, Kirkland, Paul, Vicente Sola, Alex, Manna, Davide, Liu, Jundong and Combs, Kara (2024) Exploring spiking neural networks (SNN) for low Size, Weight, and Power (SWaP) benefits. In: Bui, Tung X., ed. Proceedings of the 57th Hawaii International Conference on System Sciences. Proceedings of the Annual Hawaii International Conference on System Sciences. Shidler College of Business, University of Hawaii at Manoa, USA, pp. 7561-7570. ISBN 9780998133171 (https://hdl.handle.net/10125/107294)


Abstract

Size, Weight, and Power (SWaP) concerns are growing as artificial intelligence (AI) spreads to edge applications. AI algorithms such as artificial neural networks (ANNs) have revolutionized many fields, e.g. computer vision (CV), but at a large computational and power cost. Biological intelligence is notably more computationally efficient. Neuromorphic edge processors and spiking neural networks (SNNs) aim to follow biology more closely through spike-based operation: an SNN only "fires/spikes" when needed, yielding sparsity and lower-SWaP operation than traditional ANNs. The SWaP trade space opened by embracing neuromorphic computing has not been studied extensively. To address this, we present a repeatable and scalable apples-to-apples comparison of traditional ANNs and SNNs for edge processing, with demonstration on both classical and neuromorphic edge hardware. Results show that SNNs combined with neuromorphic hardware can provide CV accuracy comparable to ANNs at 1/10th the power.
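
As an illustration of the event-driven sparsity the abstract refers to, the sketch below simulates a generic leaky integrate-and-fire (LIF) neuron, the basic unit of most SNNs: the neuron emits a spike only when its membrane potential crosses a threshold, so downstream computation is triggered only for those sparse events. The parameter names and values are illustrative assumptions, not taken from the paper or the hardware it evaluates.

```python
import numpy as np

def lif_neuron(input_current, threshold=1.0, leak=0.9, v_reset=0.0):
    """Simulate a single leaky integrate-and-fire (LIF) neuron over time.

    Returns a binary spike train: the neuron "fires" only when its
    membrane potential crosses the threshold, which is the source of
    the sparse, event-driven activity exploited by neuromorphic hardware.
    """
    v = 0.0
    spikes = np.zeros_like(input_current)
    for t, i_t in enumerate(input_current):
        v = leak * v + i_t          # leaky integration of the input
        if v >= threshold:          # spike when the threshold is crossed
            spikes[t] = 1.0
            v = v_reset             # reset the membrane potential
    return spikes

# Example (illustrative): a weak, noisy input drives only a few spikes,
# so most timesteps carry no activity at all.
rng = np.random.default_rng(0)
spike_train = lif_neuron(rng.uniform(0.0, 0.3, size=100))
print("Fraction of timesteps with a spike:", spike_train.mean())
```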