FPGA acceleration of a quantized neural network for remote-sensed cloud detection

Reiter, Philippe and Karagiannakis, Philipp and Ireland, Murray and Greenland, Steve and Crockett, Louise (2020) FPGA acceleration of a quantized neural network for remote-sensed cloud detection. In: 7th International Workshop on On-Board Payload Data Compression, 2020-09-21 - 2020-09-23, Virtual.

Accepted Author Manuscript (PDF): Reiter_etal_OBPDC_2020_FPGA_acceleration_of_a_quantized_neural_network_for_remote_sensed.pdf (322kB)

Abstract

The capture and transmission of remote-sensed imagery for Earth observation is expensive in terms of both computation and bandwidth. In analyses of remote-sensed imagery in the visual band, atmospheric cloud cover can obstruct up to two-thirds of observations, resulting in costly imagery being discarded. Mission objectives and satellite operational details vary; however, assuming a cloud-free observation requirement, effective cloud detection can double the volume of useful data downlinked and halve its delivery cost. A minimal-resource, real-time inference neural network is ideally suited to performing automatic cloud detection, both for pre-processing captured images prior to transmission and for preventing unnecessary images from being taken by larger payload sensors. Much of the hardware complexity of modern neural network implementations resides in high-precision floating-point calculation pipelines. In recent years, research has been conducted into identifying quantized, or low-integer-precision, equivalents of known deep learning models that do not require the extensive resources of their full-precision, floating-point counterparts. Our work leverages existing research on binary and quantized neural networks to develop a real-time, remote-sensed cloud detection solution using a commodity field-programmable gate array. It builds on the development of the Forwards Looking Imager for predictive cloud detection by Craft Prospect, a space engineering practice based in Glasgow, UK. The synthesized cloud detection accelerator achieved an inference throughput of 358.1 images per second with a maximum power consumption of 2.4 W. This throughput is an order of magnitude faster than alternative algorithmic options for the Forwards Looking Imager, at the cost of roughly a one-third reduction in classification accuracy, and approximately two orders of magnitude faster than the CloudScout deep neural network deployed with HyperScout 2 on the European Space Agency PhiSat-1 mission. Strategies for incorporating fault-tolerance mechanisms are also expounded.
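The paper itself is not reproduced here, but the abstract's central idea, replacing floating-point arithmetic with low-integer or binary precision so that inference fits in modest FPGA resources, can be illustrated with a minimal sketch. The layer sizes, function names and scaling below are hypothetical and are not taken from the paper; they only show how a binarized dense layer reduces multiply-accumulate work to integer additions.

```python
# Illustrative sketch only (not the paper's implementation): a binarized dense
# layer in which weights and activations are constrained to {-1, +1}, so the
# dot product needs no floating-point multipliers. Dimensions are arbitrary.
import numpy as np

def binarize(x: np.ndarray) -> np.ndarray:
    """Map real values to {-1, +1} by sign (zeros mapped to +1)."""
    return np.where(x >= 0, 1, -1).astype(np.int8)

def binary_dense(activations: np.ndarray, weights: np.ndarray) -> np.ndarray:
    """Binarized fully connected layer: products of +/-1 values reduce to
    sign flips and integer accumulation, which map cheaply onto FPGA logic."""
    a = binarize(activations)
    w = binarize(weights)
    # int32 accumulation of +/-1 products; no floating-point datapath required
    return a.astype(np.int32) @ w.astype(np.int32).T

# Toy usage: a 64-dimensional input through a layer of 16 binarized neurons
rng = np.random.default_rng(0)
x = rng.standard_normal(64)
W = rng.standard_normal((16, 64))
logits = binary_dense(x, W)
print(logits.shape)  # (16,)
```

In hardware, the same +/-1 products are typically realised as XNOR and popcount operations rather than integer multiplies, which is what makes such networks attractive for commodity FPGAs.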

ORCID iDs

Reiter, Philippe, Karagiannakis, Philipp, Ireland, Murray, Greenland, Steve and Crockett, Louise (ORCID: https://orcid.org/0000-0003-4436-0254)