Efficient training of interval neural networks for imprecise training data

Sadeghi, Jonathan and de Angelis, M. and Patelli, Edoardo (2019) Efficient training of interval neural networks for imprecise training data. Neural Networks, 118. pp. 338-351. ISSN 0893-6080 (https://doi.org/10.1016/j.neunet.2019.07.005)

Text (PDF): Sadeghi_etal_NN_2019_Efficient_training_of_interval_neural_networks_for_imprecise.pdf
Accepted Author Manuscript
License: Creative Commons Attribution-NonCommercial-NoDerivatives 4.0

Download (721 kB)

Abstract

This paper describes a robust and computationally feasible method to train Neural Networks and quantify their uncertainty. Specifically, we propose a backpropagation algorithm for Neural Networks with interval predictions. To maintain numerical stability, we propose minimising the maximum of the batch of errors at each step. Our approach can accommodate incertitude in the training data, and therefore adversarial examples from a commonly used attack model can be trivially accounted for. We present results on a test function example and a more realistic engineering test case. The reliability of the predictions of these networks is guaranteed by the non-convex Scenario approach to chance-constrained optimisation, which takes place following training and is hence robust to the performance of the optimiser. A key result is that, by using minibatches of size M, the complexity of the proposed approach scales as O(M·N_iter) and does not depend upon the number of training data points, as it does with other Interval Predictor Model methods. In addition, troublesome penalty function methods are avoided. To the authors' knowledge, this contribution presents the first computationally feasible approach for dealing with convex set based epistemic uncertainty in huge datasets.
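To illustrate the two ingredients the abstract describes, the following is a minimal sketch, not the authors' implementation: a simple linear model (standing in for the neural network) is trained by taking a subgradient step on the single worst-error sample of each minibatch of size M, so the per-step cost is O(M) and independent of the dataset size; the interval radius is then chosen after training so that the predicted interval encloses every observed residual, in the spirit of the post-training Scenario step. All names, the toy data, and the linear architecture are assumptions for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy data (assumption): noisy linear target y = 2x + 0.5 + noise.
X = rng.uniform(-1.0, 1.0, size=(200, 1))
y = 2.0 * X[:, 0] + 0.5 + rng.normal(0.0, 0.1, size=200)

w = np.zeros(1)
b = 0.0
lr = 0.05  # constant subgradient step size (assumption)

# Minimax training: each iteration takes a subgradient step on the
# single largest-error sample in the minibatch, so one step costs
# O(M) regardless of how many training points exist overall.
for it in range(2000):
    idx = rng.choice(len(X), size=32, replace=False)  # minibatch, M = 32
    Xb, yb = X[idx], y[idx]
    errs = Xb @ w + b - yb
    k = np.argmax(np.abs(errs))      # worst sample in the batch
    g = np.sign(errs[k])             # subgradient of max |error|
    w -= lr * g * Xb[k]
    b -= lr * g

# Post-training calibration (scenario-style, simplified): pick the
# interval radius r so that [f(x) - r, f(x) + r] covers every
# training residual, independently of how well the optimiser did.
residuals = np.abs(X @ w + b - y)
r = residuals.max()
coverage = np.mean(residuals <= r + 1e-12)
```

The separation matters: the training loop only has to produce a reasonable centre function, while the coverage guarantee comes entirely from the calibration step afterwards, which is why the abstract notes the method is robust to the performance of the optimiser.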