Improving knowledge distillation for non-intrusive load monitoring through explainability guided learning

Batic, Djordje and Tanoni, Giulia and Stankovic, Lina and Stankovic, Vladimir and Principi, Emanuele (2023) Improving knowledge distillation for non-intrusive load monitoring through explainability guided learning. In: ICASSP 2023 - 2023 IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP). IEEE, Piscataway, NJ. ISBN 9781728163277 (https://doi.org/10.1109/ICASSP49357.2023.10095109)

Filename: Batic_etal_ICASSP_2023_Improving_knowledge_distillation_for_non_intrusive_load_monitoring.pdf
Accepted Author Manuscript
License: Strathprints license 1.0


Abstract

Knowledge distillation (KD) is a machine learning technique widely used in recent years for domain adaptation and complexity reduction. It relies on a Student-Teacher mechanism to transfer the knowledge of a large and complex Teacher network into a smaller Student model. Given the inherent complexity of large Deep Neural Network (DNN) models, and the need for deployment on edge devices with limited resources, complexity reduction techniques have become a hot topic in the Non-intrusive Load Monitoring (NILM) community. Recent literature in NILM has devoted increased effort to domain adaptation and architecture reduction via KD. However, the mechanism behind the transfer of knowledge from the Teacher to the Student is not clearly understood. In this work, we address this issue by placing the KD NILM approach in a framework of explainable AI (XAI). We identify the main inconsistency in the transfer of explainable knowledge, and exploit this information to propose a method for improving KD through explainability-guided learning. We evaluate our approach on a variety of appliances and domain adaptation scenarios and demonstrate that resolving inconsistencies in the transfer of explainable knowledge can improve predictive performance.
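To make the general idea concrete, the following is a minimal, hypothetical PyTorch sketch of explainability-guided knowledge distillation for a NILM-style regression task. It is not the authors' implementation: the toy SmallCNN architecture, the saliency and kd_xai_loss functions, the loss weights alpha and beta, and the random stand-in data are all illustrative assumptions. The sketch only shows how a Student could be trained to match both the Teacher's predictions and its input-gradient explanations.

# Illustrative sketch only (not the paper's method): knowledge distillation for
# a NILM-style regression task, augmented with a hypothetical explanation-
# consistency term that aligns input-gradient saliency maps of Student and Teacher.
import torch
import torch.nn as nn

class SmallCNN(nn.Module):
    """Toy 1D CNN mapping an aggregate power window to an appliance trace."""
    def __init__(self, channels):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv1d(1, channels, kernel_size=5, padding=2), nn.ReLU(),
            nn.Conv1d(channels, 1, kernel_size=5, padding=2),
        )
    def forward(self, x):          # x: (batch, 1, window_length)
        return self.net(x)

def saliency(model, x):
    """Input-gradient saliency map of the model's output w.r.t. the input."""
    x = x.clone().requires_grad_(True)
    out = model(x).sum()
    grad, = torch.autograd.grad(out, x, create_graph=True)
    return grad

def kd_xai_loss(student, teacher, x, y, alpha=0.5, beta=0.1):
    """Task loss + distillation loss + explanation-consistency loss (assumed weights)."""
    mse = nn.MSELoss()
    with torch.no_grad():
        t_pred = teacher(x)
    s_pred = student(x)
    task_loss = mse(s_pred, y)                         # fit ground-truth appliance trace
    distill_loss = mse(s_pred, t_pred)                 # mimic the Teacher's predictions
    expl_loss = mse(saliency(student, x),              # align Student/Teacher saliency
                    saliency(teacher, x).detach())
    return task_loss + alpha * distill_loss + beta * expl_loss

# Usage sketch: one optimisation step on random stand-in data.
teacher, student = SmallCNN(32), SmallCNN(8)
opt = torch.optim.Adam(student.parameters(), lr=1e-3)
x = torch.randn(4, 1, 128)       # aggregate power windows (stand-in data)
y = torch.randn(4, 1, 128)       # appliance-level targets (stand-in data)
loss = kd_xai_loss(student, teacher, x, y)
opt.zero_grad()
loss.backward()
opt.step()

In this kind of setup, the explanation-consistency term penalises the Student when its saliency map disagrees with the Teacher's, which is one plausible way to encode the paper's goal of resolving inconsistencies in the transfer of explainable knowledge.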

ORCID iDs

Batic, Djordje (ORCID: https://orcid.org/0000-0002-7647-6641), Tanoni, Giulia, Stankovic, Lina (ORCID: https://orcid.org/0000-0002-8112-1976), Stankovic, Vladimir (ORCID: https://orcid.org/0000-0002-1075-2420) and Principi, Emanuele