An active learning framework for the low-frequency Non-Intrusive Load Monitoring problem

Todic, Tamara and Stankovic, Vladimir and Stankovic, Lina (2023) An active learning framework for the low-frequency Non-Intrusive Load Monitoring problem. Applied Energy, 341. 121078. ISSN 0306-2619

Final Published Version
License: Creative Commons Attribution 4.0


With the widespread deployment of smart meters worldwide, quantification of the energy used by individual appliances via Non-Intrusive Load Monitoring (NILM), i.e., virtual submetering, is an emerging application to inform energy management within buildings. Low-frequency NILM refers to NILM algorithms designed to perform load disaggregation at sampling rates on the order of seconds and minutes, as per smart meter data availability. Recently, many deep learning solutions for NILM have appeared in the literature, with promising results. However, besides requiring large labelled datasets, the proposed deep learning models are not flexible and usually under-perform when tested in a new environment, affecting scalability. The dynamic nature of appliance ownership and usage inhibits the performance of supervised NILM models and requires large amounts of training data. Transfer learning approaches are commonly used to overcome this issue, but they often assume the availability of good-quality labelled data from the new environment. We propose an active learning framework that is able to learn and update the deep learning NILM model from small amounts of data, for transfer to a new environment. We explore the suitability of different types of acquisition functions, which determine which unlabelled inputs are most valuable to query. We also perform a sensitivity analysis of the effect of hyperparameters on model performance. In addition, we propose a modification to the state-of-the-art BatchBALD acquisition function to address its high computational complexity. Our proposed framework achieves an optimal accuracy-labelling-effort trade-off with only 5-15% of the query pool labelled. The results on the REFIT dataset demonstrate the potential of the proposed active learning framework to improve transferability and reduce the cost of labelling.
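The acquisition step described above can be sketched as follows. This is an illustrative BALD-style mutual-information scorer over Monte-Carlo stochastic forward passes, not the paper's BatchBALD variant; the names (`bald_score`, `select_queries`) and the toy query pool are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(0)

def bald_score(probs):
    """BALD mutual information between predictions and model parameters.
    probs: (n_mc, n_points, n_classes) softmax outputs from stochastic
    forward passes (e.g. MC dropout)."""
    mean_p = probs.mean(axis=0)
    # H[E[p]]: entropy of the averaged predictive distribution
    h_mean = -(mean_p * np.log(mean_p + 1e-12)).sum(axis=-1)
    # E[H[p]]: average entropy of the individual stochastic predictions
    mean_h = -(probs * np.log(probs + 1e-12)).sum(axis=-1).mean(axis=0)
    return h_mean - mean_h

def select_queries(probs, budget):
    """Indices of the `budget` highest-scoring pool points to label."""
    return np.argsort(bald_score(probs))[-budget:]

# Toy query pool: 200 unlabelled windows, 10 stochastic passes, 2 classes.
probs = rng.dirichlet(np.ones(2), size=(10, 200))
budget = int(0.05 * 200)          # label only 5% of the query pool
queries = select_queries(probs, budget)
```

Scoring points individually like this can select redundant, near-duplicate windows; BatchBALD instead scores the joint mutual information of a candidate batch, which is precisely what makes it computationally expensive.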
Unlike the common approach of retraining the entire model once a new set of labels is provided, we demonstrate that full retraining is not necessary: a fine-tuning approach can offer a good trade-off between the performance achieved and the computational resources needed.
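A minimal sketch of this fine-tuning idea, using a toy two-layer model in place of a real NILM network: the pretrained feature extractor `W1` is frozen and only the head `w2` is updated on the newly labelled data. All names and data here are hypothetical, not the paper's architecture.

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy stand-in for a pretrained NILM network: a frozen feature
# extractor W1 plus a small trainable regression head w2.
W1 = rng.normal(size=(8, 4))              # pretrained weights, kept frozen
w2 = rng.normal(size=4)                   # head, fine-tuned on new labels

X = rng.normal(size=(32, 8))              # newly labelled target-house windows
y = np.tanh(X @ W1) @ rng.normal(size=4)  # synthetic regression targets

mse_before = np.mean((np.tanh(X @ W1) @ w2 - y) ** 2)

lr = 0.1
W1_frozen = W1.copy()
for _ in range(1000):
    h = np.tanh(X @ W1)                   # features only; no gradient into W1
    err = h @ w2 - y
    w2 -= lr * h.T @ err / len(X)         # gradient step on the head alone

mse_after = np.mean((np.tanh(X @ W1) @ w2 - y) ** 2)
```

Because only the head's gradients are computed and applied, each update touches a small fraction of the parameters, which is the source of the computational saving over full retraining.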