Journal of Asian Architecture and Building Engineering (Nov 2024)
Sequential attention deep learning architecture with unsupervised pre-training for interpretable and accurate building energy prediction with limited data
Abstract
Predicting building energy consumption is important for improving energy efficiency and reducing carbon emissions. However, deep learning (DL) models for energy consumption forecasting often generalize poorly and lack explainability. To address these challenges, this research adopts a sequential attention deep learning architecture (SADLA) that uses attention mechanisms to learn the importance of individual features for predicting cooling energy consumption, helping building occupants and managers make informed decisions about energy optimization. The model was trained and tested on datasets that were either comprehensive or scarce in terms of recording period and number of predictors. The data were derived from a three-month field experiment in a commercial office building in Seoul, Korea. For comparative assessment, models were also constructed with other prevalent algorithms: Long Short-Term Memory, deep neural networks, and Extreme Gradient Boosting. On the one-week, two-week, and reduced-feature datasets, the SADLA-based models surpassed the others in generalization (R² = 0.961, 0.967, and 0.976, respectively). However, all algorithms performed comparably on the three-month dataset, achieving an average R² of 0.970. These findings underscore the potential of the adopted model for addressing data scarcity in existing and new buildings for accurate cooling energy prediction.
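The abstract does not detail the SADLA internals, but sequential-attention architectures with unsupervised pre-training (TabNet being the best-known example) typically apply a sparse attention mask over the input features at each decision step and aggregate the per-step masks into a global feature-importance vector, which is the source of the interpretability claimed above. The sketch below illustrates that masking-and-aggregation idea only; the random logits stand in for a learned attentive transformer, and the variable names (`prior`, `gamma`, `n_steps`) are illustrative assumptions, not the paper's actual implementation.

```python
import numpy as np

def sparsemax(z):
    """Project logits onto the probability simplex, yielding a sparse
    distribution (many exact zeros), unlike softmax."""
    z_sorted = np.sort(z)[::-1]
    k = np.arange(1, len(z) + 1)
    cumsum = np.cumsum(z_sorted)
    support = z_sorted + 1.0 / k > cumsum / k
    k_max = k[support][-1]
    tau = (cumsum[k_max - 1] - 1.0) / k_max
    return np.maximum(z - tau, 0.0)

rng = np.random.default_rng(0)
n_features, n_steps, gamma = 5, 3, 1.3   # gamma: relaxation factor (assumed)
prior = np.ones(n_features)              # tracks how much each feature was already used
importance = np.zeros(n_features)

for step in range(n_steps):
    logits = rng.normal(size=n_features)  # stand-in for a learned attentive transformer
    mask = sparsemax(prior * logits)      # sparse attention over features (sums to 1)
    prior = prior * (gamma - mask)        # discourage reusing the same features next step
    importance += mask                    # aggregate per-step masks

importance /= importance.sum()            # normalized global feature importance
```

In a trained model, `importance` is what lets building managers see which predictors (e.g. outdoor temperature, occupancy) drive the cooling-energy forecast; here it merely demonstrates the mechanism on random logits.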
Keywords