Forecasting Energy Consumption of a Public Building Using Transformer and Support Vector Regression

Bibliographic Details
Published in: Energies (Basel), Vol. 16, No. 2, p. 966
Main Authors: Huang, Junhui; Kaewunruen, Sakdirat
Format: Journal Article
Language: English
Published: Basel: MDPI AG, 01.01.2023

Summary: Most of the Artificial Intelligence (AI) models currently used in energy forecasting are traditional and deterministic. Recently, a novel deep learning paradigm, called the 'transformer', has been developed, which adopts the mechanism of self-attention. Transformers are designed to better process and predict sequential data sets (i.e., historical time records) and to track relationships within the sequential data. So far, a few transformer-based applications have been established, but no industry-scale application exists for building energy forecasts. Accordingly, this study is the world's first to establish a transformer-based model to estimate the energy consumption of a real-scale university library and to benchmark it against a baseline Support Vector Regression (SVR) model. Using a large dataset spanning 1 September 2017 to 13 November 2021 at 30 min granularity, with four historical electricity readings used to estimate one future reading, the results show that the SVR (R² of 0.92) outperforms the transformer-based model (R² of 0.82). In the sensitivity analysis, the SVR model is more sensitive to the inputs closest to the output. These findings provide new insights into energy forecasting for either a specific building or a building cluster in a city. The influence of the number of inputs and outputs on the transformer-based model will be investigated in future work.
ISSN: 1996-1073
DOI: 10.3390/en16020966
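
As a rough illustration of the forecasting setup described in the summary (four lagged 30-min electricity readings predicting the next one, evaluated with R²), the following Python sketch frames a series as a supervised SVR problem using scikit-learn. This is a minimal sketch, not the authors' pipeline: the synthetic daily-cycle series, the chronological 80/20 split, and the SVR hyperparameters are all illustrative assumptions.

import numpy as np
from sklearn.svm import SVR
from sklearn.metrics import r2_score

def make_lagged(series, n_lags=4):
    """Build (X, y): each row of X holds n_lags consecutive readings,
    and y is the reading immediately after them (one-step-ahead target)."""
    X = np.array([series[i:i + n_lags] for i in range(len(series) - n_lags)])
    y = series[n_lags:]
    return X, y

# Synthetic stand-in for the 30-min meter readings (the real data run
# from 1 September 2017 to 13 November 2021); 48 slots = one day.
rng = np.random.default_rng(0)
t = np.arange(2000)
readings = 50 + 10 * np.sin(2 * np.pi * t / 48) + rng.normal(0, 1, t.size)

X, y = make_lagged(readings, n_lags=4)
split = int(0.8 * len(X))  # chronological split: no shuffling for time series
model = SVR(kernel="rbf", C=10.0, epsilon=0.1)  # hyperparameters are illustrative
model.fit(X[:split], y[:split])
print("R² on held-out readings:", r2_score(y[split:], model.predict(X[split:])))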