Mitigating Drift in Time Series Data with Noise Augmentation

Bibliographic Details
Published in: 2019 International Conference on Computational Science and Computational Intelligence (CSCI), pp. 227-230
Main Authors: Fields, Tonya; Hsieh, George; Chenou, Jules
Format: Conference Proceeding
Language: English
Published: IEEE, 01.12.2019

Summary: Machine learning (ML) models must be accurate to produce quality AI solutions. High accuracy is required both in the data and in the model built from that data. Online machine learning algorithms fit naturally with use cases that involve time series data. In online environments the data distribution can change over time, producing what is known as concept drift. Real-life, real-time machine learning algorithms operating in dynamic environments must be able to detect any drift or changes in the data distribution, and adapt and update the ML model in the face of data that changes over time. In this paper we present work on simulated drift added to time series ML models. We simulate drift on Multilayer Perceptron (MLP), Long Short-Term Memory (LSTM), Convolutional Neural Network (CNN), and Gated Recurrent Unit (GRU) models. Results show that ML models with flavors of recurrent neural networks (RNNs) are less sensitive to drift compared to other models. By adding noise to the training set, we can recover the accuracy of the model in the face of drift.
DOI:10.1109/CSCI49370.2019.00046
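
The summary describes two operations: simulating drift on time series data and adding noise to the training set to recover accuracy. Below is a minimal sketch of both ideas; the paper's exact drift mechanism, noise type, and magnitudes are not given here, so the mean-shift drift, the Gaussian noise, and the simulate_drift / augment_with_noise helpers are illustrative assumptions, not the authors' implementation.

import numpy as np

rng = np.random.default_rng(0)

def simulate_drift(X, shift=0.5):
    # Hypothetical drift simulation: add a constant offset to the test
    # windows so their distribution differs from the training distribution.
    # A mean shift is an assumption; the paper's drift may take other forms.
    return X + shift

def augment_with_noise(X, y, noise_std=0.05, copies=1):
    # Hypothetical noise augmentation: append Gaussian-jittered copies of
    # the training windows, duplicating labels unchanged. The noise type
    # and 0.05 standard deviation are assumptions for illustration.
    X_aug = [X] + [X + rng.normal(0.0, noise_std, X.shape) for _ in range(copies)]
    y_aug = [y] * (copies + 1)
    return np.concatenate(X_aug), np.concatenate(y_aug)

# Example: 100 training windows of 24 time steps with 1 feature each.
X_train = rng.random((100, 24, 1))
y_train = rng.random(100)
X_test = simulate_drift(rng.random((20, 24, 1)), shift=0.5)

X_train_aug, y_train_aug = augment_with_noise(X_train, y_train, copies=2)
print(X_train_aug.shape, y_train_aug.shape)  # (300, 24, 1) (300,)

A model (MLP, LSTM, CNN, or GRU) trained on X_train_aug would then be evaluated on the drifted X_test to measure how much of the lost accuracy the augmentation recovers.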