Discrepancy-Based Theory and Algorithms for Forecasting Non-Stationary Time Series

Bibliographic Details
Published in: Annals of Mathematics and Artificial Intelligence, Vol. 88, No. 4, pp. 367-399
Main Authors: Kuznetsov, Vitaly; Mohri, Mehryar
Format: Journal Article
Language: English
Published: Cham: Springer International Publishing, 01.04.2020
ISSN: 1012-2443, 1573-7470
DOI: 10.1007/s10472-019-09683-1

More Information
Summary: We present data-dependent learning bounds for the general scenario of non-stationary non-mixing stochastic processes. Our learning guarantees are expressed in terms of a data-dependent measure of sequential complexity and a discrepancy measure that can be estimated from data under some mild assumptions. Our learning bounds guide the design of new algorithms for non-stationary time series forecasting for which we report several favorable experimental results.