An Information-Theoretic Test for Dependence with an Application to the Temporal Structure of Stock Returns

Bibliographic Details
Published in: arXiv.org
Main Authors: Sher, Galen; Vitoria, Pedro
Format: Paper
Language: English
Published: Ithaca: Cornell University Library, arXiv.org, 01.05.2013
Summary: Information theory provides ideas for conceptualising information and measuring relationships between objects. It has found wide application in the sciences, but economics and finance have made surprisingly little use of it. We show that time series data can usefully be studied as information: by noting the relationship between statistical redundancy and dependence, we are able to use the results of information theory to construct a test for joint dependence of random variables. The test is in the same spirit as those developed by Ryabko and Astola (2005, 2006a,b), but differs from these in that we add extra randomness to the original stochastic process. It uses data compression to estimate the entropy rate of a stochastic process, which allows it to measure dependence among sets of random variables, unlike the existing entropy-based econometric literature, which is restricted to pairwise tests of dependence. We show how serial dependence may be detected in S&P500 and PSI20 stock returns over different sample periods and frequencies. We apply the test to synthetic data to judge its ability to recover known temporal dependence structures.
ISSN: 2331-8422
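
To make the summary's central idea concrete, here is a minimal Python sketch of a compression-based test for serial dependence. It assumes a discretized return series and uses zlib as the compressor; the function names, the permutation scheme, and the AR(1) example data are illustrative assumptions, not the authors' exact procedure (which adds extra randomness to the process and targets joint dependence among sets of variables). The key observation is the one in the summary: a dependent sequence is statistically redundant, so a good compressor shrinks it more than it shrinks versions of the same data with the temporal structure destroyed.

import zlib

import numpy as np


def compressed_bits(symbols: np.ndarray) -> int:
    # Length in bits of the zlib-compressed byte encoding of the symbol sequence.
    return 8 * len(zlib.compress(symbols.astype(np.uint8).tobytes(), 9))


def entropy_rate_estimate(symbols: np.ndarray) -> float:
    # Crude upper bound on the entropy rate: compressed bits per symbol.
    return compressed_bits(symbols) / len(symbols)


def serial_dependence_test(symbols: np.ndarray, n_shuffles: int = 200, seed: int = 0):
    # Shuffling destroys temporal structure while preserving the marginal
    # distribution, so if the observed sequence compresses markedly better
    # than its shuffles, it is redundant, i.e. serially dependent.
    rng = np.random.default_rng(seed)
    observed = compressed_bits(symbols)
    shuffled = np.array(
        [compressed_bits(rng.permutation(symbols)) for _ in range(n_shuffles)]
    )
    # One-sided permutation p-value: how often a shuffle compresses as well.
    p_value = (1 + np.sum(shuffled <= observed)) / (1 + n_shuffles)
    return observed, shuffled.mean(), p_value


# Hypothetical usage: discretize a synthetic AR(1) "return" series into
# quartile bins (a 4-symbol alphabet) and test for serial dependence.
rng = np.random.default_rng(1)
x = np.zeros(5000)
for t in range(1, len(x)):
    x[t] = 0.6 * x[t - 1] + rng.standard_normal()
symbols = np.digitize(x, np.quantile(x, [0.25, 0.5, 0.75]))
print(serial_dependence_test(symbols))

For the AR(1) example the observed sequence compresses noticeably better than its shuffles, giving a small p-value; replacing the AR(1) recursion with i.i.d. draws leaves the two compressed lengths close together. A general-purpose compressor stands in here for the entropy-rate estimator described in the summary.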