Spatiotemporal data analysis with chronological networks

Bibliographic Details
Published in: Nature Communications, Vol. 11, no. 1, p. 4036
Main Authors: Ferreira, Leonardo N., Vega-Oliveros, Didier A., Cotacallapa, Moshé, Cardoso, Manoel F., Quiles, Marcos G., Zhao, Liang, Macau, Elbert E. N.
Format: Journal Article
Language: English
Published: London: Nature Publishing Group UK, 12.08.2020

Summary: The number of spatiotemporal data sets has increased rapidly in recent years, which demands robust and fast methods to extract information from this kind of data. Here, we propose a network-based model, called chronnet, for spatiotemporal data analysis. The network construction process consists of dividing a geometric space into grid cells represented by nodes connected chronologically. Strong links in the network represent consecutive recurrent events between cells. The chronnet construction process is fast, making the model suitable for processing large data sets. Using artificial and real data sets, we show how chronnets can capture data properties beyond simple statistics, such as frequent patterns, spatial changes, outliers, and spatiotemporal clusters. Therefore, we conclude that chronnets represent a robust tool for the analysis of spatiotemporal data sets. Extracting central information from the ever-growing data generated in our lives calls for new data mining methods. Ferreira et al. show a simple model, called chronnets, that can capture frequent patterns, spatial changes, outliers, and spatiotemporal clusters.
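The construction described in the summary (grid cells as nodes, consecutive events linked chronologically, recurrent transitions accumulating edge weight) can be sketched in a few lines of Python. This is a minimal illustration under stated assumptions, not the authors' implementation: events are assumed to be (time, x, y) tuples, the grid is square with a user-chosen `cell_size`, and the names `build_chronnet` and `cell` are hypothetical.

```python
from collections import Counter

def build_chronnet(events, cell_size):
    """Build a chronnet from (t, x, y) events.

    Each event is mapped to a grid cell; cells of consecutive events
    (in time order) are linked, and repeated transitions between the
    same pair of cells increase that edge's weight.
    """
    def cell(x, y):
        # Map a point to the integer coordinates of its grid cell.
        return (int(x // cell_size), int(y // cell_size))

    events = sorted(events)          # order chronologically by t
    edges = Counter()                # (cell_a, cell_b) -> link weight
    for (t0, x0, y0), (t1, x1, y1) in zip(events, events[1:]):
        edges[(cell(x0, y0), cell(x1, y1))] += 1
    return edges

# Example: four events on a 2x2 grid (cell_size = 1.0).
ev = [(0, 0.2, 0.3), (1, 1.5, 0.4), (2, 0.1, 0.2), (3, 1.6, 0.7)]
net = build_chronnet(ev, cell_size=1.0)
# The (0, 0) -> (1, 0) transition occurs twice, so that link has weight 2.
```

The resulting weighted directed edge list could be loaded into any graph library for the downstream analyses the paper mentions (community detection for spatiotemporal clusters, degree statistics for outliers).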
ISSN: 2041-1723
DOI: 10.1038/s41467-020-17634-2