A method for unsupervised learning of coherent spatiotemporal patterns in multiscale data

Abstract

The unsupervised and principled diagnosis of multiscale data is a fundamental obstacle in modern scientific problems such as weather and climate prediction, neurology, epidemiology, and turbulence. Multiscale data are characterized by a combination of processes acting along multiple dimensions simultaneously, spatiotemporal scales spanning orders of magnitude, nonstationarity, and/or invariances such as translation and rotation. Existing methods are not well-suited to multiscale data, usually requiring supervised strategies such as human intervention, extensive tuning, or selection of ideal time periods. We present the multiresolution coherent spatio-temporal scale separation (mrCOSTS), a hierarchical and automated algorithm for the diagnosis of coherent patterns or modes in multiscale data. mrCOSTS is a variant of dynamic mode decomposition which decomposes data into bands of spatial patterns with shared time dynamics, thereby providing a robust method for analyzing multiscale data. It requires no training but instead takes advantage of the hierarchical nature of multiscale systems. We demonstrate mrCOSTS using complex multiscale datasets that are canonically difficult to analyze: 1) climate patterns of sea surface temperature, 2) electrophysiological observations of neural signals of the motor cortex, and 3) horizontal wind in the mountain boundary layer. With mrCOSTS, we trivially retrieve complex dynamics that were previously difficult to resolve while additionally extracting hitherto unknown patterns of activity embedded in the dynamics, advancing the understanding of these fields of study. This method is an important advancement for addressing the multiscale data that characterize many of the grand challenges in science and engineering.
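The abstract positions mrCOSTS as a variant of dynamic mode decomposition (DMD), which factors a space-by-time snapshot matrix into spatial modes with shared temporal dynamics. As background, here is a minimal sketch of exact DMD via rank-truncated SVD; this illustrates the base technique only, not the mrCOSTS algorithm, and the function name and interface are illustrative:

```python
import numpy as np

def dmd(X, rank):
    """Exact DMD sketch: X is a (space x time) snapshot matrix.

    Fits a linear operator A with X[:, k+1] ~= A @ X[:, k] and returns
    its eigenvalues (temporal dynamics) and spatial modes.
    """
    X1, X2 = X[:, :-1], X[:, 1:]                    # time-shifted snapshot pairs
    U, s, Vh = np.linalg.svd(X1, full_matrices=False)
    U, s, Vh = U[:, :rank], s[:rank], Vh[:rank]     # rank truncation
    # Project A onto the leading left singular vectors: Atilde = U* X2 V S^-1
    Atilde = U.conj().T @ X2 @ Vh.conj().T / s
    eigvals, W = np.linalg.eig(Atilde)
    # Exact DMD modes: Phi = X2 V S^-1 W
    modes = (X2 @ Vh.conj().T / s) @ W
    return eigvals, modes
```

Each eigenvalue encodes a mode's growth/decay and oscillation frequency; mrCOSTS applies this kind of decomposition hierarchically across sliding windows of different lengths to separate scale bands.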
