
Next Generation Reservoir Computing

Overview
Journal: Nat Commun
Specialty: Biology
Date: 2021 Sep 22
PMID: 34548491
Citations: 45
Abstract

Reservoir computing is a best-in-class machine learning algorithm for processing information generated by dynamical systems using observed time-series data. Importantly, it requires very small training data sets, uses linear optimization, and thus requires minimal computing resources. However, the algorithm uses randomly sampled matrices to define the underlying recurrent neural network and has a multitude of metaparameters that must be optimized. Recent results demonstrate the equivalence of reservoir computing to nonlinear vector autoregression, which requires no random matrices, fewer metaparameters, and provides interpretable results. Here, we demonstrate that nonlinear vector autoregression excels at reservoir computing benchmark tasks and requires even shorter training data sets and training time, heralding the next generation of reservoir computing.
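
A minimal sketch of the nonlinear vector autoregression idea summarized above: build a feature vector from a short window of time-delayed observations plus their unique quadratic products, fit the output weights with a single ridge-regression solve, then forecast by feeding predictions back in. The two-delay/quadratic feature choice, the Lorenz toy data, the parameter values, and all function names below are illustrative assumptions, not the authors' released code.

    # Minimal sketch of a nonlinear vector autoregression (NVAR) forecaster.
    # Assumptions (not from the paper's code): two delay taps, quadratic
    # monomial features, ridge regression, and crude Euler-integrated Lorenz
    # data as a stand-in time series.
    import numpy as np

    def lorenz_trajectory(n_steps, dt=0.01, s=10.0, r=28.0, b=8.0 / 3.0):
        # Crude Euler integration of the Lorenz system, for illustration only.
        x = np.empty((n_steps, 3))
        x[0] = [1.0, 1.0, 1.0]
        for t in range(n_steps - 1):
            dx = np.array([s * (x[t, 1] - x[t, 0]),
                           x[t, 0] * (r - x[t, 2]) - x[t, 1],
                           x[t, 0] * x[t, 1] - b * x[t, 2]])
            x[t + 1] = x[t] + dt * dx
        return x

    def nvar_features(x, k=2):
        # Feature vector: constant term, k delayed copies of the state,
        # and all unique pairwise products of those delayed values.
        n = x.shape[0]
        lin = np.hstack([x[k - 1 - j : n - j] for j in range(k)])
        quad = np.column_stack([lin[:, i] * lin[:, j]
                                for i in range(lin.shape[1])
                                for j in range(i, lin.shape[1])])
        return np.hstack([np.ones((lin.shape[0], 1)), lin, quad])

    k, ridge = 2, 1e-6
    data = lorenz_trajectory(6000)
    train = data[:4000]
    phi = nvar_features(train, k)[:-1]            # features at time t
    target = train[k:] - train[k - 1:-1]          # one-step increment x(t+1) - x(t)
    # Ridge regression for the output weights -- the only training step.
    W_out = np.linalg.solve(phi.T @ phi + ridge * np.eye(phi.shape[1]),
                            phi.T @ target).T

    # Autonomous forecast: feed each prediction back in as the newest input.
    window = list(train[-k:])
    for _ in range(25):
        feat = nvar_features(np.array(window[-k:]), k)[-1]
        window.append(window[-1] + W_out @ feat)
    forecast = np.array(window[k:])

Because the only fitted quantities are the output weights, training reduces to one linear solve over the feature matrix, which is where the small-data and low-compute claims in the abstract come from.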

Citing Articles

Toward a physics-guided machine learning approach for predicting chaotic systems dynamics.
Feng L, Liu Y, Shi B, Liu J. Front Big Data. 2025; 7:1506443.
PMID: 39897066; PMC: 11782262; DOI: 10.3389/fdata.2024.1506443.

Efficient optimisation of physical reservoir computers using only a delayed input.
Picco E, Jaurigue L, Ludge K, Massar S. Commun Eng. 2025; 4(1):3.
PMID: 39827312; PMC: 11742992; DOI: 10.1038/s44172-025-00340-6.

Novel efficient reservoir computing methodologies for regular and irregular time series classification.
Li Z, Andreev A, Hramov A, Blyuss O, Zaikin A. Nonlinear Dyn. 2025; 113(5):4045-4062.
PMID: 39822383; PMC: 11732944; DOI: 10.1007/s11071-024-10244-3.

Principled neuromorphic reservoir computing.
Kleyko D, Kymn C, Thomas A, Olshausen B, Sommer F, Frady E. Nat Commun. 2025; 16(1):640.
PMID: 39809739; PMC: 11733134; DOI: 10.1038/s41467-025-55832-y.

Ultrafast silicon photonic reservoir computing engine delivering over 200 TOPS.
Wang D, Nie Y, Hu G, Tsang H, Huang C. Nat Commun. 2024; 15(1):10841.
PMID: 39738199; PMC: 11686264; DOI: 10.1038/s41467-024-55172-3.

