
Learning Grammatical Structure with Echo State Networks

Overview
Journal Neural Netw
Specialties Biology, Neurology
Date 2007 Jun 9
PMID 17556116
Citations 14
Abstract

Echo State Networks (ESNs) have been shown to be effective for a number of tasks, including motor control, dynamic time series prediction, and memorizing musical sequences. However, their performance on natural language tasks has been largely unexplored until now. Simple Recurrent Networks (SRNs) have a long history in language modeling and are strikingly similar in architecture to ESNs, so a comparison of the two on a natural language task is a natural choice for experimentation. Elman applied SRNs to a standard task in statistical NLP: predicting the next word in a corpus, given the previous words. Using a simple context-free grammar and an SRN trained with backpropagation through time (BPTT), Elman showed that the network learned internal representations sensitive to linguistic processes useful for the prediction task. Here, using ESNs, we show that training such internal representations is unnecessary to achieve performance comparable to SRNs. We also compare the processing capabilities of ESNs to those of bigrams and trigrams. Due to some unexpected regularities in Elman's grammar, these statistical techniques can maintain dependencies over greater distances than one might initially expect. However, we show that the memory of ESNs on this word-prediction task, although noisy, extends significantly beyond that of bigrams and trigrams, enabling ESNs to make good predictions of verb agreement at distances over which these methods operate at chance. Overall, our results indicate a surprising ability of ESNs to learn a grammar, suggesting that they form useful internal representations without learning them.
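The abstract's central claim, that a fixed random reservoir with only a trained linear readout can support next-word prediction, is easy to illustrate. Below is a minimal sketch of an ESN word predictor, assuming a one-hot word encoding, a tanh reservoir rescaled to a chosen spectral radius, and a ridge-regression readout. The toy vocabulary, corpus, and all hyperparameters (reservoir size, spectral radius, regularization) are illustrative assumptions, not values from the paper.

```python
# Minimal ESN next-word predictor: a fixed random reservoir plus a
# trained linear readout. Vocabulary, corpus, and hyperparameters are
# illustrative assumptions, not values from the paper.
from collections import Counter, defaultdict
import numpy as np

rng = np.random.default_rng(0)

vocab = ["boy", "girls", "chases", "see", "who", "."]  # toy vocabulary
V = len(vocab)
N = 300      # reservoir size (assumed)
rho = 0.9    # target spectral radius (assumed)
beta = 1e-4  # ridge regularization (assumed)

# Fixed random weights: only the readout W_out is ever trained.
W_in = rng.uniform(-0.5, 0.5, (N, V))
W = rng.uniform(-0.5, 0.5, (N, N))
W *= rho / max(abs(np.linalg.eigvals(W)))  # rescale for the echo state property

def one_hot(i):
    v = np.zeros(V)
    v[i] = 1.0
    return v

def run_reservoir(word_ids):
    """Drive the reservoir with a word sequence and collect its states."""
    x = np.zeros(N)
    states = []
    for i in word_ids:
        x = np.tanh(W @ x + W_in @ one_hot(i))
        states.append(x.copy())
    return np.array(states)

# Toy training stream; the task is to predict word t+1 from words up to t.
corpus = [0, 2, 1, 5, 1, 3, 0, 5] * 50
X = run_reservoir(corpus[:-1])                  # one reservoir state per input word
Y = np.array([one_hot(i) for i in corpus[1:]])  # next-word targets

# Closed-form ridge-regression readout: W_out = Y^T X (X^T X + beta*I)^{-1}
W_out = Y.T @ X @ np.linalg.inv(X.T @ X + beta * np.eye(N))

scores = X @ W_out.T
print("ESN predicts:", vocab[int(np.argmax(scores[-1]))])

# Bigram baseline for comparison: predict the most frequent successor of
# the single previous word, with no longer-range memory.
bigram = defaultdict(Counter)
for a, b in zip(corpus[:-1], corpus[1:]):
    bigram[a][b] += 1
print("bigram predicts:", vocab[bigram[corpus[-2]].most_common(1)[0][0]])
```

The design choice mirrors the abstract: W_in and W stay fixed after random initialization, so the only learning is the closed-form readout fit, yet the reservoir state carries a fading trace of earlier words. The bigram baseline at the end conditions only on the single previous word, which is exactly the limitation the paper probes with longer-distance agreement dependencies.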

Citing Articles

Energy-Efficient Edge and Cloud Image Classification with Multi-Reservoir Echo State Network and Data Processing Units.

Lopez-Ortiz E, Perea-Trigo M, Soria-Morillo L, Alvarez-Garcia J, Vegas-Olmos J. Sensors (Basel). 2024; 24(11).

PMID: 38894431. PMC: 11175351. DOI: 10.3390/s24113640.


Optimizing echo state networks for continuous gesture recognition in mobile devices: A comparative study.

Yadav A, Pasupa K, Loo C, Liu X. Heliyon. 2024; 10(5):e27108.

PMID: 38562498. PMC: 10982987. DOI: 10.1016/j.heliyon.2024.e27108.


Evolutionary aspects of reservoir computing.

Seoane L. Philos Trans R Soc Lond B Biol Sci. 2019; 374(1774):20180377.

PMID: 31006369. PMC: 6553587. DOI: 10.1098/rstb.2018.0377.


The combination of circle topology and leaky integrator neurons remarkably improves the performance of echo state network on time series prediction.

Xue F, Li Q, Li X. PLoS One. 2017; 12(7):e0181816.

PMID: 28759581. PMC: 5536322. DOI: 10.1371/journal.pone.0181816.


Reservoir Computing Properties of Neural Dynamics in Prefrontal Cortex.

Enel P, Procyk E, Quilodran R, Dominey P. PLoS Comput Biol. 2016; 12(6):e1004967.

PMID: 27286251. PMC: 4902312. DOI: 10.1371/journal.pcbi.1004967.