
A General and Efficient Method for Incorporating Precise Spike Times in Globally Time-driven Simulations

Overview
Specialty Neurology
Date 2010 Oct 30
PMID 21031031
Citations 29
Abstract

Traditionally, event-driven simulations have been limited to the very restricted class of neuronal models for which the timing of future spikes can be expressed in closed form. Recently, the class of models that is amenable to event-driven simulation has been extended by the development of techniques to accurately calculate firing times for some integrate-and-fire neuron models that do not permit the prediction of future spikes in closed form. The motivation for this development is the general perception that time-driven simulations are imprecise. Here, we demonstrate that a globally time-driven scheme can calculate firing times that cannot be discriminated from those calculated by an event-driven implementation of the same model; moreover, the time-driven scheme incurs lower computational costs. The key insight is that time-driven methods are based on identifying a threshold crossing in the recent past, which can be implemented by a much simpler algorithm than the techniques for predicting future threshold crossings that are necessary for event-driven approaches. As run time is dominated by the cost of the operations performed at each incoming spike, which include spike prediction in the case of event-driven simulation and retrospective detection in the case of time-driven simulation, the simple time-driven algorithm outperforms the event-driven approaches. Additionally, our method is generally applicable to all commonly used integrate-and-fire neuronal models; we show that a non-linear model employing a standard adaptive solver can reproduce a reference spike train with a high degree of precision.
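The retrospective-detection idea described in the abstract can be illustrated with a minimal sketch. The following Python code is not the paper's implementation; all parameter values, names, and the use of linear interpolation are illustrative assumptions. It integrates a leaky integrate-and-fire neuron on a fixed time grid and, whenever the membrane potential has crossed threshold during the last step, locates the crossing time in the recent past by interpolation, rather than predicting future crossings ahead of time:

```python
import math

# Illustrative parameters (not from the paper).
TAU_M = 10.0    # membrane time constant (ms)
THETA = 15.0    # spike threshold (mV)
V_RESET = 0.0   # reset potential (mV)
H = 0.1         # fixed grid step (ms)

def propagate(v, i_ext, h=H, tau=TAU_M):
    """Exact update of dV/dt = (-V + i_ext) / tau over one step of length h."""
    p = math.exp(-h / tau)
    return v * p + i_ext * (1.0 - p)

def step(v, i_ext, t):
    """Advance one grid step. If theta was crossed during [t, t+H], locate
    the crossing retrospectively by linear interpolation inside the step."""
    v_new = propagate(v, i_ext)
    if v_new >= THETA:
        # Fraction of the step at which V reached theta (linear estimate).
        frac = (THETA - v) / (v_new - v)
        t_spike = t + frac * H
        # Sketch simplification: reset takes effect at the next grid point.
        return V_RESET, t_spike
    return v_new, None

def simulate(i_ext, t_end):
    v, t, spikes = 0.0, 0.0, []
    while t < t_end:
        v, t_spike = step(v, i_ext, t)
        if t_spike is not None:
            spikes.append(t_spike)
        t += H
    return spikes

spikes = simulate(i_ext=20.0, t_end=50.0)
```

With constant suprathreshold drive, the analytic first-passage time for this toy model is 10 ln 4 ≈ 13.86 ms, and the interpolated spike time agrees to well within a grid step. A production scheme could replace the linear interpolation with the exact inverse of the propagator or a bisection search; the structural point is that detection only ever looks backwards into the last interval.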

Citing Articles

Learning heterogeneous delays in a layer of spiking neurons for fast motion detection.

Grimaldi A, Perrinet L. Biol Cybern. 2023; 117(4-5):373-387.

PMID: 37695359 DOI: 10.1007/s00422-023-00975-8.


Editorial: Neuroscience, computing, performance, and benchmarks: Why it matters to neuroscience how fast we can compute.

Aimone J, Awile O, Diesmann M, Knight J, Nowotny T, Schurmann F. Front Neuroinform. 2023; 17:1157418.

PMID: 37064716 PMC: 10098318. DOI: 10.3389/fninf.2023.1157418.


Simulations of working memory spiking networks driven by short-term plasticity.

Tiddia G, Golosio B, Fanti V, Paolucci P. Front Integr Neurosci. 2022; 16:972055.

PMID: 36262372 PMC: 9574057. DOI: 10.3389/fnint.2022.972055.


Routing Brain Traffic Through the Von Neumann Bottleneck: Parallel Sorting and Refactoring.

Pronold J, Jordan J, Wylie B, Kitayama I, Diesmann M, Kunkel S. Front Neuroinform. 2022; 15:785068.

PMID: 35300490 PMC: 8921864. DOI: 10.3389/fninf.2021.785068.


Dynamical Characteristics of Recurrent Neuronal Networks Are Robust Against Low Synaptic Weight Resolution.

Dasbach S, Tetzlaff T, Diesmann M, Senk J. Front Neurosci. 2022; 15:757790.

PMID: 35002599 PMC: 8740282. DOI: 10.3389/fnins.2021.757790.


References
1.
Naud R, Marcille N, Clopath C, Gerstner W. Firing patterns in the adaptive exponential integrate-and-fire model. Biol Cybern. 2008; 99(4-5):335-47. PMC: 2798047. DOI: 10.1007/s00422-008-0264-7.

2.
Plesser H, Diesmann M. Simplicity and efficiency of integrate-and-fire neuron models. Neural Comput. 2009; 21(2):353-9. DOI: 10.1162/neco.2008.03-08-731.

3.
van Elburg R, van Ooyen A. Generalization of the event-based Carnevale-Hines integration scheme for integrate-and-fire models. Neural Comput. 2009; 21(7):1913-30. DOI: 10.1162/neco.2009.07-08-815.

4.
Eppler J, Helias M, Muller E, Diesmann M, Gewaltig M. PyNEST: A Convenient Interface to the NEST Simulator. Front Neuroinform. 2009; 2:12. PMC: 2636900. DOI: 10.3389/neuro.11.012.2008.

5.
Kuhn A, Aertsen A, Rotter S. Neuronal integration of synaptic input in the fluctuation-driven regime. J Neurosci. 2004; 24(10):2345-56. PMC: 6729484. DOI: 10.1523/JNEUROSCI.3349-03.2004.