
Investigating the Intrinsic Top-down Dynamics of Deep Generative Models

Overview
Journal: Sci Rep
Date: 2025 Jan 22
PMID: 39843473
Abstract

Hierarchical generative models can produce data samples based on the statistical structure of their training distribution. This capability can be linked to current theories in computational neuroscience, which propose that spontaneous brain activity at rest is the manifestation of top-down dynamics of generative models detached from action-perception cycles. A popular class of hierarchical generative models is that of Deep Belief Networks (DBNs), which are energy-based deep learning architectures that can learn multiple levels of representations in a completely unsupervised way by exploiting Hebbian-like learning mechanisms. In this work, we study the generative dynamics of a recent extension of the DBN, the iterative DBN (iDBN), which more faithfully simulates neurocognitive development by jointly tuning the connection weights across all layers of the hierarchy. We characterize the number of states visited during top-down sampling and investigate whether the heterogeneity of visited attractors could be increased by initiating the generation process from biased hidden states. To this end, we train iDBN models on well-known datasets containing handwritten digits and pictures of human faces, and show that the ability to generate diverse data prototypes can be enhanced by initializing top-down sampling from "chimera states", which represent high-level features combining multiple abstract representations of the sensory data. Although the models are not always able to transition between all potential target states within a single generation trajectory, the iDBN shows richer top-down dynamics in comparison to a shallow generative model (a single-layer Restricted Boltzmann Machine). We further show that the generated samples can be used to support continual learning through generative replay mechanisms. Our findings suggest that the top-down dynamics of hierarchical generative models are significantly influenced by the shape of the energy function, which depends both on the depth of the processing architecture and on the statistical structure of the sensory data.
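As a rough illustration of the sampling procedure described in the abstract, the sketch below shows top-down Gibbs sampling in a single binary RBM, seeded from a blended hidden state in the spirit of the "chimera states" discussed above. This is a minimal NumPy sketch, not the authors' code: the single-RBM setting, the names top_down_sample and chimera_state, and all sizes and hyperparameters are assumptions of this example, and training (e.g., contrastive divergence) is omitted entirely.

import numpy as np

rng = np.random.default_rng(0)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def sample_bernoulli(p):
    # Draw binary unit states from their activation probabilities.
    return (rng.random(p.shape) < p).astype(float)

class RBM:
    # Minimal binary Restricted Boltzmann Machine. The weights here are
    # random placeholders; in practice they would come from unsupervised
    # training, which this sketch leaves out.
    def __init__(self, n_visible, n_hidden):
        self.W = rng.normal(0.0, 0.01, size=(n_visible, n_hidden))
        self.b_v = np.zeros(n_visible)  # visible biases
        self.b_h = np.zeros(n_hidden)   # hidden biases

    def p_h_given_v(self, v):
        return sigmoid(v @ self.W + self.b_h)

    def p_v_given_h(self, h):
        return sigmoid(h @ self.W.T + self.b_v)

def top_down_sample(rbm, h_init, n_steps=100):
    # Alternating Gibbs sampling seeded from a hidden-state vector; the
    # sequence of visible samples is the top-down generation trajectory.
    h = h_init
    trajectory = []
    for _ in range(n_steps):
        v = sample_bernoulli(rbm.p_v_given_h(h))
        h = sample_bernoulli(rbm.p_h_given_v(v))
        trajectory.append(v)
    return trajectory

def chimera_state(h_a, h_b, alpha=0.5):
    # Blend the hidden representations of two different inputs to seed
    # generation between attractor basins ("chimera" initialization).
    return alpha * h_a + (1.0 - alpha) * h_b

In the paper's terms, seeding top_down_sample with chimera_state(h_a, h_b), where h_a and h_b are hidden activations inferred from two different data items, plays the role of the biased initialization that increases the heterogeneity of visited attractors; in the full iDBN the same idea would apply at the deepest hidden layer of the hierarchy rather than in a single RBM.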
