Generative Feedback Explains Distinct Brain Activity Codes for Seen and Mental Images
Overview
The relationship between mental imagery and vision is a long-standing problem in neuroscience. Currently, it is not known whether differences between the activity evoked during vision and reinstated during imagery reflect different codes for seen and mental images. To address this problem, we modeled mental imagery in the human brain as feedback in a hierarchical generative network. Such networks synthesize images by feeding abstract representations from higher to lower levels of the network hierarchy. When higher processing levels are less sensitive to stimulus variation than lower processing levels, as in the human brain, activity in low-level visual areas should encode variation in mental images with less precision than seen images. To test this prediction, we conducted an fMRI experiment in which subjects imagined and then viewed hundreds of spatially varying naturalistic stimuli. To analyze these data, we developed imagery-encoding models. These models accurately predicted brain responses to imagined stimuli and enabled accurate decoding of their position and content. They also allowed us to compare, for every voxel, tuning to seen and imagined spatial frequencies, as well as the location and size of receptive fields in visual and imagined space. We confirmed our prediction, showing that, in low-level visual areas, imagined spatial frequencies in individual voxels are reduced relative to seen spatial frequencies and that receptive fields in imagined space are larger than in visual space. These findings reveal distinct codes for seen and mental images and link mental imagery to the computational abilities of generative networks.
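To make the encoding-model logic concrete, below is a minimal, self-contained Python sketch. It is not the authors' imagery-encoding model (theirs was fit to fMRI responses to naturalistic stimuli); it only assumes a toy Gaussian population receptive field (pRF), simulated stimuli and responses, and a grid-search fit, to illustrate how a per-voxel comparison could detect that receptive fields in imagined space are larger than in visual space. All names, grid sizes, and parameter values are illustrative.

```python
# Toy sketch of per-voxel pRF fitting for seen vs. imagined conditions.
# Assumptions (not from the paper): Gaussian pRF, energy readout,
# simulated noise stimuli, grid-search fit on prediction accuracy.
import numpy as np

rng = np.random.default_rng(0)
GRID = 64  # pixels per side of the stimulus aperture
ys, xs = np.mgrid[0:GRID, 0:GRID]

def gaussian_prf(x0, y0, sigma):
    """2D Gaussian population receptive field over the pixel grid."""
    rf = np.exp(-((xs - x0) ** 2 + (ys - y0) ** 2) / (2.0 * sigma ** 2))
    return rf / rf.sum()

def predict_response(stimulus, rf, gain=1.0):
    """Predicted voxel response: gain times the RF-weighted stimulus energy."""
    return gain * np.sum(rf * stimulus ** 2)

# Simulate one voxel whose receptive field is small when driven by seen
# images (sigma = 4 px) and larger when driven by imagery (sigma = 8 px),
# mirroring the paper's finding for low-level visual areas.
stimuli = rng.standard_normal((200, GRID, GRID))
seen = np.array([predict_response(s, gaussian_prf(32, 32, 4.0)) for s in stimuli])
imagined = np.array([predict_response(s, gaussian_prf(32, 32, 8.0)) for s in stimuli])

def fit_sigma(stims, responses, candidates=np.linspace(1.0, 16.0, 31)):
    """Grid-search the pRF size whose predictions best match the responses."""
    best_sigma, best_r = None, -np.inf
    for sigma in candidates:
        rf = gaussian_prf(32, 32, sigma)
        pred = np.array([predict_response(s, rf) for s in stims])
        r = np.corrcoef(pred, responses)[0, 1]
        if r > best_r:
            best_sigma, best_r = sigma, r
    return best_sigma

print("fitted pRF size, seen:    ", fit_sigma(stimuli, seen))      # recovers ~4
print("fitted pRF size, imagined:", fit_sigma(stimuli, imagined))  # recovers ~8
```

Fitting the same model form separately to the two conditions, as done here per simulated voxel, is the toy analog of the paper's comparison of receptive field location and size in visual versus imagined space; the larger recovered sigma in the imagery condition corresponds to the coarser spatial code the generative-feedback account predicts.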