Neural State Space Alignment for Magnitude Generalization in Humans and Recurrent Networks
Overview
A prerequisite for intelligent behavior is to understand how stimuli are related and to generalize this knowledge across contexts. Generalization can be challenging when relational patterns are shared across contexts but exist on different physical scales. Here, we studied neural representations in humans and recurrent neural networks performing a magnitude comparison task, for which it was advantageous to generalize concepts of "more" or "less" between contexts. Using multivariate analysis of human brain signals and of neural network hidden unit activity, we observed that both systems developed parallel neural "number lines" for each context. In both model systems, these number state spaces were aligned in a way that explicitly facilitated generalization of relational concepts (more and less). These findings suggest a previously overlooked role for neural normalization in supporting transfer of a simple form of abstract relational knowledge (magnitude) in humans and machine learning systems.
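To make the analysis idea concrete, below is a minimal, illustrative sketch (not the authors' actual pipeline). It builds synthetic "hidden unit" activity in which magnitude is encoded on a normalized internal number line within each of two contexts that differ in physical scale, then (1) projects both contexts into a shared state space via PCA and (2) tests whether a linear "more vs. less" readout fit in one context generalizes to the other. All function names, variable names, and the synthetic data generator are assumptions made for illustration only.

```python
import numpy as np

rng = np.random.default_rng(0)
n_units, n_levels = 64, 8                      # hidden units, magnitude levels per context

# A shared "number line" axis plus a context-offset axis in hidden-unit space.
line_axis = rng.normal(size=n_units); line_axis /= np.linalg.norm(line_axis)
ctx_axis = rng.normal(size=n_units);  ctx_axis /= np.linalg.norm(ctx_axis)

def context_states(stim_scale, ctx_sign, n_trials=200):
    """Synthetic hidden states: physical magnitude is re-scaled onto a common
    internal range regardless of stimulus scale, mimicking the kind of
    normalization/alignment described in the abstract (toy assumption)."""
    levels = rng.integers(0, n_levels, size=n_trials)         # magnitude rank
    internal = (levels - levels.mean()) / n_levels            # normalized internal code
    states = (internal[:, None] * line_axis[None, :]
              + ctx_sign * ctx_axis[None, :]
              + 0.05 * rng.normal(size=(n_trials, n_units)))  # observation noise
    return states, levels

states_a, levels_a = context_states(stim_scale=1.0, ctx_sign=+1.0)
states_b, levels_b = context_states(stim_scale=10.0, ctx_sign=-1.0)

# (1) Shared low-dimensional state space: PCA (via SVD) on the pooled activity.
pooled = np.vstack([states_a, states_b])
mean = pooled.mean(axis=0)
_, _, vt = np.linalg.svd(pooled - mean, full_matrices=False)
pcs = vt[:2]                                   # top two principal axes
proj_a = (states_a - mean) @ pcs.T
proj_b = (states_b - mean) @ pcs.T
# Parallel number lines: magnitude should track the same axis in both contexts.
print("corr(PC1, magnitude) ctx A:", round(float(np.corrcoef(proj_a[:, 0], levels_a)[0, 1]), 2))
print("corr(PC1, magnitude) ctx B:", round(float(np.corrcoef(proj_b[:, 0], levels_b)[0, 1]), 2))

# (2) Cross-context generalization: fit a least-squares "more vs. less" readout
# (above vs. below the context median magnitude) in context A, test it in context B.
ya = np.sign(levels_a - np.median(levels_a)); keep_a = ya != 0
yb = np.sign(levels_b - np.median(levels_b)); keep_b = yb != 0
Xa = np.hstack([states_a[keep_a], np.ones((int(keep_a.sum()), 1))])   # bias column
w, *_ = np.linalg.lstsq(Xa, ya[keep_a], rcond=None)
Xb = np.hstack([states_b[keep_b], np.ones((int(keep_b.sum()), 1))])
acc = float(np.mean(np.sign(Xb @ w) == yb[keep_b]))
print("more/less readout trained on A, tested on B:", round(acc, 2))
```

Because the toy generator places both contexts on a common internal scale by construction, the cross-context readout transfers; the same two diagnostics (shared-axis magnitude coding and cross-context decoding) are the kind of evidence one would look for in real human or RNN population data.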