
A Correspondence Between Normalization Strategies in Artificial and Biological Neural Networks

Overview
Journal: Neural Comput
Publisher: MIT Press
Date: 2021 Sep 2
PMID: 34474484
Citations: 4
Abstract

A fundamental challenge at the interface of machine learning and neuroscience is to uncover computational principles that are shared between artificial and biological neural networks. In deep learning, normalization methods such as batch normalization, weight normalization, and their many variants help to stabilize hidden unit activity and accelerate network training, and these methods have been called one of the most important recent innovations for optimizing deep networks. In the brain, homeostatic plasticity represents a set of mechanisms that also stabilize and normalize network activity to lie within certain ranges, and these mechanisms are critical for maintaining normal brain function. In this article, we discuss parallels between artificial and biological normalization methods at four spatial scales: normalization of a single neuron's activity, normalization of the synaptic weights of a neuron, normalization of a layer of neurons, and normalization of a network of neurons. We argue that both types of methods are functionally equivalent; that is, both push activation patterns of hidden units toward a homeostatic state in which all neurons are used equally, and we argue that such representations can improve coding capacity, discrimination, and regularization. As a proof of concept, we develop an algorithm, inspired by a neural normalization technique called synaptic scaling, and show that it performs competitively against existing normalization methods on several data sets. Overall, we hope this bidirectional connection will inspire neuroscientists and machine learning researchers in three ways: to uncover new normalization algorithms based on established neurobiological principles; to help quantify the trade-offs of different homeostatic plasticity mechanisms used in the brain; and to offer insights into how stability may not hinder, but may actually promote, plasticity.
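The abstract only names the synaptic-scaling-inspired algorithm without specifying it. As a loose illustration of the underlying idea, the following Python/NumPy sketch shows what a synaptic-scaling-style update could look like, assuming a multiplicative per-unit rescaling of incoming weights toward a shared target activity; the function name, constants, and update rule are illustrative assumptions, not the authors' method.

```python
import numpy as np

def synaptic_scaling_step(W, activations, target_rate, eta=0.1):
    """One multiplicative synaptic-scaling update (illustrative sketch,
    not the paper's algorithm). Each hidden unit rescales all of its
    incoming weights by a common factor so that its average activation
    drifts toward a shared target, mimicking how homeostatic synaptic
    scaling stabilizes firing rates."""
    avg = activations.mean(axis=0)                          # per-unit mean activity
    scale = 1.0 + eta * (target_rate - avg) / (avg + 1e-8)
    scale = np.clip(scale, 0.5, 2.0)                        # keep updates stable
    return W * scale[np.newaxis, :]                         # rescale each unit's column

# Toy usage: repeatedly scale a layer's weights toward the set point.
rng = np.random.default_rng(0)
W = rng.random((8, 4))            # nonnegative init keeps ReLU units active
x = rng.random((32, 8))           # a batch of inputs
for _ in range(100):
    h = np.maximum(x @ W, 0.0)    # ReLU activations
    W = synaptic_scaling_step(W, h, target_rate=0.5)
print(np.maximum(x @ W, 0.0).mean(axis=0))  # each unit's mean activity is ~0.5
```

Unlike batch normalization, which standardizes activations within each batch, this multiplicative rule adjusts each unit's weights in proportion to how far its activity sits from the set point, which is the qualitative signature of homeostatic synaptic scaling and of the "equally used" homeostatic state described above.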

Citing Articles

Machine-Learning-Based Identification of Key Feature RNA-Signature Linked to Diagnosis of Hepatocellular Carcinoma.

Matboli M, Diab G, Saad M, Khaled A, Roushdy M, Ali M. J Clin Exp Hepatol. 2024; 14(6):101456.

PMID: 39055616; PMC: 11268357; DOI: 10.1016/j.jceh.2024.101456.


Building transformers from neurons and astrocytes.

Kozachkov L, Kastanenka K, Krotov D. Proc Natl Acad Sci U S A. 2023; 120(34):e2219150120.

PMID: 37579149; PMC: 10450673; DOI: 10.1073/pnas.2219150120.


Distinctive properties of biological neural networks and recent advances in bottom-up approaches toward a better biologically plausible neural network.

Jeon I, Kim T. Front Comput Neurosci. 2023; 17:1092185.

PMID: 37449083; PMC: 10336230; DOI: 10.3389/fncom.2023.1092185.


Rethinking the Role of Normalization and Residual Blocks for Spiking Neural Networks.

Ikegawa S, Saiin R, Sawada Y, Natori N. Sensors (Basel). 2022; 22(8).

PMID: 35458860; PMC: 9028401; DOI: 10.3390/s22082876.
