
On Transformative Adaptive Activation Functions in Neural Networks for Gene Expression Inference

Overview
Journal PLoS One
Date 2021 Jan 14
PMID 33444316
Citations 4
Abstract

Gene expression profiling was made more cost-effective by the NIH LINCS program, which profiles only ∼1,000 selected landmark genes and uses them to reconstruct the whole profile. The D-GEX method employs neural networks to infer the entire profile; however, the original D-GEX can be significantly improved. We propose a novel transformative adaptive activation function that improves gene expression inference further and generalizes several existing adaptive activation functions. Our improved neural network achieves an average mean absolute error of 0.1340, a significant improvement over our reimplementation of the original D-GEX, which achieves an average mean absolute error of 0.1637. The proposed transformative adaptive activation function enables a significantly more accurate reconstruction of the full gene expression profiles with only a small increase in the complexity of the model and its training procedure compared to other methods.
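The abstract states that the transformative adaptive activation function generalizes several existing adaptive activations (e.g., those with a trainable amplitude or slope, as in the cited works by Goh & Mandic and Jagtap et al.). A minimal sketch of that idea, assuming the common four-parameter form where the input and output of an inner activation g are both scaled and shifted by per-neuron trainable parameters (the names and defaults here are illustrative, not taken from the paper):

```python
import numpy as np

def taaf(x, g=np.tanh, alpha=1.0, beta=1.0, gamma=0.0, delta=0.0):
    """Transformative adaptive activation sketch.

    Applies alpha * g(beta * x + gamma) + delta, i.e., an affine
    transformation of both the input and the output of an inner
    activation g. In a network, alpha, beta, gamma, and delta would be
    trained alongside the weights; fixing alpha = beta = 1 and
    gamma = delta = 0 recovers the plain activation g, and freezing
    subsets of the parameters recovers simpler adaptive activations
    such as a trainable-amplitude or trainable-slope unit.
    """
    return alpha * g(beta * x + gamma) + delta
```

With all parameters at their defaults this reduces to the inner activation itself, which is what makes the form a generalization rather than a replacement.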

Citing Articles

Gene expression prediction based on neighbour connection neural network utilizing gene interaction graphs.

Li X, Zhang X, He W, Bu D, Zhang S. PLoS One. 2023; 18(2):e0281286.

PMID: 36745614 PMC: 9901809. DOI: 10.1371/journal.pone.0281286.


Adaptive cognition implemented with a context-aware and flexible neuron for next-generation artificial intelligence.

Jadaun P, Cui C, Liu S, Incorvia J. PNAS Nexus. 2023; 1(5):pgac206.

PMID: 36712357 PMC: 9802372. DOI: 10.1093/pnasnexus/pgac206.


On tower and checkerboard neural network architectures for gene expression inference.

Kunc V, Klema J. BMC Genomics. 2020; 21(Suppl 5):454.

PMID: 33327945 PMC: 7739475. DOI: 10.1186/s12864-020-06821-6.


Locally adaptive activation functions with slope recovery for deep and physics-informed neural networks.

Jagtap A, Kawaguchi K, Karniadakis G. Proc Math Phys Eng Sci. 2020; 476(2239):20200334.

PMID: 32831616 PMC: 7426042. DOI: 10.1098/rspa.2020.0334.
