
Approximation Properties of Gaussian-binary Restricted Boltzmann Machines and Gaussian-binary Deep Belief Networks

Overview
Journal Neural Netw
Specialties Biology
Neurology
Date 2022 Jun 14
PMID 35700559
Abstract

Despite the successful use of Gaussian-binary restricted Boltzmann machines (GB-RBMs) and Gaussian-binary deep belief networks (GB-DBNs), little is known about their theoretical capability to approximate distributions of continuous random variables. In this paper, we address the expressive properties of GB-RBMs and GB-DBNs, contributing theoretical insights into the optimal number of hidden variables. We first treat the GB-RBM's unnormalized log-likelihood as the sum of a special two-layer feedforward neural network and a negative quadratic term. Then, a series of simulation results is established, which can be used to relate GB-RBMs to general two-layer feedforward neural networks, whose expressive properties are much better understood. On this basis, we show that a two-layer ReLU network with all second-layer weights equal to 1, together with a negative quadratic term, can approximate any continuous function. In addition, we provide qualified lower bounds on the number of hidden variables a GB-RBM requires to approximate distributions whose log-likelihoods are given by certain classes of smooth functions. Moreover, we study the universal approximation of GB-DBNs with two hidden layers by giving a number of hidden variables, O(ɛ), sufficient to approximate any given strictly positive continuous distribution to within a given error ɛ. Finally, numerical experiments are carried out to verify some of the proposed theoretical results.
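The decomposition the abstract starts from can be made concrete: for a standard GB-RBM energy with Gaussian visible units of shared variance σ² and binary hidden units, the unnormalized log-likelihood log p*(v) splits into a negative quadratic term plus a two-layer softplus network whose second-layer weights are all 1. The following is a minimal NumPy sketch of that identity; the energy parameterization is the common textbook form, and the sizes and random parameters are purely illustrative, not taken from the paper:

```python
import numpy as np
from itertools import product

rng = np.random.default_rng(0)

# Illustrative small GB-RBM: n Gaussian visible units, m binary hidden units.
n, m = 3, 4
W = rng.normal(size=(n, m))   # visible-hidden weights
b = rng.normal(size=n)        # visible biases
c = rng.normal(size=m)        # hidden biases
sigma = 1.0                   # shared visible standard deviation

def energy(v, h):
    # Common GB-RBM energy: E(v, h) = ||v - b||^2 / (2 sigma^2) - c.h - (v / sigma^2).W h
    return (np.sum((v - b) ** 2) / (2 * sigma ** 2)
            - c @ h
            - (v / sigma ** 2) @ W @ h)

def log_p_star_direct(v):
    # Unnormalized log-likelihood by brute-force marginalization over all 2^m hidden states.
    return np.log(sum(np.exp(-energy(v, h))
                      for h in product([0, 1], repeat=m)))

def log_p_star_network(v):
    # Same quantity written as a negative quadratic plus a softplus network
    # whose second-layer weights are all 1.
    quad = -np.sum((v - b) ** 2) / (2 * sigma ** 2)
    pre = c + (v / sigma ** 2) @ W              # hidden pre-activations
    return quad + np.sum(np.logaddexp(0.0, pre))  # softplus(pre), summed with unit weights

v = rng.normal(size=n)
assert np.isclose(log_p_star_direct(v), log_p_star_network(v))
```

The equality holds because the hidden units factorize: summing exp((c_j + (Wᵀv)_j/σ²) h_j) over h_j ∈ {0, 1} gives 1 + exp(·) per unit, whose logarithm is exactly the softplus term.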