The Boltzmann Machine: A Survey and Generalization
Abstract:
A tutorial is presented describing a general machine learning theory that spawns a class of energy-minimizing machines useful in model identification, optimization, and associative memory. Special realizations of the theory include the Boltzmann machine and the Hopfield neural network. The theory is reinforced by appendices addressing particular facets of the machine, ranging from gradient descent to simulated annealing. The treatment is systematic, beginning with a description of the energy function. A defining relationship is established between the energy function and the optimal solution. Next, both classical and new learning algorithms are presented that direct the adaptation of the free parameters to numerically minimize this function and yield the optimal solution. Finally, both computational burden and performance are assessed for several small-scale applications to date.

Keywords: Neural networks, Boltzmann machine, Gibbs machine, Energy-minimizing neural networks, Simulated annealing
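As a minimal illustration of the energy-minimizing dynamics the abstract refers to, the sketch below runs asynchronous Hopfield updates on a small network and checks that the energy E = -(1/2) sᵀWs never increases. The weights here are randomly generated placeholders, not parameters from the survey; the sketch only demonstrates the descent property shared by the Hopfield network and the zero-temperature limit of the Boltzmann machine.

```python
import numpy as np

def energy(W, s):
    """Hopfield energy E = -1/2 s^T W s (thresholds taken as zero)."""
    return -0.5 * s @ W @ s

# Illustrative weights: symmetric with zero diagonal, as the Hopfield
# model requires for guaranteed energy descent.
rng = np.random.default_rng(0)
A = rng.standard_normal((6, 6))
W = (A + A.T) / 2.0
np.fill_diagonal(W, 0.0)

s = rng.choice([-1.0, 1.0], size=6)   # random bipolar start state
energies = [energy(W, s)]
for _ in range(20):                    # sweeps of asynchronous updates
    for i in range(len(s)):
        # Align unit i with its local field; W_ii = 0, so s_i itself
        # does not contribute to the field.
        s[i] = 1.0 if W[i] @ s >= 0.0 else -1.0
        energies.append(energy(W, s))

# With symmetric, zero-diagonal weights each asynchronous flip can only
# lower (or preserve) the energy, so the trajectory is non-increasing.
assert all(e2 <= e1 + 1e-9 for e1, e2 in zip(energies, energies[1:]))
```

Replacing the deterministic sign rule with a temperature-dependent stochastic acceptance rule, annealed toward zero temperature, turns this descent into the simulated-annealing search the survey's appendices discuss.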