Document Type

Article

Publication Date

2018

Abstract

Restricted Boltzmann machines are generative neural networks. They summarize their input data to build a probabilistic model that can then be used to reconstruct missing data or to classify new data. Unlike discrete Boltzmann machines, where the data are mapped to the space of integers or bitstrings, continuous Boltzmann machines use floating-point numbers directly and therefore represent the data with higher fidelity. The primary limitation in using Boltzmann machines for big-data problems is the efficiency of the training algorithm. This paper describes an efficient deterministic algorithm for training continuous machines.
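
To illustrate the setting the abstract describes, below is a minimal sketch of a continuous-visible (Gaussian) restricted Boltzmann machine trained with the conventional sampling-based contrastive divergence (CD-1) rule. This is not the paper's deterministic training algorithm; the class name, hyperparameters, and toy data are assumptions chosen only for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

class GaussianBernoulliRBM:
    """Continuous (Gaussian) visible units, binary hidden units, unit visible variance."""

    def __init__(self, n_visible, n_hidden, lr=0.01):
        self.W = rng.normal(0.0, 0.01, size=(n_visible, n_hidden))
        self.b_v = np.zeros(n_visible)   # visible biases
        self.b_h = np.zeros(n_hidden)    # hidden biases
        self.lr = lr

    def hidden_prob(self, v):
        # P(h = 1 | v) for the binary hidden units
        return 1.0 / (1.0 + np.exp(-(v @ self.W + self.b_h)))

    def visible_mean(self, h):
        # Mean of the Gaussian visible units given a hidden configuration
        return h @ self.W.T + self.b_v

    def cd1_step(self, v0):
        """One contrastive-divergence (CD-1) update on a minibatch v0."""
        ph0 = self.hidden_prob(v0)
        h0 = (rng.random(ph0.shape) < ph0).astype(float)   # stochastic hidden sample
        v1 = self.visible_mean(h0)                          # reconstruction (mean field)
        ph1 = self.hidden_prob(v1)
        n = v0.shape[0]
        self.W   += self.lr * (v0.T @ ph0 - v1.T @ ph1) / n
        self.b_v += self.lr * (v0 - v1).mean(axis=0)
        self.b_h += self.lr * (ph0 - ph1).mean(axis=0)
        return np.mean((v0 - v1) ** 2)                      # reconstruction error

# Toy usage: fit 2-D Gaussian data and monitor reconstruction error.
data = rng.normal(size=(256, 2))
rbm = GaussianBernoulliRBM(n_visible=2, n_hidden=8)
for epoch in range(50):
    err = rbm.cd1_step(data)
print("final reconstruction error:", round(err, 4))
```

The stochastic hidden sampling in `cd1_step` is exactly the source of training inefficiency the abstract points to; a deterministic algorithm of the kind the paper describes would replace that sampling step.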

Comments

Publisher version available at https://doi.org/10.1007/s11276-018-01903-6. The archived and publisher versions may differ slightly due to copyediting.
