Acceleration for both Boltzmann machine learning and mean field theory learning

Novel learning algorithms for both the Boltzmann machine (BM) and mean field theory (MFT) are proposed to accelerate learning. The effectiveness of the proposed MFT algorithm is confirmed by computer simulations. It accelerates convergence: for example, the proposed MFT algorithm is more than twice as fast as the conventional MFT algorithm when the learning constant η is 0.1 or less. In addition, the proposed algorithm is shown to be less sensitive to η. MFT is more biologically plausible and more suitable for VLSI implementation than other supervised learning paradigms, and it can also be used as a content-addressable memory.
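For context, the conventional MFT learning rule that the proposed algorithm accelerates pairs a mean-field relaxation, m_i = tanh(b_i + Σ_j w_ij m_j), with the contrastive weight update Δw_ij = η (m_i m_j |clamped − m_i m_j |free), where η is the learning constant mentioned above. The Python sketch below illustrates only this baseline rule under those standard definitions; the function names, the sequential update schedule, and the fixed iteration count are illustrative assumptions and do not reproduce the paper's proposed accelerated variant.

```python
import numpy as np

def mean_field_magnetizations(W, b, clamped=None, n_iter=50):
    """Iterate the mean-field fixed-point equations
    m_i = tanh(b_i + sum_j W_ij m_j).

    `clamped` maps unit indices to fixed +/-1 values (the clamped phase);
    unclamped units relax freely.  Iteration count is an assumption.
    """
    n = W.shape[0]
    m = np.zeros(n)
    if clamped:
        for i, v in clamped.items():
            m[i] = v
    for _ in range(n_iter):
        for i in range(n):
            if clamped and i in clamped:
                continue  # clamped units stay fixed during relaxation
            m[i] = np.tanh(b[i] + W[i] @ m)
    return m

def mft_weight_update(W, b, clamped_pattern, eta=0.1):
    """One conventional MFT learning step (baseline, not the proposed method):
    dW_ij = eta * (m_i m_j |clamped - m_i m_j |free)."""
    m_c = mean_field_magnetizations(W, b, clamped=clamped_pattern)
    m_f = mean_field_magnetizations(W, b, clamped=None)
    dW = eta * (np.outer(m_c, m_c) - np.outer(m_f, m_f))
    np.fill_diagonal(dW, 0.0)  # no self-connections
    return W + dW

# Example usage on a hypothetical 4-unit network with units 0 and 1 clamped.
rng = np.random.default_rng(0)
W = rng.normal(scale=0.1, size=(4, 4))
W = (W + W.T) / 2          # symmetric weights, as in a Boltzmann machine
np.fill_diagonal(W, 0.0)
b = np.zeros(4)
W = mft_weight_update(W, b, clamped_pattern={0: 1.0, 1: -1.0}, eta=0.1)
```

The sensitivity to η reported in the abstract can be seen in such a sketch by sweeping `eta` and comparing the number of updates needed for the weights to stabilize.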