Deterministic Boltzmann Learning Performs Steepest Descent in Weight-Space

Cited by: 95
Author
Hinton, Geoffrey E. [1 ]
Affiliation
[1] University of Toronto, Department of Computer Science, 10 King's College Rd, Toronto, ON M5S 1A4, Canada
DOI
10.1162/neco.1989.1.1.143
Chinese Library Classification: TP18 [Artificial Intelligence Theory]
Discipline Codes: 081104; 0812; 0835; 1405
Abstract
The Boltzmann machine learning procedure has been successfully applied to deterministic networks of analog units that use a mean-field approximation to efficiently simulate a truly stochastic system (Peterson and Anderson 1987). This type of "deterministic Boltzmann machine" (DBM) learns much faster than the equivalent "stochastic Boltzmann machine" (SBM), but because the learning procedure for DBMs is based only on an analogy with SBMs, there has been no proof that it performs gradient descent in any function, and it has been justified only by simulations. Using the appropriate interpretation of the way in which a DBM represents the probability of an output vector given an input vector, it is shown that the DBM performs steepest descent in the same function as the original SBM, except at rare discontinuities. A very simple way of forcing the weights to become symmetrical is also described, which makes the DBM more biologically plausible than back-propagation (Werbos 1974; Parker 1985; Rumelhart et al. 1986).
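For readers who want the mechanics behind the abstract, the two-phase mean-field learning rule can be sketched in a few lines of Python. This is an illustrative reconstruction under stated assumptions, not the paper's code: the unit ordering, the function names (settle, dbm_update), the learning rate, and the decay constant are all invented for the example, and biases are omitted for brevity. The last line also reflects one reading of the abstract's "very simple way of forcing the weights to become symmetrical": because the contrastive update is itself a symmetric matrix, ordinary weight decay shrinks any asymmetric component of the weights toward zero.

import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def settle(p, W, clamped, n_iters=50):
    # Iterate the deterministic mean-field fixed-point equations
    # p_i <- sigmoid(sum_j w_ij * p_j) for every unclamped unit,
    # holding the clamped units at their given values.
    for _ in range(n_iters):
        p = np.where(clamped, p, sigmoid(W @ p))
    return p

def dbm_update(W, x, y, n_hidden, lr=0.1, decay=0.001):
    # One deterministic-Boltzmann learning step on a single
    # (input x, target y) pair; units are ordered [inputs, outputs, hidden].
    n = x.size + y.size + n_hidden
    # Negative ("free") phase: clamp only the inputs and let the
    # hidden and output units settle to their mean-field values.
    clamp = np.zeros(n, dtype=bool)
    p_free = np.full(n, 0.5)
    p_free[:x.size] = x
    clamp[:x.size] = True
    p_free = settle(p_free, W, clamp)
    # Positive ("clamped") phase: clamp inputs and outputs, settle hidden.
    p_clamped = p_free.copy()
    p_clamped[x.size:x.size + y.size] = y
    clamp[x.size:x.size + y.size] = True
    p_clamped = settle(p_clamped, W, clamp)
    # Contrastive update: difference of co-activation products between
    # the two phases, the deterministic analog of the SBM's statistics.
    dW = lr * (np.outer(p_clamped, p_clamped) - np.outer(p_free, p_free))
    np.fill_diagonal(dW, 0.0)  # no self-connections
    # dW is symmetric, so weight decay makes any asymmetry in W die away.
    return (1.0 - decay) * W + dW

Trained on, say, the XOR pairs with a handful of hidden units, repeated calls W = dbm_update(W, x, y, n_hidden) drive the free-phase output activities toward the targets; the paper's result is that such steps follow the steepest-descent direction of the same objective as the stochastic machine, except at rare discontinuities.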
Pages: 143–150
Page count: 8
References
6 entries in total
  • [1] Hopfield, J. J. (1984). Neurons with graded response have collective computational properties like those of two-state neurons. Proceedings of the National Academy of Sciences of the USA, 81(10), 3088-3092.
  • [2] McClelland, J. L. (1986). Parallel Distributed Processing: Explorations in the Microstructure of Cognition, Vol. 1. Cambridge, MA: MIT Press.
  • [3] Parker, D. B. (1985). Learning-Logic. Technical Report TR-47, MIT Sloan School of Management.
  • [4] Peterson, C., & Anderson, J. R. (1987). A mean field theory learning algorithm for neural networks. Complex Systems, 1, 995.
  • [5] Rumelhart, D. E., Hinton, G. E., & Williams, R. J. (1986). Learning representations by back-propagating errors. Nature, 323(6088), 533-536.
  • [6] Werbos, P. (1974). Beyond Regression: New Tools for Prediction and Analysis in the Behavioral Sciences. Ph.D. thesis, Harvard University.