Learning internal representations by error propagation

David E. Rumelhart, Geoffrey E. Hinton, Ronald J. Williams

Citations: 6590
This paper presents a generalization of the perceptron learning procedure for learning the correct sets of connections for arbitrary networks. The rule, called the generalized delta rule, is a simple scheme for implementing a gradient descent method for finding weights that minimize the sum squared error of the system's performance. The major theoretical contribution of the work is the procedure called error propagation, whereby the gradient can be determined by individual units of the network based only on locally available information. The major empirical contribution of the work is to show that the problem of local minima is not serious in this application of gradient descent.

Keywords: learning; networks; perceptrons; adaptive systems; learning machines; back propagation
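The following is a minimal sketch of the generalized delta rule as summarized in the abstract: gradient descent on the sum squared error, with each unit's delta computed from locally available information and propagated backwards. The single hidden layer of sigmoid units, the learning rate, the number of epochs, and the XOR training set are illustrative assumptions, not details taken from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

# XOR: a task a single-layer perceptron cannot solve, but a network with
# hidden units can, which is the kind of problem the paper addresses.
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
T = np.array([[0], [1], [1], [0]], dtype=float)

# Connection weights and biases (small random initial values).
W1 = rng.normal(scale=0.5, size=(2, 2))   # input -> hidden
b1 = np.zeros(2)
W2 = rng.normal(scale=0.5, size=(2, 1))   # hidden -> output
b2 = np.zeros(1)

eta = 0.5  # learning rate (illustrative choice)

for epoch in range(20000):
    # Forward pass: compute unit activations layer by layer.
    h = sigmoid(X @ W1 + b1)          # hidden activations
    y = sigmoid(h @ W2 + b2)          # output activations

    # Backward pass (error propagation): each unit's delta depends only on
    # its own activation and the deltas of the units it feeds.
    delta_out = (y - T) * y * (1 - y)             # output-layer deltas
    delta_hid = (delta_out @ W2.T) * h * (1 - h)  # hidden-layer deltas

    # Generalized delta rule: gradient-descent weight updates.
    W2 -= eta * h.T @ delta_out
    b2 -= eta * delta_out.sum(axis=0)
    W1 -= eta * X.T @ delta_hid
    b1 -= eta * delta_hid.sum(axis=0)

print(np.round(y, 2))  # outputs approach [0, 1, 1, 0] after training
```

With these settings the network typically converges to the XOR mapping, illustrating the abstract's point that local minima are not a serious obstacle in this application of gradient descent.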