Learning curves of the clipped Hebb rule for networks with binary weights


Published under licence by IOP Publishing Ltd
Citation: M Golea and M Marchand 1993 J. Phys. A: Math. Gen. 26 5751. DOI: 10.1088/0305-4470/26/21/015


Abstract

Networks with binary weights are very important from both the theoretical and practical points of view. In this paper, we investigate the clipped Hebb rule for learning different networks of non-overlapping binary perceptrons under the uniform input distribution. We calculate its learning curves exactly in the limit of a large number of inputs, where the average behaviour becomes the typical behaviour. The calculation uses only very simple counting arguments and the central limit theorem. The results indicate that the clipped Hebb rule does indeed learn this class of networks. In particular, the generalization rates converge extremely rapidly, often exponentially, to perfect generalization. These results are very encouraging given the simplicity of the learning rule. The analytic expressions for the learning curves are in excellent agreement with the numerical simulations.
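For a single binary perceptron, the clipped Hebb rule admits a very compact sketch: accumulate the Hebbian sums over the training examples and clip each sum to ±1 with the sign function. The snippet below is an illustrative reconstruction under stated assumptions (a randomly drawn binary teacher, uniform ±1 inputs, and the particular sizes chosen here), not code from the paper; the networks of non-overlapping perceptrons analysed in the paper generalize this single-unit case.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative sizes (assumptions, not from the paper); odd values avoid sign ties
N, P = 101, 401  # number of inputs, number of training examples

# Hypothetical teacher perceptron with binary weights in {-1, +1}
w_teacher = rng.choice([-1, 1], size=N)

# Training set: uniformly distributed +-1 inputs, labelled by the teacher
X = rng.choice([-1, 1], size=(P, N))  # shape (P, N)
y = np.sign(X @ w_teacher)            # shape (P,)

# Clipped Hebb rule: w_i = sign( sum_mu y^mu x_i^mu )
w_student = np.sign(y @ X)            # binary student weights in {-1, +1}

# Estimate the generalization rate on fresh uniform inputs
X_test = rng.choice([-1, 1], size=(5000, N))
agree = np.mean(np.sign(X_test @ w_student) == np.sign(X_test @ w_teacher))
print(f"generalization rate ~ {agree:.3f}")
```

With P on the order of a few times N, most clipped sums already point in the teacher's direction, so the agreement on fresh inputs is well above chance; the paper's learning curves describe exactly how this rate approaches 1 as P grows.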

