Abstract
Networks with binary weights are important from both the theoretical and practical points of view. In this paper, we investigate the clipped Hebb rule for learning several networks of non-overlapping binary perceptrons under the uniform distribution. We calculate its learning curves exactly in the limit of a large number of inputs, where the average behaviour becomes the typical behaviour. The calculation requires only simple counting arguments and the central limit theorem. The results show that the clipped Hebb rule does learn this class of networks; in particular, the generalization rates converge extremely rapidly, often exponentially, to perfect generalization. These results are encouraging given the simplicity of the learning rule. The analytic expressions for the learning curves are in excellent agreement with the numerical simulations.
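As a rough illustration of the rule studied here, the following sketch applies the clipped Hebb rule to a single binary perceptron under the uniform ±1 input distribution: Hebbian sums are accumulated over the training set and then clipped to ±1. The dimensions `n` and `m`, the tie-breaking convention, and the single-perceptron setting are illustrative assumptions, not the paper's exact experimental setup.

```python
import numpy as np

rng = np.random.default_rng(0)

n, m = 101, 2000  # assumed sizes: n inputs (odd, so w.x is never 0), m examples

w_true = rng.choice([-1, 1], size=n)      # hidden binary weight vector (the target)

X = rng.choice([-1, 1], size=(m, n))      # uniformly distributed +/-1 inputs
y = np.sign(X @ w_true)                   # labels produced by the target perceptron

# Clipped Hebb rule: accumulate the Hebbian sums h_i = sum_t y^t x_i^t,
# then clip each sum to +/-1 to obtain the binary weight estimate.
h = (y[:, None] * X).sum(axis=0)
w_hat = np.where(h >= 0, 1, -1)           # ties broken toward +1 (assumed convention)

agreement = np.mean(w_hat == w_true)      # fraction of correctly recovered weights
```

For this many examples the estimate typically recovers nearly all of the target weights, consistent with the rapid convergence to perfect generalization described above.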