Distributed dynamics in neural networks

Andreas V. M. Herz and Charles M. Marcus
Phys. Rev. E 47, 2155 – Published 1 March 1993

Abstract

We analyze the dynamics and statistical mechanics of attractor neural networks with "distributed" updating rules in which groups of one or more neurons are updated simultaneously. Such partially parallel updating schemes are a central feature of neural-network architectures that use many processors, implemented either on special multiprocessor hardware, or among many computers linked over a network. Several updating rules are classified and discussed; these rules generalize the parallel dynamics of the Little model and the one-at-a-time dynamics of the Hopfield model. Analytic results presented herein include a stability criterion that specifies sufficient conditions under which distributed dynamics lead to fixed-point attractors. For binary neurons with block-sequential updating and a Hebbian learning rule, the storage capacity is found as a function of the number of update groups. Several open problems are also discussed.
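The block-sequential dynamics described above can be illustrated with a minimal sketch (not taken from the paper; all function names are hypothetical): binary ±1 neurons, weights set by the standard Hebbian rule, and the neurons partitioned into update groups that are visited one after another, with all neurons inside a group updated in parallel. One group recovers the fully parallel Little dynamics; as many groups as neurons recovers the one-at-a-time Hopfield dynamics.

```python
import numpy as np

def hebbian_weights(patterns):
    """Standard Hebbian rule: W = (1/N) * sum_mu xi^mu (xi^mu)^T, zero diagonal.

    `patterns` is a (P, N) array of +/-1 entries, one stored pattern per row.
    """
    n = patterns.shape[1]
    w = patterns.T @ patterns / n
    np.fill_diagonal(w, 0.0)  # no self-coupling
    return w

def block_sequential_update(state, w, blocks):
    """One sweep of block-sequential ("distributed") dynamics.

    `blocks` partitions the neuron indices into update groups. Each group is
    updated simultaneously from the current state; groups are visited in order.
    With one block this is the Little model; with N singleton blocks it is the
    asynchronous Hopfield model.
    """
    s = state.copy()
    for block in blocks:
        h = w[block] @ s                     # local fields for this group
        s[block] = np.where(h >= 0, 1.0, -1.0)  # parallel threshold update
    return s
```

As a usage sketch: store a single random pattern, flip one bit, and a single block-sequential sweep (two groups of four neurons, say) drives the state back to the stored pattern, which is a fixed point of the dynamics at low memory loading.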

  • Received 5 October 1992

DOI:https://doi.org/10.1103/PhysRevE.47.2155

©1993 American Physical Society

Authors & Affiliations

Andreas V. M. Herz

  • Physics of Computation Laboratory, Division of Chemistry, California Institute of Technology 139-74, Pasadena, California 91125

Charles M. Marcus

  • Division of Applied Sciences and Department of Physics, Harvard University, Cambridge, Massachusetts 02138

Issue

Vol. 47, Iss. 3 — March 1993
