
Pattern-recognition by an artificial network derived from biologic neuronal systems

Published in: Biological Cybernetics

Abstract

A novel artificial neural network, derived from neurobiological observations, is described and examples of its performance are presented. This DYnamically STable Associative Learning (DYSTAL) network associatively learns both correlations and anticorrelations, and can be configured to classify or restore patterns with only a change in the number of output units. DYSTAL exhibits some particularly desirable properties: computational effort scales linearly with the number of connections, i.e., it is O(N) in complexity; performance of the network is stable with respect to network parameters over wide ranges of their values and over the size of the input field; storage of a very large number of patterns is possible; patterns need not be orthogonal; network connections are not restricted to multi-layer feed-forward or any other specific structure; and, for a known set of deterministic input patterns, the network weights can be computed, a priori, in closed form. The network has been associatively trained to perform the XOR function as well as other classification tasks. The network has also been trained to restore patterns obscured by binary or analog noise. Neither global nor local feedback connections are required during learning; hence the network is particularly suitable for hardware (VLSI) implementation.
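The abstract highlights two properties worth making concrete: for a known set of deterministic patterns the weights can be computed a priori in closed form, and the trained network restores patterns obscured by noise. The DYSTAL learning rule itself is not given in the abstract, so the sketch below substitutes a classic outer-product (Hebbian) associative memory, explicitly not DYSTAL, to illustrate closed-form weight computation and iterative restoration of a noise-corrupted binary pattern. The two 8-unit bipolar patterns and all variable names are invented for this illustration.

```python
import numpy as np

# Illustrative sketch only: a classic outer-product (Hebbian) associative
# memory, NOT the DYSTAL rule described in the paper. The two 8-unit
# bipolar (+1/-1) patterns below are invented for this example.
patterns = np.array([
    [1, -1, 1, -1, 1, -1, 1, -1],
    [1,  1, 1,  1, -1, -1, -1, -1],
])
n = patterns.shape[1]

# Closed-form weight computation from the known patterns: no iterative
# training, just a sum of outer products, with self-connections zeroed.
W = patterns.T @ patterns / n
np.fill_diagonal(W, 0)

def restore(x, steps=5):
    """Restore a pattern obscured by binary (sign-flip) noise."""
    x = x.astype(float).copy()
    for _ in range(steps):
        x = np.sign(W @ x)   # one sweep costs O(number of connections)
        x[x == 0] = 1.0      # break ties toward +1
    return x

noisy = patterns[0].astype(float)
noisy[0] *= -1               # corrupt one unit with sign-flip noise
recovered = restore(noisy)   # converges back to patterns[0]
```

The weights exist in closed form here only because the stored patterns are known in advance, which is the property the abstract emphasizes; each recall sweep costs time proportional to the number of connections, matching the linear-in-connections scaling claimed for DYSTAL.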




Cite this article

Alkon, D.L., Blackwell, K.T., Barbour, G.S. et al. Pattern-recognition by an artificial network derived from biologic neuronal systems. Biol. Cybern. 62, 363–376 (1990). https://doi.org/10.1007/BF00197642
