ALBERT

All Library Books, journals and Electronic Records Telegrafenberg

  • 1
    Electronic Resource
    Springer
    Biological cybernetics 70 (1993), S. 177-187 
    ISSN: 1432-0770
    Source: Springer Online Journal Archives 1860-2000
    Topics: Biology , Computer Science , Physics
    Notes: Abstract Recurrent neural networks with full symmetric connectivity have been extensively studied as associative memories and pattern recognition devices. However, there is considerable evidence that sparse, asymmetrically connected, mainly excitatory networks with broadly directed inhibition are more consistent with biological reality. In this paper, we use the technique of return maps to study the dynamics of random networks with sparse, asymmetric connectivity and nonspecific inhibition. These networks show three qualitatively different kinds of behavior: fixed points, cycles of low period, and extremely long cycles verging on aperiodicity. Using statistical arguments, we relate these behaviors to network parameters and present empirical evidence for the accuracy of this statistical model. The model, in turn, leads to methods for controlling the level of activity in networks. Studying random, untrained networks provides an understanding of the intrinsic dynamics of these systems. Such dynamics could provide a substrate for the much more complex behavior shown when synaptic modification is allowed.
    Type of Medium: Electronic Resource
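The three dynamical regimes described in this abstract (fixed points, short cycles, very long cycles) can be probed with a minimal sketch: a sparse, asymmetric, excitatory random network with nonspecific global inhibition, iterated until its deterministic binary state repeats. All parameter values here are hypothetical, not taken from the paper.

```python
import numpy as np

def cycle_statistics(n=20, p=0.15, theta=1.5, g=0.1, seed=0):
    """Iterate a sparse, asymmetric, excitatory random network with
    global inhibition until the state repeats; return the transient
    length and the cycle period. Termination is guaranteed because
    the deterministic map acts on a finite state space."""
    rng = np.random.default_rng(seed)
    W = (rng.random((n, n)) < p).astype(float)  # sparse, asymmetric, excitatory
    np.fill_diagonal(W, 0.0)                    # no self-connections
    x = (rng.random(n) < 0.3).astype(float)     # random initial activity

    seen = {}
    t = 0
    while x.tobytes() not in seen:
        seen[x.tobytes()] = t
        # Each neuron fires when excitation minus global inhibition
        # exceeds its threshold.
        x = (W @ x - g * x.sum() > theta).astype(float)
        t += 1
    first = seen[x.tobytes()]
    return first, t - first
```

Sweeping connectivity `p`, threshold `theta`, or inhibition gain `g` over many random seeds is one way to map out which parameter regions yield fixed points versus long, near-aperiodic cycles.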
  • 2
    Electronic Resource
    Springer
    Biological cybernetics 70 (1993), S. 81-87 
    ISSN: 1432-0770
    Source: Springer Online Journal Archives 1860-2000
    Topics: Biology , Computer Science , Physics
    Notes: Abstract This report demonstrates the effectiveness of two processes in constructing simple feedforward networks which perform good transformations on their inputs. Good transformations are characterized by the minimization of two information measures: the information loss incurred with the transformation and the statistical dependency of the output. The two processes build appropriate synaptic connections in initially unconnected networks. The first process, synaptogenesis, creates new synaptic connections; the second process, associative synaptic modification, adjusts the connection strength of existing synapses. Synaptogenesis produces additional innervation for each output neuron until each output neuron achieves a firing rate of approximately 0.50. Associative modification of existing synaptic connections lends robustness to network construction by adjusting suboptimal choices of initial synaptic weights. Networks constructed using synaptogenesis and synaptic modification successfully preserve the information content of a variety of inputs. By recoding a high-dimensional input into an output of much smaller dimension, these networks drastically reduce the statistical dependence of neuronal representations. Networks constructed with synaptogenesis and associative modification perform good transformations over a wide range of neuron firing thresholds.
    Type of Medium: Electronic Resource
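The "statistical dependency of the output" minimized in this abstract can be measured as multi-information: the sum of the per-unit marginal entropies minus the joint entropy, which is zero exactly when the output units are statistically independent. A small illustrative sketch (the estimator and sample data are my own, not the paper's):

```python
import numpy as np
from collections import Counter

def entropy(counts):
    """Shannon entropy (bits) of an empirical distribution given as a
    Counter of occurrence counts."""
    p = np.array(list(counts.values()), dtype=float)
    p /= p.sum()
    return float(-(p * np.log2(p)).sum())

def multi_information(patterns):
    """Sum of marginal entropies minus joint entropy over binary output
    patterns (rows = samples, columns = output units). Zero iff the
    output units are statistically independent."""
    patterns = np.asarray(patterns)
    joint = entropy(Counter(map(tuple, patterns)))
    marginals = sum(entropy(Counter(col)) for col in patterns.T)
    return marginals - joint
```

For instance, the four patterns `[0,0], [0,1], [1,0], [1,1]` seen equally often give multi-information 0 (independent units), while only `[0,0]` and `[1,1]` give 1 bit of redundancy.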
  • 3
    Electronic Resource
    Springer
    Biological cybernetics 67 (1992), S. 469-477 
    ISSN: 1432-0770
    Source: Springer Online Journal Archives 1860-2000
    Topics: Biology , Computer Science , Physics
    Notes: Abstract This study compares the ability of excitatory, feed-forward neural networks to construct good transformations on their inputs. The quality of such a transformation is judged by the minimization of two information measures: the information loss of the transformation and the statistical dependency of the output. The networks that are compared differ from each other in the parametric properties of their neurons and in their connectivity. The particular network parameters studied are output firing threshold, synaptic connectivity, and associative modification of connection weights. The network parameters that most directly affect firing levels are threshold and connectivity. Networks incorporating neurons with dynamic threshold adjustment produce better transformations. When firing threshold is optimized, sparser synaptic connectivity produces a better transformation than denser connectivity. Associative modification of synaptic weights confers only a slight advantage in the construction of optimal transformations. Additionally, our research shows that some environments are better suited than others for recoding. Specifically, input environments high in statistical dependence, i.e. those environments most in need of recoding, are more likely to undergo successful transformations.
    Type of Medium: Electronic Resource
  • 4
    Electronic Resource
    Springer
    Biological cybernetics 71 (1994), S. 461-468 
    ISSN: 1432-0770
    Source: Springer Online Journal Archives 1860-2000
    Topics: Biology , Computer Science , Physics
    Notes: Abstract.  This report continues our research into the effectiveness of adaptive synaptogenesis in constructing feed-forward networks which perform good transformations on their inputs. Good transformations are characterized by the maintenance of input information and the removal of statistical dependence. Adaptive synaptogenesis stochastically builds and sculpts a synaptic connectivity in initially unconnected networks using two mechanisms. The first, synaptogenesis, creates new, excitatory, feed-forward connections. The second, associative modification, adjusts the strength of existing synapses. Our previous implementations of synaptogenesis only incorporated a postsynaptic regulatory process, receptivity to new innervation (Adelsberger-Mangan and Levy 1993a, b). In the present study, a presynaptic regulatory process, presynaptic avidity, which regulates the tendency of a presynaptic neuron to participate in a new synaptic connection as a function of its total synaptic weight, is incorporated into the synaptogenesis process. In addition, we investigate a third mechanism, selective synapse removal. This process removes synapses between neurons whose firing is poorly correlated. Networks that are constructed with the presynaptic regulatory process maintain more information and remove more statistical dependence than networks constructed with postsynaptic receptivity and associative modification alone. Selective synapse removal also improves network performance, but only when implemented in conjunction with the presynaptic regulatory process.
    Type of Medium: Electronic Resource
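Two of the mechanisms named in this abstract, presynaptic avidity and selective synapse removal, can be sketched as simple rules. The functional forms, names, and thresholds below are hypothetical stand-ins; the paper's actual equations are not given in the abstract.

```python
import numpy as np

def avidity(total_out_weight, w_max=5.0):
    """Presynaptic avidity: a neuron's tendency to join a new synaptic
    connection, assumed here to decline linearly with its total
    outgoing synaptic weight and to vanish at w_max."""
    return max(0.0, 1.0 - total_out_weight / w_max)

def remove_uncorrelated(W, corr, tau=0.05):
    """Selective synapse removal: delete existing synapses (nonzero
    entries of W) whose pre/post firing correlation falls below tau."""
    return np.where((corr >= tau) | (W == 0), W, 0.0)
```

In a full simulation, `avidity` would scale the probability that a presynaptic neuron is chosen during synaptogenesis, and `remove_uncorrelated` would be applied periodically after associative modification.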
  • 5
    Electronic Resource
    Springer
    Biological cybernetics 74 (1996), S. 159-165 
    ISSN: 1432-0770
    Source: Springer Online Journal Archives 1860-2000
    Topics: Biology , Computer Science , Physics
    Notes: Abstract.  This paper investigates how noise affects a minimal computational model of the hippocampus and, in particular, region CA3. The architecture and physiology employed are consistent with the known anatomy and physiology of this region. Here, we use computer simulations to demonstrate and quantify the ability of this model to create context codes in sequential learning problems. These context codes are mediated by local context neurons which are analogous to hippocampal place-coding cells. These local context neurons endow the network with many of its problem-solving abilities. Our results show that the network encodes context on its own and then uses context to solve sequence prediction under ambiguous conditions. Noise during learning affects performance, and it also affects the development of context codes. The relationship between noise and performance in a sequence prediction is simple and corresponds to a disruption of local context neuron firing. As noise exceeds the signal, sequence completion and local context neuron firing are both lost. For the parameters investigated, extra learning trials and slower learning rates do not overcome either of the effects of noise. The results are consistent with the important role played, in this hippocampal model, by local context neurons in sequence prediction and for disambiguation across time.
    Type of Medium: Electronic Resource
  • 8
    Electronic Resource
    Springer
    Biological cybernetics 79 (1998), S. 203-213 
    ISSN: 1432-0770
    Source: Springer Online Journal Archives 1860-2000
    Topics: Biology , Computer Science , Physics
    Notes: Abstract. Using computer simulations, this paper investigates how input codes affect a minimal computational model of the hippocampal region CA3. Because encoding context seems to be a function of the hippocampus, we have studied problems that require learning context for their solution. Here we study a hippocampally dependent, configural learning problem called transverse patterning. Previously, we showed that the network does not produce long local context codings when the sequential input patterns are orthogonal, and it fails to solve many context-dependent problems in such situations. Here we show that this need not be the case if we assume that the input changes more slowly than a processing interval. Stuttering, i.e., repeating inputs, allows the network to create long local context firings even for orthogonal inputs. With these long local context firings, the network is able to solve the transverse patterning problem. Without stuttering, transverse patterning is not learned. Because stuttering is so useful, we investigate the relationship between the stuttering repetition length and relative context length in a simple, idealized sequence prediction problem. The relative context length, defined as the average length of the local context codes divided by the stuttering length, interacts with activity levels and has an optimal stuttering repetition length. Moreover, the increase in average context length can reach this maximum without loss of relative capacity. Finally, we note that stuttering is an example of maintained or introduced redundancy that can improve neural computations.
    Type of Medium: Electronic Resource
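The stuttering manipulation and the relative-context-length measure defined in this abstract are easy to make concrete. A minimal sketch (function names are mine; the measure follows the abstract's definition of average local context code length divided by the stuttering length):

```python
def stutter(sequence, r):
    """Present each input pattern r times in a row before moving to the
    next one: the 'stuttering' redundancy discussed in the abstract."""
    return [x for x in sequence for _ in range(r)]

def relative_context_length(context_lengths, r):
    """Average local context code length divided by the stuttering
    repetition length r."""
    return sum(context_lengths) / len(context_lengths) / r
```

For example, `stutter(["A", "B", "C"], 3)` yields the nine-element training sequence `A A A B B B C C C`, even when the underlying patterns A, B, C are mutually orthogonal.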
  • 9
    ISSN: 1432-0770
    Source: Springer Online Journal Archives 1860-2000
    Topics: Biology , Computer Science , Physics
    Notes: Abstract. It is desirable to have a statistical description of neuronal connectivity in developing tractable theories on the development of biological neural networks and in designing artificial neural networks. In this paper, we bring out a relationship between the statistics of the input environment, the degree of network connectivity, and the average postsynaptic activity. These relationships are derived using simple neurons whose inputs are only feed-forward and excitatory, and whose activity is a linear function of their inputs. In particular, we show that only the empirical mean of the pairwise input correlations, rather than the full matrix of all such correlations, is needed to produce an accurate estimate of the number of inputs necessary to attain a prespecified average postsynaptic activity level. Predictions from this work also include distributional aspects of connectivity and activity, as shown by a combination of analysis and simulations.
    Type of Medium: Electronic Resource
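The abstract's claim that only the mean pairwise correlation matters can be seen from the standard identity for the variance of a sum of n inputs with common variance sigma^2 and mean pairwise correlation r_bar: n * sigma^2 * (1 + (n-1) * r_bar). The sketch below uses that identity; treating the variance target as a stand-in for the paper's prespecified activity level is my simplification, since the abstract does not give the exact activity measure.

```python
def summed_input_variance(n, sigma2, r_bar):
    """Variance of the sum of n inputs with common variance sigma2 and
    mean pairwise correlation r_bar. Only the mean correlation enters,
    not the full correlation matrix."""
    return n * sigma2 * (1 + (n - 1) * r_bar)

def inputs_needed(target_var, sigma2, r_bar):
    """Smallest n whose summed-input variance reaches target_var
    (a hypothetical stand-in for a prespecified activity level).
    Assumes r_bar >= 0 so the variance grows without bound."""
    n = 1
    while summed_input_variance(n, sigma2, r_bar) < target_var:
        n += 1
    return n
```

With positively correlated inputs the variance grows faster in n, so fewer afferents are needed to reach a given level: `inputs_needed(10.0, 1.0, 0.5)` returns 4, versus 10 for uncorrelated inputs.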
  • 10
    Electronic Resource
    Springer
    Annals of biomedical engineering 21 (1993), S. 739-740 
    ISSN: 1573-9686
    Source: Springer Online Journal Archives 1860-2000
    Topics: Medicine , Technology
    Type of Medium: Electronic Resource