Oxford, UK and Boston, USA: Blackwell Publishers Ltd
Expert Systems
17 (2000)
ISSN:
1468-0394
Source:
Blackwell Publishing Journal Backfiles 1879-2005
Topics:
Computer Science
Notes:
Connectionist network learning of context-free languages has so far been applied only to very simple cases and has often made use of an external stack. Learning complex context-free languages with a homogeneous neural mechanism looks like a much harder problem. The current paper takes a step toward solving this problem by analyzing context-free grammar computation (without addressing learning) in a class of analog computers called dynamical automata, which are naturally implemented in connectionist networks. The result is a widely applicable method of using fractal sets to organize infinite-state computations in a bounded state space. An appealing consequence is the development of parameter-space maps, which locate various complex computers in spatial relationships to one another. An example suggests that such a global perspective on the organization of the parameter space may be helpful for solving the hard problem of getting connectionist networks to learn complex grammars from examples.
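The core idea in the abstract, organizing an unbounded (infinite-state) stack computation inside a bounded state space via a fractal set, can be illustrated with a minimal sketch. This is not the paper's construction: the function name, the specific affine push/pop maps, and the target language aⁿbⁿ are all illustrative assumptions, chosen because iterating contractive affine maps makes the reachable states a Cantor-like fractal subset of the unit interval.

```python
# Illustrative sketch (not the paper's exact construction): a "dynamical
# automaton" recognizing the context-free language a^n b^n by encoding an
# unbounded counter as a point x in the bounded interval [0, 1].
# Push contracts the state (x -> x/2 + 1/2); pop applies the inverse
# expansion (x -> 2x - 1). Reachable states form a fractal-like subset.

def recognize_anbn(string):
    x = 0.0          # start state: empty stack encoded as x = 0.0
    seen_b = False   # in a^n b^n, no 'a' may follow a 'b'
    for sym in string:
        if sym == 'a':
            if seen_b:
                return False
            x = x / 2 + 0.5      # push: contract into the upper half
        elif sym == 'b':
            seen_b = True
            if x == 0.0:         # pop attempted on an empty stack
                return False
            x = 2 * x - 1        # pop: inverse affine map
        else:
            return False         # symbol outside the alphabet
    return x == 0.0              # accept iff the state returns to 0.0

print(recognize_anbn("aabb"))   # True
print(recognize_anbn("aab"))    # False
```

Because pushes and pops are exact inverses on dyadic rationals, the state returns to 0.0 precisely when every 'a' is matched by a 'b', mimicking how a bounded analog state can carry pushdown structure.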
Type of Medium:
Electronic Resource
URL:
http://dx.doi.org/10.1111/1468-0394.00126