ISSN: 1433-3058
Keywords: Backpropagation; Handwriting; Pattern recognition
Source: Springer Online Journal Archives 1860-2000
Topics: Computer Science, Mathematics
Notes: Abstract: When a neural network is used for an application, data representation and network structure are critical to performance. While most improvements to networks focus on these aspects, we have found that modifying the error function based on current performance can yield significant advantages. We consider here a multilayered network trained by the backpropagation error reduction rule. We also consider a specific task, namely the direct recognition of handwriting patterns, without any feature extraction to optimise the representation used. We show that relaxing the definition of error improves the final performance and accelerates learning. Since the application used in this study has generic qualities, we believe that the results of this numerical experiment are pertinent to a wide class of applications.
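The abstract does not spell out how the error definition is relaxed. One common interpretation is to zero out the error for any output unit that is already within a tolerance of its target, so backpropagation stops adjusting weights for outputs that are "close enough". The sketch below illustrates this idea on a tiny XOR network; the tolerance `EPS`, the network size, and the learning rate are illustrative assumptions, not the paper's actual settings.

```python
import numpy as np

# Assumed relaxation rule: per-output error (y - t) is set to zero whenever
# |y - t| < EPS, so "good enough" outputs contribute no gradient.  All
# hyperparameters here are illustrative, not taken from the paper.
rng = np.random.default_rng(0)
EPS = 0.2   # assumed tolerance for the relaxed error
LR = 0.5    # learning rate

# Toy task: XOR with a 2-4-1 sigmoid network trained by backpropagation.
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
T = np.array([[0], [1], [1], [0]], dtype=float)

W1 = rng.normal(0.0, 1.0, (2, 4)); b1 = np.zeros(4)
W2 = rng.normal(0.0, 1.0, (4, 1)); b2 = np.zeros(1)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def forward(X):
    h = sigmoid(X @ W1 + b1)
    return h, sigmoid(h @ W2 + b2)

initial_err = np.abs(forward(X)[1] - T).mean()

for _ in range(5000):
    h, y = forward(X)

    # Relaxed error: standard residual, zeroed inside the tolerance band.
    e = y - T
    e[np.abs(e) < EPS] = 0.0

    # Standard backpropagation applied to the relaxed error.
    d2 = e * y * (1.0 - y)
    d1 = (d2 @ W2.T) * h * (1.0 - h)
    W2 -= LR * h.T @ d2
    b2 -= LR * d2.sum(axis=0)
    W1 -= LR * X.T @ d1
    b1 -= LR * d1.sum(axis=0)

final_err = np.abs(forward(X)[1] - T).mean()
```

Units inside the tolerance band receive no gradient, which is one plausible mechanism for the accelerated learning the abstract reports: capacity is spent only on the patterns still misclassified.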
Type of Medium: Electronic Resource
URL: http://dx.doi.org/10.1007/BF01414945