ISSN:
0219-3116
Keywords:
Adaptive resonance theory; Fast learning; Field theory; Incremental learning; Machine learning; Neural networks
Source:
Springer Online Journal Archives 1860-2000
Topics:
Computer Science
Notes:
Abstract. In this paper, a fast adaptive neural network classifier named FANNC is proposed. FANNC exploits the advantages of both adaptive resonance theory and field theory. It needs only one-pass learning, and achieves not only high predictive accuracy but also fast learning speed. In addition, FANNC has incremental learning ability: when new instances are fed, it does not need to retrain on the whole training set. Instead, it can learn the knowledge encoded in those instances by slightly adjusting the network topology when necessary, that is, by adaptively appending one or two hidden units and corresponding connections to the existing network. This characteristic makes FANNC well suited to real-time online learning tasks. Moreover, since the network architecture is set up adaptively, FANNC avoids the drawback of most feed-forward neural networks, whose number of hidden units must be determined manually. Benchmark tests show that FANNC is an effective neural network classifier, superior to several other neural algorithms in both predictive accuracy and learning speed.
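The incremental-learning behavior the abstract describes (appending a hidden unit when existing units cannot cover a new instance, rather than retraining) can be illustrated with a minimal sketch. This is not the authors' FANNC algorithm; it is a hypothetical prototype-based stand-in, with an assumed `vigilance` radius playing the role of an ART-style match threshold:

```python
import math

class IncrementalPrototypeClassifier:
    """Toy one-pass incremental learner in the spirit described by the
    abstract: each 'hidden unit' is a stored prototype with a class label.
    A new instance that no existing unit covers causes a new unit to be
    appended; no retraining over past data ever occurs."""

    def __init__(self, vigilance=0.5):
        # Hypothetical match radius (assumption, not a FANNC parameter).
        self.vigilance = vigilance
        self.units = []  # list of (prototype_center, label)

    def fit_one(self, x, y):
        """Present a single training instance (one-pass learning)."""
        for i, (center, label) in enumerate(self.units):
            if label == y and math.dist(x, center) <= self.vigilance:
                # A matching unit exists: nudge its prototype toward x
                # (fast, local adjustment instead of global retraining).
                merged = tuple((c + xi) / 2.0 for c, xi in zip(center, x))
                self.units[i] = (merged, label)
                return
        # No unit covers this instance: append a new hidden unit.
        self.units.append((tuple(x), y))

    def predict(self, x):
        """Classify by the nearest stored prototype."""
        _, label = min(self.units, key=lambda u: math.dist(x, u[0]))
        return label
```

Because each instance either nudges one existing unit or appends one new unit, the network topology grows only as needed, mirroring the paper's claim that the number of hidden units need not be fixed in advance.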
Type of Medium:
Electronic Resource
URL:
http://dx.doi.org/10.1007/s101150050006