Realization of RBF Neural Network with Local and Semi-Local Transfer Function Approximation and Classification

Abstract:

The structure of the Incremental Neural Network (IncNet) is controlled by growth and pruning, matching the network's complexity to the training data. The bicentral (dual radial) transfer function is more flexible than the transfer functions commonly used in artificial neural networks. A recent improvement adds rotation of the transfer function in multi-dimensional space using only N-1 additional parameters. Benchmark results on approximation tasks and on the classification of psychometric data clearly show that the model generalizes better than other classification network models.
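A minimal sketch of the semi-local "dual radial" idea described above: a bicentral transfer function built, in each input dimension, from a pair of opposing sigmoids with per-dimension center, width, and slope. The function name and parameterization here are illustrative assumptions, not the paper's exact formulation.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def bicentral(x, t, b, s):
    """Bicentral (dual radial) transfer function.

    For each input dimension i, multiply a rising sigmoid at t_i - b_i
    by a falling sigmoid at t_i + b_i, giving a soft window. Because
    centers t, widths b, and slopes s are all per-dimension, the unit
    is more flexible than a spherical Gaussian (semi-local behavior).
    """
    x, t, b, s = map(np.asarray, (x, t, b, s))
    rising = sigmoid(s * (x - t + b))          # opens the window on the left
    falling = sigmoid(s * (x - t - b))         # closes the window on the right
    return float(np.prod(rising * (1.0 - falling)))
```

With steep slopes the response is close to 1 inside the per-dimension window around the centers and decays toward 0 outside it; the rotation extension mentioned in the abstract would additionally apply an N-1-parameter rotation to `x - t` before this product.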


Pages:

1628-1632

Online since:

November 2014

Copyright:

© 2014 Trans Tech Publications Ltd. All Rights Reserved

