Complexity Research on Second Class Orthogonal Weight Function Neural Networks

Abstract:

Weight function neural networks are a new kind of neural network developed in recent years. They offer several advantages, such as finding global minima directly, good generalization performance, and the ability to extract useful information inherent in the problem. Time complexity is an important measure of an algorithm. This paper studies the complexity of a neural network that uses second class orthogonal weight functions. The results indicate that the time complexity is linear in the dimensions of the input and output layers and O(n³) in the number of samples. Finally, simulation experiments on time complexity are presented.
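
The preview does not include the paper's algorithm or experiment code, so the following Python sketch is only a hypothetical illustration of how the reported cubic dependence on the number of samples could be checked empirically. The names cubic_training_step and estimate_scaling_exponent are illustrative, and the dense N x N linear solve (an O(N³) operation, e.g. via LU decomposition) is an assumed stand-in for the dominant training step; the terms that are linear in the input and output dimensions are ignored here.

import time
import numpy as np

def cubic_training_step(N, rng):
    # Assumed stand-in for the O(N^3) part of training:
    # solving a dense N x N linear system (LU-based np.linalg.solve).
    A = rng.standard_normal((N, N))
    b = rng.standard_normal(N)
    return np.linalg.solve(A, b)

def timed(fn, *args):
    # Wall-clock time of a single call to fn.
    t0 = time.perf_counter()
    fn(*args)
    return time.perf_counter() - t0

def estimate_scaling_exponent(sizes, repeats=3, seed=0):
    # Fit the slope of log(time) against log(N); a value near 3 indicates cubic scaling.
    rng = np.random.default_rng(seed)
    times = [min(timed(cubic_training_step, N, rng) for _ in range(repeats))
             for N in sizes]
    slope, _intercept = np.polyfit(np.log(sizes), np.log(times), 1)
    return slope

if __name__ == "__main__":
    # Doubling N each time makes the log-log slope easy to read off.
    print("estimated exponent:", round(estimate_scaling_exponent([400, 800, 1600]), 2))

On a typical machine the estimated exponent approaches 3 as N grows, which is the behaviour described by the abstract's O(n³) claim for the sample count.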

Info:

Periodical: Advanced Materials Research (Volumes 989-994)

Pages: 2659-2662

Online since: July 2014

Copyright: © 2014 Trans Tech Publications Ltd. All Rights Reserved
