A New Kind of Infinite Series and its Applications on Neural Networks

Abstract:

A new theorem for function expansion, together with a new infinite series based on it, is proposed. Unlike Taylor's expansion, whose coefficients are constant derivatives evaluated at a single point, the terms of the new series are functions associated with the derivatives. The expansion generated by a function is therefore not polynomial in form, which distinguishes it from Taylor's expansion. Application examples show that the region of convergence is much larger than that obtained with the Taylor series. As one application, a new learning algorithm based on the new expansion for training neural networks is also proposed. Weight functions built from the new expansion are better suited to training on patterns arising from rational problems.
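The abstract's central claim is a larger region of convergence than the Taylor series. The new expansion itself is not specified in this preview, but the Taylor-series baseline it is compared against can be illustrated: the Maclaurin series of ln(1+x) converges only for -1 < x <= 1, and its partial sums blow up outside that interval. A minimal sketch (the function `taylor_log1p` is illustrative, not from the paper):

```python
import math

def taylor_log1p(x, n_terms):
    """Partial sum of the Maclaurin series of ln(1+x):
    sum_{n=1}^{N} (-1)^(n+1) * x^n / n, convergent only for -1 < x <= 1."""
    return sum((-1) ** (n + 1) * x ** n / n for n in range(1, n_terms + 1))

# Inside the region of convergence the partial sums approach ln(1+x).
inside = taylor_log1p(0.5, 50)
print(abs(inside - math.log(1.5)))   # very small residual

# Outside it (x > 1) the terms x^n/n grow without bound and the sums diverge.
outside = taylor_log1p(1.5, 50)
print(abs(outside))                  # very large
```

The paper's proposal, as summarized in the abstract, is a series whose terms are derivative-dependent functions rather than constants, precisely so that this kind of convergence boundary is pushed out.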

Info:

Pages:

761-764

Online since:

October 2013

Copyright:

© 2014 Trans Tech Publications Ltd. All Rights Reserved
