Delay-Dependent H∞ Filtering for Neural Networks with Time Delay

Abstract:

The H∞ filter design problem for recurrent neural networks with time delay is considered. Based on a delay decomposition approach, a delay-dependent condition is derived which ensures that the filtering error system is globally asymptotically stable with a guaranteed H∞ performance. The design of such a filter is then obtained by solving a linear matrix inequality (LMI). A numerical example is provided to demonstrate the effectiveness of the developed approach.

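The abstract does not reproduce the LMI condition itself. As a rough illustration of how LMI-based conditions of this kind are checked numerically, the sketch below tests a standard delay-independent stability LMI for a delayed system x'(t) = A x(t) + A_d x(t − τ) using Python with CVXPY; this is not the paper's delay-decomposition condition, and the matrices A, A_d and the margin eps are illustrative assumptions rather than data from the paper's example.

```python
# Minimal sketch: feasibility check of a standard delay-independent
# stability LMI for x'(t) = A x(t) + Ad x(t - tau).
# Feasibility of P > 0, Q > 0 and
#   [[A'P + PA + Q,  P Ad],
#    [Ad'P,          -Q  ]] < 0
# guarantees global asymptotic stability for any constant delay tau >= 0.
# (NOT the paper's delay-decomposition condition; matrices are illustrative.)
import numpy as np
import cvxpy as cp

n = 2
A = np.array([[-2.0, 0.5],
              [0.3, -3.0]])       # illustrative Hurwitz state matrix
Ad = np.array([[0.1, 0.2],
               [-0.2, 0.1]])      # illustrative delayed-state matrix

P = cp.Variable((n, n), symmetric=True)
Q = cp.Variable((n, n), symmetric=True)
eps = 1e-6                        # small margin to enforce strict inequalities

lmi = cp.bmat([[A.T @ P + P @ A + Q, P @ Ad],
               [Ad.T @ P,            -Q]])

constraints = [P >> eps * np.eye(n),
               Q >> eps * np.eye(n),
               lmi << -eps * np.eye(2 * n)]

# Pure feasibility problem: any solution (P, Q) certifies stability.
prob = cp.Problem(cp.Minimize(0), constraints)
prob.solve()

print("LMI feasible:", prob.status in (cp.OPTIMAL, cp.OPTIMAL_INACCURATE))
```

If the solver reports feasibility, the returned matrices P and Q act as Lyapunov-Krasovskii certificates of global asymptotic stability of the delayed system; the paper's delay-dependent and filtering conditions follow the same LMI-feasibility pattern but with a more refined structure.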

Info:

Pages:

875-879

Online since:

February 2014

Copyright:

© 2014 Trans Tech Publications Ltd. All Rights Reserved
