Research on Natural Language Recognition Algorithm Based on Sample Entropy

Abstract:

Sample entropy reflects both the rate at which new information appears in a signal sequence and the amount of that new information. Taking sample entropy as the feature for speech classification, the paper first extracts the sample entropy of the mixed signal, then computes the mean and variance of each signal's sample entropy, and finally applies K-means clustering for recognition. The simulation results show that the recognition rate can be increased to 89.2% with the sample-entropy-based approach.
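The pipeline the abstract describes (per-frame sample entropy, then mean/variance features, then K-means clustering) could be sketched as below. This is a minimal illustration, not the paper's implementation: the frame length, the template length m = 2, and the tolerance r = 0.2·std are common conventions assumed here, and the K-means routine is a basic farthest-point-initialized version.

```python
import numpy as np

def sample_entropy(x, m=2, r_factor=0.2):
    """SampEn(m, r) of a 1-D signal, with r = r_factor * std(x) (a common convention)."""
    x = np.asarray(x, dtype=float)
    r = r_factor * np.std(x)
    n = len(x)

    def matches(length):
        # Count pairs of templates of the given length whose Chebyshev distance <= r.
        t = np.array([x[i:i + length] for i in range(n - length)])
        c = 0
        for i in range(len(t) - 1):
            d = np.max(np.abs(t[i + 1:] - t[i]), axis=1)
            c += int(np.sum(d <= r))
        return c

    b, a = matches(m), matches(m + 1)
    return np.inf if a == 0 or b == 0 else -np.log(a / b)

def frame_features(signal, frame_len=200):
    """Per-frame sample entropy, summarized by its mean and variance (a 2-D feature)."""
    ents = np.array([sample_entropy(signal[i:i + frame_len])
                     for i in range(0, len(signal) - frame_len + 1, frame_len)])
    return np.array([ents.mean(), ents.var()])

def kmeans(points, k=2, iters=50, seed=0):
    """Minimal K-means with farthest-point initialization; returns a label per point."""
    rng = np.random.default_rng(seed)
    centers = [points[rng.integers(len(points))]]
    for _ in range(1, k):
        d2 = np.min([np.sum((points - c) ** 2, axis=1) for c in centers], axis=0)
        centers.append(points[np.argmax(d2)])
    centers = np.array(centers, dtype=float)
    for _ in range(iters):
        labels = np.argmin(np.linalg.norm(points[:, None] - centers, axis=2), axis=1)
        for j in range(k):
            if np.any(labels == j):
                centers[j] = points[labels == j].mean(axis=0)
    return labels
```

A noise-like signal yields a high sample entropy (new information appears constantly) while a periodic tone yields a low one, so the mean-entropy coordinate alone already separates the two kinds of frames before clustering.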

Info:

Periodical:

Pages:

185-189

Citation:

Online since:

July 2014

Authors:

Copyright:

© 2014 Trans Tech Publications Ltd. All Rights Reserved
