EEG Signal Classification by Global Field Power

Abstract:

Our project focuses on recognizing EEG signals evoked by emotional faces. Since EEG signals contain enough information to distinguish different emotional facial expressions, we propose a new approach to EEG signal classification based on global field power. First, we collect a dataset of EEG signals, measured from subjects aged 20-30 who are stimulated with emotional facial expressions (happy, neutral, sad). Second, the collected EEG signals are preprocessed with a noise-reduction method, and features are selected by principal component analysis (PCA) to filter out redundant information. Finally, using a Fisher classifier with 10-fold cross-validation for training and testing, a good classification rate of 90.49% is achieved when combining EEG signals at the local maxima of the global field power.
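The pipeline above hinges on the global field power (GFP), which is commonly computed as the standard deviation of the electrode potentials across channels at each time point; local maxima of the GFP curve mark moments of strong, stable scalp topography. A minimal sketch of that first step (the channel count, array shapes, and peak-picking rule here are illustrative assumptions, not the paper's exact procedure):

```python
import numpy as np

def global_field_power(eeg):
    """GFP at each time point, taken as the standard deviation of the
    electrode potentials across channels.
    eeg: array of shape (n_channels, n_samples)."""
    return eeg.std(axis=0)

# Toy data standing in for a preprocessed EEG epoch (not real recordings):
rng = np.random.default_rng(0)
eeg = rng.standard_normal((32, 1000))  # 32 channels, 1000 time samples

gfp = global_field_power(eeg)          # shape (1000,)

# Indices of interior local maxima of the GFP curve -- the time points the
# abstract refers to as "local max global field power" signals.
peaks = np.flatnonzero((gfp[1:-1] > gfp[:-2]) & (gfp[1:-1] > gfp[2:])) + 1
```

In a full reproduction, the channel vectors sampled at these GFP peaks would then be reduced with PCA and fed to a Fisher (linear discriminant) classifier under 10-fold cross-validation; the paper does not specify those parameters here, so they are left out of the sketch.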

Info:

Periodical:

Pages:

1434-1437

Online since:

October 2011

Copyright:

© 2012 Trans Tech Publications Ltd. All Rights Reserved
