A Robotic Facial Expression Recognition System Using Real-Time Vision System


Abstract:

The capability of recognizing human facial expressions plays an important role in the development of advanced human-robot interaction. By recognizing facial expressions, a robot can interact with a user in a more natural and friendly manner. In this paper, we propose a facial expression recognition system based on an embedded image-processing platform that classifies facial expressions on-line in real time. A low-cost embedded vision system for robotic applications has been designed and realized using a CMOS image sensor and a digital signal processor (DSP). The current design acquires 640x480 image frames at 30 frames per second (30 fps). The proposed emotion recognition algorithm has been successfully implemented on this real-time vision system. Experimental results on a pet robot show that the robot can interact with a person in a responsive manner. The developed image-processing platform accelerates recognition to 25 recognitions per second, with an average on-line recognition rate of 74.4% for five facial expressions.
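The throughput figures in the abstract (30 fps acquisition, 25 recognitions per second) imply a fixed per-frame time budget for the classifier. A minimal sketch of that budget arithmetic and loop structure in pure Python, with a stub standing in for the DSP-side classifier (all names here are illustrative, not from the paper):

```python
# Sketch of the real-time recognition loop's timing budget.
# The five expression classes are taken from the paper; the
# classifier itself is a placeholder, not the authors' algorithm.
EXPRESSIONS = ["happiness", "anger", "neutral", "surprise", "sadness"]

FRAME_RATE = 30   # camera acquisition rate, frames per second
RECOG_RATE = 25   # reported classifier throughput, recognitions per second


def recognition_budget_ms(rate_hz: float) -> float:
    """Time available per recognition at a given throughput, in ms."""
    return 1000.0 / rate_hz


def classify_stub(frame: list[int]) -> str:
    """Placeholder for the embedded expression classifier."""
    return EXPRESSIONS[sum(frame) % len(EXPRESSIONS)]


# At 25 recognitions/s each classification must finish within 40 ms,
# so roughly 5 of every 30 captured frames go unclassified.
budget_ms = recognition_budget_ms(RECOG_RATE)
unclassified_per_second = FRAME_RATE - RECOG_RATE
```

The point of the sketch is only that the recognition rate, not the capture rate, bounds the per-frame processing time on the embedded platform.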


Info:

Periodical: Key Engineering Materials (Volumes 381-382)

Pages: 375-378

Online since: June 2008

Copyright: © 2008 Trans Tech Publications Ltd. All Rights Reserved

