Static Gesture Recognition Based on RGB-D Depth Information

Abstract:

This paper presents a static gesture recognition method based on RGB-D depth image technology, addressing the problem of static gesture recognition against complex backgrounds. First, a Microsoft Kinect camera is used for data collection, and the hand region is extracted from the complex background via the depth image. Appearance features are then integrated to build a decision tree model, based on the hand region and the largest included angle at the fingertips, for hand gesture recognition. Nine common gestures against complex backgrounds were tested in the system. Experiments show that the method runs efficiently and has strong robustness.
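The first stage the abstract describes, extracting the hand from a complex background using the depth image, is commonly done by keeping only the pixels nearest the camera. The sketch below illustrates that idea; it is a minimal illustration, not the paper's implementation, and the function name `segment_hand`, the depth band of 150 mm, and the synthetic frame are all assumptions.

```python
import numpy as np

def segment_hand(depth_mm: np.ndarray, band_mm: int = 150) -> np.ndarray:
    """Extract a binary hand mask from a Kinect-style depth map (millimetres).

    Assumes the hand is the object closest to the camera: keeps every
    pixel within `band_mm` of the nearest valid depth reading.
    `band_mm` is an illustrative value, not taken from the paper.
    """
    valid = depth_mm > 0                     # 0 marks missing depth readings
    nearest = depth_mm[valid].min()          # closest surface, presumed hand
    mask = valid & (depth_mm <= nearest + band_mm)
    return mask.astype(np.uint8)

# Synthetic depth frame: background at 2000 mm, a "hand" blob at 800 mm.
frame = np.full((120, 160), 2000, dtype=np.uint16)
frame[40:80, 60:100] = 800
mask = segment_hand(frame)
print(mask.sum())  # pixels classified as hand
```

The resulting binary mask would then feed the appearance-feature stage (e.g. fingertip angles) that the decision tree classifies.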

Info:

Pages: 248-254

Online since: October 2014

Copyright: © 2014 Trans Tech Publications Ltd. All Rights Reserved

