Study of Vision-Based Hand Gesture Recognition System for Astronaut Virtual Training


Abstract:

With the rapid development of vision-based hand gesture recognition, it has become feasible to apply the technology to astronaut virtual training. To address the problems of hand gesture recognition in future virtual training and to provide astronauts with unrestricted, natural training, this paper proposes a vision-based hand gesture recognition method and implements a hierarchical gesture recognition system that offers a gesture-driven interactive interface for an astronaut virtual training system. Experimental results show that the recognition system can support astronaut training.
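The preview does not describe the paper's recognition method in detail. As a generic illustration only, a hierarchical (coarse-to-fine) gesture classifier can first decide whether a feature sequence is static or dynamic, then match static poses against templates. Everything below is a hypothetical sketch on synthetic feature vectors; the template names, features, and thresholds are assumptions, not the authors' design.

```python
# Illustrative two-stage (hierarchical) gesture classifier on synthetic
# feature sequences. All names, features, and thresholds are hypothetical;
# the paper's actual method is not described in this preview.
from typing import List, Tuple

# Hypothetical static-pose templates: (gesture name, feature vector).
# Features here stand in for e.g. a finger-count ratio and contour roundness.
TEMPLATES = [
    ("fist", (0.0, 0.2)),
    ("open_palm", (1.0, 0.8)),
    ("point", (0.2, 0.5)),
]

def motion_energy(frames: List[Tuple[float, float]]) -> float:
    """Mean frame-to-frame feature change: a crude static/dynamic cue."""
    if len(frames) < 2:
        return 0.0
    diffs = [sum(abs(a - b) for a, b in zip(f1, f2))
             for f1, f2 in zip(frames, frames[1:])]
    return sum(diffs) / len(diffs)

def classify(frames: List[Tuple[float, float]],
             motion_threshold: float = 0.15) -> str:
    """Stage 1: static vs dynamic. Stage 2: nearest template for static poses."""
    if motion_energy(frames) > motion_threshold:
        return "dynamic"  # a real system would pass this to a trajectory matcher
    # Average the pose features over the sequence, then pick the closest template.
    mean = tuple(sum(f[i] for f in frames) / len(frames) for i in range(2))
    name, _ = min(TEMPLATES,
                  key=lambda t: sum((a - b) ** 2 for a, b in zip(t[1], mean)))
    return name

static_palm = [(1.0, 0.8)] * 5
wave = [(1.0, 0.8), (0.2, 0.5), (1.0, 0.8), (0.2, 0.5)]
print(classify(static_palm))  # open_palm
print(classify(wave))         # dynamic
```

The two-stage split keeps the fast static matcher on the common path and reserves more expensive temporal matching for sequences that actually move, which is the usual motivation for hierarchical recognizers.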


Info:

Periodical: Advanced Materials Research (Volumes 998-999)

Pages: 1062-1065

Online since: July 2014


Copyright: © 2014 Trans Tech Publications Ltd. All Rights Reserved

