Analysis of the Accuracy and Robustness of Kinect Sensor Used with the Scope of Robot Manipulation

Abstract:

This paper describes an evaluation procedure for measuring the accuracy and robustness of the Kinect sensor when detecting the user's hand and recognizing human hand gestures. The recognized gestures are then transferred to a robotic gripper in a virtual environment for visualization. The research started from a review of the current state of methods and sensors used for hand gesture detection. The detection of the human hand and the recognition of its gestures constitute an important research topic in robot control and automation. Over time, different detection methods [7, 13] and sensors [12] have been developed and steadily improved, culminating in the Kinect sensor, which combines mature sensing technology with efficient detection algorithms. The goal of this paper is to analyze whether the Xbox 360 Kinect sensor, developed by Microsoft, is accurate enough to be used as a means of controlling a robotic hand in a virtual environment. To this end, the first objective is to measure the accuracy with which the Kinect sensor captures gestures performed by a human hand in a repeatable manner. The second objective, intended to make the result visible, is to translate the captured gestures into the virtual environment. To achieve the main scope of this paper, two metrics were defined according to ISO measurement standards: accuracy/precision, following ISO 5725:1994 [16], and repeatability, following ISO 21748:2010 [17]. The captured hand posture with the highest accuracy is then applied to a robotic hand simulated in a virtual environment, which executes an object grasping and releasing task.
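To illustrate the two metrics mentioned in the abstract, the following Python snippet is a minimal sketch (not the authors' implementation) of how a bias figure and a repeatability standard deviation in the spirit of ISO 5725 / ISO 21748 could be computed from repeated Kinect measurements of a single hand-joint position. The function name, noise levels, and reference position are illustrative assumptions.

```python
import numpy as np

def accuracy_and_repeatability(samples, reference):
    """Summarize repeated Kinect measurements of one hand-joint position.

    samples   : (n, 3) array of measured XYZ positions (metres) from n repetitions
    reference : (3,) assumed true XYZ position of the joint (metres)

    Returns the bias (trueness), the per-axis spread, and a single
    repeatability standard deviation, following the spirit of ISO 5725 / ISO 21748.
    """
    samples = np.asarray(samples, dtype=float)
    mean_pos = samples.mean(axis=0)

    bias = np.linalg.norm(mean_pos - np.asarray(reference))  # systematic error (trueness)
    per_axis_sd = samples.std(axis=0, ddof=1)                # spread of repeated measurements
    # Single repeatability figure: RMS of the per-axis standard deviations.
    repeatability_sd = float(np.sqrt(np.mean(per_axis_sd ** 2)))

    return {"bias_m": float(bias),
            "per_axis_sd_m": per_axis_sd.tolist(),
            "repeatability_sd_m": repeatability_sd}

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    reference = np.array([0.10, 0.25, 0.80])   # hypothetical true joint position
    # Simulated repeated captures: a small constant bias plus measurement noise.
    captures = reference + np.array([0.005, -0.002, 0.010]) + rng.normal(0, 0.004, size=(30, 3))
    print(accuracy_and_repeatability(captures, reference))
```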

Info:

Pages: 170-177

Online since: June 2014

Copyright: © 2014 Trans Tech Publications Ltd. All Rights Reserved

References:

[1] G. Alenyà, B. Dellen, C. Torras, 3D modelling of leaves from color and ToF data for robotized plant measuring, in Proc. ICRA, 2011, pp. 3408–3414. DOI: 10.1109/icra.2011.5980092

[2] Q. Chen, D. Li, C.K. Tang, Compressive structured light for recovering inhomogeneous participating media, IEEE Transactions on Pattern Analysis and Machine Intelligence, 2013, Vol. 35, No. 3. DOI: 10.1109/tpami.2012.130

[3] L.J. Cronbach, Coefficient alpha and the internal structure of tests, Psychometrika, 1951, Vol. 16, No. 3, pp. 297–334. DOI: 10.1007/bf02310555

[4] F.P.M. Elgendi, N. Magnenat-Thalmann, Real-time speed detection of hand gesture using Kinect, Workshop on Autonomous Social Robots and Virtual Humans, The 25th Annual Conference on Computer Animation and Social Agents (CASA 2012), Singapore, 2012.

[5] M. Guilhaus, Special feature: Tutorial. Principles and instrumentation in time-of-flight mass spectrometry: physical and instrumental concepts, Journal of Mass Spectrometry, 1995, Vol. 30, pp. 1519–1532. DOI: 10.1002/jms.1190301102

[6] M. Hansard, S. Lee, O. Choi, R. Horaud, Time-of-Flight Cameras: Principles, Methods and Applications, SpringerBriefs in Computer Science, Springer, 2012. DOI: 10.1007/978-1-4471-4658-2_3

[7] A.M. Khan, W. Umar, T. Choudhary, F. Hussain, M.H. Yousaf, A new algorithmic approach for fingers detection and identification, Proc. SPIE 8768, International Conference on Graphic and Image Processing, 2013. DOI: 10.1117/12.2011115

[8] M. Kvalbein, The use of a 3D sensor (Kinect™) for robot motion compensation, Master's thesis, University of Oslo, Department of Informatics, Oslo, Norway, 2012.

[9] R. Lockton, Hand Gesture Recognition Using Computer Vision, 4th Year Project Report, Balliol College, Oxford University, Oxford, UK, 2002.

[10] A. Miller, P.K. Allen, Graspit!: A versatile simulator for robotic grasping, IEEE Robotics and Automation Magazine, 2004, Vol. 11, No. 4, pp. 110–122. DOI: 10.1109/mra.2004.1371616

[11] K. Par, O. Tosun, Real-time traffic sign recognition with map fusion on multicore/many-core architectures, Acta Polytechnica Hungarica, 2012, Vol. 9, No. 2.

[12] Z. Ren, J. Meng, J. Yuan, Z. Zhang, Robust hand gesture recognition with Kinect sensor, in Proceedings of the 19th ACM International Conference on Multimedia (MM '11), New York, USA, 2011, pp. 759–760. DOI: 10.1145/2072298.2072443

[13] K. Wang, L. Xu, Y. Fang, J. Li, One-against-all frame differences based hand detection for human and mobile interaction, Neurocomputing (Image Feature Detection and Description), 2013, Vol. 120, pp. 185–191. DOI: 10.1016/j.neucom.2012.06.057

[14] S. Warade, J. Aghav, P. Claude, S. Udayagiri, Real-time detection and tracking with Kinect, Proceedings of the ICCIT, Bangkok, Thailand, 2012, pp. 86–89.

[15] Information on: www.waybeta.com/news/58230/microsoft-kinect-somatosensory-gamedevice-full-disassembly-report-_microsoft-xbox

[16] ISO 5725:1994, Accuracy (trueness and precision) of measurement methods and results.

[17] ISO 21748:2010, Guidance for the use of repeatability, reproducibility and trueness estimates in measurement uncertainty estimation.