Development of a Contact Based Human Arm Motion Analysis System for Virtual Reality Applications

Abstract:

This paper presents the development of a system for analyzing and adapting human arm motion for virtual reality applications. The proposed system consists of a number of flex sensors, inertial measurement units (IMUs) and an embedded data acquisition system that record the joint angles used to derive the kinematic states (position, velocity and acceleration) of the different parts of the human arm. A flexible structure holds the sensors on the arm at the required positions without hindering movement. The embedded circuit uses a 32-bit microcontroller to process the data from the various sensors and transmits the digitized data to a central computer, which computes the kinematic parameters. The system has been tested against a standard motion-tracking device and is found to perform close to the reference device, with an average error of 6%. Such a device can be used to simulate critical operations in medicine and industry and to analyze performance during various tasks.
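
As an illustration of the final processing step described above, the minimal sketch below (not taken from the paper) shows how a central computer could derive kinematic states from sampled joint angles, using a simple two-link planar arm model. The link lengths, sampling period and joint-angle streams are assumed values for demonstration only.

```python
# Minimal sketch (assumed values, not the authors' implementation):
# derive position, velocity and acceleration of the wrist of a 2-link
# planar arm from sampled shoulder and elbow joint angles.
import numpy as np

L1, L2 = 0.30, 0.25   # assumed upper-arm and forearm lengths [m]
DT = 0.01             # assumed sampling period of the acquisition system [s]

def wrist_position(shoulder, elbow):
    """Forward kinematics: planar wrist position from joint angles [rad]."""
    x = L1 * np.cos(shoulder) + L2 * np.cos(shoulder + elbow)
    y = L1 * np.sin(shoulder) + L2 * np.sin(shoulder + elbow)
    return np.stack([x, y], axis=-1)

# Example joint-angle streams, standing in for digitized sensor data
t = np.arange(0.0, 2.0, DT)
shoulder = 0.5 * np.sin(2 * np.pi * 0.5 * t)
elbow = 0.3 * np.sin(2 * np.pi * 0.8 * t)

pos = wrist_position(shoulder, elbow)   # position [m]
vel = np.gradient(pos, DT, axis=0)      # velocity via finite differences [m/s]
acc = np.gradient(vel, DT, axis=0)      # acceleration [m/s^2]

print("final position:", pos[-1], "velocity:", vel[-1], "acceleration:", acc[-1])
```

In practice the joint angles would come from the flex sensors and IMUs rather than synthetic signals, and the velocity and acceleration estimates would typically be filtered before use.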

Info:

Pages: 2139-2144

Online since: July 2014

Copyright: © 2014 Trans Tech Publications Ltd. All Rights Reserved
