Using the Interactive Design of Gesture Recognition in Augmented Reality

Article Preview

Abstract:

Owing to the rapid development of computer hardware, mobile computing devices such as PDAs and high-end mobile phones are now capable of running augmented reality (AR) systems. The mouse- and keyboard-based user interfaces of traditional AR systems may not suit mobile AR systems because of their different hardware interfaces and usage environments. The goal of this research is to propose a novel computer-vision-based human-computer interaction model that is expected to greatly improve the usability of mobile augmented reality. In this research, we conduct an experiment to test the usability of a new gesture-based interface and propose a product evaluation model for e-commerce applications based on that interface. We expect the new interaction model to encourage further commercial applications and research projects. In this paper, we propose a new interaction model called PinchAR, which focuses on adapting interface design to changing hardware designs. This paper summarizes the PinchAR project, that is, the design of an intuitive interaction model for an AR environment, and reports the results of the PinchAR experiments.
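For illustration only: the abstract describes a computer-vision-based, pinch-style gesture interface but does not detail the detection pipeline. The sketch below shows one common way a pinch gesture can be recognized from hand landmarks; the libraries used (MediaPipe Hands with OpenCV capture), the landmark indices, and the distance threshold are all assumptions for illustration, not PinchAR's actual implementation.

import math

import cv2
import mediapipe as mp

PINCH_THRESHOLD = 0.05  # normalized image-space distance; assumed value, tune per camera and setup


def is_pinch(thumb_tip, index_tip, threshold=PINCH_THRESHOLD):
    # A "pinch" is declared when the thumb tip and index fingertip are close together.
    return math.dist((thumb_tip.x, thumb_tip.y), (index_tip.x, index_tip.y)) < threshold


hands = mp.solutions.hands.Hands(max_num_hands=1, min_detection_confidence=0.7)
cap = cv2.VideoCapture(0)

while cap.isOpened():
    ok, frame = cap.read()
    if not ok:
        break
    # MediaPipe expects RGB input; OpenCV captures frames in BGR order.
    result = hands.process(cv2.cvtColor(frame, cv2.COLOR_BGR2RGB))
    if result.multi_hand_landmarks:
        lm = result.multi_hand_landmarks[0].landmark
        if is_pinch(lm[4], lm[8]):  # landmark 4 = thumb tip, 8 = index fingertip
            print("pinch detected -> trigger AR selection event")
    cv2.imshow("camera", frame)
    if cv2.waitKey(1) & 0xFF == 27:  # Esc to quit
        break

cap.release()
cv2.destroyAllWindows()

In a mobile AR setting, the print statement would be replaced by whatever selection or manipulation event the application defines; the thresholding-on-fingertip-distance idea is the general technique being illustrated.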

Info:

Pages: 185-190

Online since: February 2013

Copyright: © 2013 Trans Tech Publications Ltd. All Rights Reserved
