Development of Stationary User Interface Using Head-Tracking

Abstract:

In recent years, a wide variety of user interfaces have been developed. However, operating these interfaces is very difficult for physically handicapped persons who cannot move their hands. The stationary user interface we propose uses head tracking via a camera and a display. It is portable, can operate household appliances, and can be operated intuitively through head movements.
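
The article body is not reproduced on this page, so as a rough illustration only, the sketch below shows one way camera-based head tracking could drive an on-screen cursor: OpenCV's Haar-cascade face detector locates the head in each webcam frame and maps its centre to display coordinates. The detector choice, the screen resolution, and the dwell-to-select idea in the comments are assumptions made for illustration, not details taken from the paper.

# A minimal, hypothetical sketch (not the authors' implementation): locate the
# user's head with OpenCV's Haar-cascade face detector and map its centre to
# display coordinates, as a camera-based head-tracking cursor might.
import cv2

SCREEN_W, SCREEN_H = 1920, 1080  # assumed display resolution

face_cascade = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml")
cap = cv2.VideoCapture(0)  # default webcam

while True:
    ok, frame = cap.read()
    if not ok:
        break
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    faces = face_cascade.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)
    if len(faces) > 0:
        x, y, w, h = max(faces, key=lambda f: f[2] * f[3])  # largest detection
        cx, cy = x + w / 2.0, y + h / 2.0                   # head centre in the image
        # Mirror x so that moving the head left moves the cursor left, then
        # scale the image coordinates to the assumed screen resolution.
        cursor_x = int((1.0 - cx / frame.shape[1]) * SCREEN_W)
        cursor_y = int((cy / frame.shape[0]) * SCREEN_H)
        # A real interface would move a selection cursor here and issue an
        # appliance command when the cursor dwells on an on-screen icon.
        cv2.rectangle(frame, (x, y), (x + w, y + h), (0, 255, 0), 2)
        print(cursor_x, cursor_y)
    cv2.imshow("head tracking sketch", frame)
    if cv2.waitKey(1) & 0xFF == 27:  # Esc to quit
        break

cap.release()
cv2.destroyAllWindows()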

Info:

Periodical:

Applied Mechanics and Materials (Volume 103)

Edited by:

Qiancheng Zhao

Pages:

711-716

DOI:

10.4028/www.scientific.net/AMM.103.711

Citation:

K. Miyata et al., "Development of Stationary User Interface Using Head-Tracking", Applied Mechanics and Materials, Vol. 103, pp. 711-716, 2012

Online since:

September 2011


