Development of an Interface Based on Mouth Open/Close Motions

Abstract:

This paper proposes a new type of interface based on the user's mouth open/close motions. The user's face was captured by a USB camera, and changes in the dark area of the mouth image were used as the interface signal. To reduce processing time, only the image region around the mouth was processed. This focused region followed the movement of the user's face, shifted according to the displacement of the nostrils from their original position. The developed interface was then applied to the operation of a meal-support manipulator; in test operation, it could be used successfully after a few practice sessions.
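The mouth-state detection described above can be sketched as follows. This is a minimal illustration, not the authors' implementation: the threshold values, ROI size, and function names are all assumptions, and the nostril tracking that supplies the displacement is taken as given.

```python
import numpy as np

# Hypothetical parameters -- not taken from the paper.
DARK_THRESHOLD = 60   # grayscale values below this count as "dark" (open mouth cavity)
OPEN_RATIO = 0.15     # dark-pixel fraction above which the mouth is judged "open"

def mouth_is_open(gray, roi_origin, roi_size, nostril_shift):
    """Decide open/closed from the dark-pixel ratio inside a mouth ROI.

    The ROI is translated by the nostril displacement (relative to its
    original position) so that it follows the user's face.
    """
    y0, x0 = roi_origin
    dy, dx = nostril_shift
    h, w = roi_size
    roi = gray[y0 + dy:y0 + dy + h, x0 + dx:x0 + dx + w]
    dark_ratio = np.mean(roi < DARK_THRESHOLD)
    return dark_ratio > OPEN_RATIO

# Synthetic 8-bit frame: bright face with a dark open-mouth patch.
frame = np.full((120, 160), 200, dtype=np.uint8)
frame[70:90, 60:100] = 20  # dark mouth cavity

print(mouth_is_open(frame, (65, 55), (30, 50), (0, 0)))  # ROI over the mouth -> True
```

In a live system the same test would run on each camera frame, with the open/close transitions debounced over a few frames before being issued as a command to the manipulator.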

Pages: 228-232

Online since: October 2010

Copyright:

© 2010 Trans Tech Publications Ltd. All Rights Reserved

References:

[1] T. Ochiai et al., Data Input Device for Physically Handicapped People Operated by Eye Movements, Transactions of the Japan Society of Mechanical Engineers, Series C, Vol. 63, No. 609 (1997), pp.1546-1550.

DOI: 10.1299/kikaic.63.1546

[2] T. Tsuji et al., An EMG Controlled Pointing Device Using a Neural Network, Transactions of the Society of Instrument and Control Engineers, Vol. 37, No. 5 (2001), pp.425-433.

DOI: 10.9746/sicetr1965.37.425

[3] T. Hatakeyama et al., Pointing Device Incorporating the Automatic Location Compensation Feature for People with Severe Physical Disability, Journal of Human Interface Society, Vol. 1, No. 3 (1999), pp.13-20.

[4] N. Nakazawa et al., Development of Welfare Support-Equipment for Personal Computer Operation with Head Tilting and Breathing, Proceedings of the 31st Annual Conference of the IEEE Industrial Electronics Society, (2005-11), pp.201-206.

DOI: 10.1109/iecon.2005.1568904

[5] M. Hashimoto et al., Application of the Teeth Chattering Sound and Humming to the Alternative Switch of a Pointing Device, Transactions of the Institute of Electronics, Information and Communication Engineers, Vol. J84-D-1, No. 7 (2001).

[6] Y. Ichinose et al., Human Interface Using a Wireless Tongue-Palate Contact Pressure Sensor System and Its Application to the Control of an Electric Wheelchair, Transactions of the Institute of Electronics, Information and Communication Engineers, Vol. J86-D-II, No. 2 (2003).

[7] J. S. Ju et al., Vision Based Interface System for Hands Free Control of an Intelligent Wheelchair, Journal of NeuroEngineering and Rehabilitation, Vol. 6, No. 33 (2009), pp.1-17.
