Controlling a Social Robot - Performing Nonverbal Communication through Facial Expressions

Abstract:

This paper presents our work on achieving emotional behaviour in the new version of the social robot Probo. Nonverbal communication with children is enhanced through facial expressions, eye tracking and face-to-face contact. The new robot has 21 degrees of freedom (DOF), grouped into five subsystems: eyes, ears, trunk, mouth and neck. The robotic head is actuated exclusively by servo motors, and all components are manufactured with low-cost, flexible and readily available technologies. To enable the robot head to express emotions, a Graphical User Interface (GUI) was developed in which facial expressions are composed with sliders and push buttons. Additionally, we investigated controlling the robot with an Arduino board. In this configuration, pre-programmed or learned behaviours give the robot a semi-autonomous level of operation: using input from various sensors, it can express six basic emotions (happiness, sadness, fear, anger, surprise and disgust) and react to sensor feedback accordingly, enhancing human-robot interaction (HRI).
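As an illustration of the control approach described in the abstract, the minimal Arduino sketch below maps emotion commands received over a serial link (as a GUI with sliders or push buttons might send them) to preset servo poses. The pin assignments, the one-character serial protocol and the pose angles are assumptions made for this example and are not taken from the paper.

```cpp
// Illustrative sketch: drive a few head servos to preset poses for the
// six basic emotions, selected by a one-character command over serial.
// Pins, protocol and pose table are hypothetical, not Probo's actual setup.

#include <Servo.h>

Servo eyebrowServo;   // example servo from the "eyes" subsystem
Servo mouthServo;     // example servo from the "mouth" subsystem
Servo earServo;       // example servo from the "ears" subsystem

// Preset angles (degrees) for each emotion: {eyebrow, mouth, ear}.
// Order: happiness, sadness, fear, anger, surprise, disgust.
const int POSES[6][3] = {
  {120, 150,  90},   // happiness
  { 60,  40,  50},   // sadness
  {140,  70,  30},   // fear
  { 40,  60,  60},   // anger
  {150, 160, 120},   // surprise
  { 50,  80,  70}    // disgust
};

void applyPose(int emotion) {
  eyebrowServo.write(POSES[emotion][0]);
  mouthServo.write(POSES[emotion][1]);
  earServo.write(POSES[emotion][2]);
}

void setup() {
  Serial.begin(9600);
  eyebrowServo.attach(3);   // assumed PWM pins
  mouthServo.attach(5);
  earServo.attach(6);
  applyPose(0);             // start in the "happiness" pose
}

void loop() {
  // Characters '0'..'5' select an emotion, as a GUI slider panel might send.
  if (Serial.available() > 0) {
    char c = Serial.read();
    if (c >= '0' && c <= '5') {
      applyPose(c - '0');
    }
  }
}
```

In a semi-autonomous mode, the same applyPose() routine could be triggered by sensor readings instead of serial commands from the GUI.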

Pages: 525-530

Online since: November 2013

Copyright: © 2014 Trans Tech Publications Ltd. All Rights Reserved
