Design of Robot Head for Expression of Human Emotion

A humanoid robot is a robot designed in human form with the purpose of improving the quality of human life. The key features of a humanoid robot are performing human-like behaviours and interacting effectively with a human operator. Facial expressions play an important role in natural human-robot communication, as human communication in daily life relies on face-to-face interaction. The purpose of this study was to develop an interactive robot head able to express the six basic human emotions of Ekman's model: joy, sadness, anger, disgust, surprise and fear. Combinations of action units based on different control points on the robot head were proposed in this study. The new robot head is provided with 11-DoFs to perform the different expressions in a human-like way. A survey was conducted on twelve sets of emotion designs drawn in SolidWorks. Each design was evaluated for its expressive ability, and the best design for each emotion was selected for implementation on the robot head. A hardware experiment was conducted to control the LCD display and the positions of the servo motors using an Arduino Leonardo as the controller of the robot head system. Additionally, a keypad controller was designed so that the user can select the expression of the robot head. The controller is connected to an LCD display that shows the name of the facial expression, for the learning purposes of children with autism. This project focuses on the performance of the robot head in terms of position accuracy for the 11 actuators used in its construction. The results show that the relative position error of each robot head part was less than 10%, so the robot head is able to perform the emotions effectively. A survey on the recognition rate of each emotion expression was conducted individually with 100 respondents; the recognition rate for all six emotions expressed by the robot head was more than 70%.


Introduction
Currently, in the research on humanoid robots, the design and construction of the robot head is one of the determinant factors affecting the effectiveness of communication between human and robot [1][2][3][4][5][6][7][8][9]. Different robot head designs produce different levels of understandability for the user. Thus, when designing the appearance concept of a robot head, two issues have to be considered [1]: how to design a face that conveys understandable expressions, and how to make a friendly face that does not cause the user to react adversely to it.
Ekman's Facial Action Coding System (FACS) is a psychological framework introduced by Paul Ekman that can be used for generating facial expressions on a robot face [2]. Appropriate control regions are selected to generate adequate facial expressions. In FACS, Action Units (AUs) represent the movements of facial muscles and are divided into 44 basic components. The action units are formed by combinations of different control points and are used to produce various facial expressions. The six basic expressions recognized in Ekman's model are joy, sadness, anger, disgust, surprise and fear.
In this project, 10 AUs and two fixed units are used to generate the six basic facial expressions. To perform the motion of these 10 AUs, 17 control points were selected, and their directions of movement on the facial skin were determined empirically.
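The mapping from an emotion to its AU combination can be sketched as a simple lookup table. The AU numbers below are the combinations commonly cited in the FACS literature (e.g., joy as AU 6 + 12); they are illustrative only, since the paper does not list the exact 10 AUs used on this robot head:

```cpp
#include <map>
#include <string>
#include <vector>

// Illustrative emotion -> Action Unit table. These AU combinations are the
// ones commonly cited in the FACS literature; the exact AU set chosen for
// this robot head may differ.
const std::map<std::string, std::vector<int>> kEmotionAUs = {
    {"joy",      {6, 12}},              // cheek raiser + lip corner puller
    {"sadness",  {1, 4, 15}},           // inner brow raiser + brow lowerer + lip corner depressor
    {"anger",    {4, 5, 7, 23}},        // brow lowerer + lid raiser/tightener + lip tightener
    {"disgust",  {9, 15, 16}},          // nose wrinkler + lip depressors
    {"surprise", {1, 2, 5, 26}},        // brow raisers + upper lid raiser + jaw drop
    {"fear",     {1, 2, 4, 5, 20, 26}}, // brow raisers/lowerer + lid raiser + lip stretcher + jaw drop
};

// Returns the AU combination for an emotion, or an empty vector if unknown.
std::vector<int> actionUnitsFor(const std::string& emotion) {
    auto it = kEmotionAUs.find(emotion);
    return it == kEmotionAUs.end() ? std::vector<int>{} : it->second;
}
```

A table like this keeps the emotion-to-muscle mapping in one place, so each expression is reproduced by driving the control points assigned to its AUs.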
The Kobian-R robot head has many degrees of freedom for its movement, namely 24-DoFs [3]. The number of degrees of freedom represents the movable range of the parts of the robot head; however, it does not represent the effectiveness of the facial expression for each emotion. This can be seen where the happiness, sadness and fear expressions of the robot head "Flobi" in [1] achieve higher recognition rates than those of Kobian-R, although Flobi has only 18-DoFs of movement.
G. Wu et al. discussed the type of material used to manufacture the facial skin structure of a robot head with high similarity to human skin [4]. A flexible face film is able to emphasize the muscle movements and results in higher accuracy of facial expression. As a comparable human-like skin structure, the SAYA robot head in [5] is covered with a soft urethane skin driven by McKibben-type pneumatic actuators, which move the skin much as human muscles do. This shows that the mechanism driving the skin structure also plays an important role in performing more realistic facial expressions.
One obvious advantage of the Flobi robot head over other designs is that its appearance can be changed depending on the situation [1]. In addition, Flobi's more comic-like human face provides the advantage of familiarity, and its facial expressions are more pleasant and more readily accepted by humans during interaction.
K. Berns et al. developed the UKL robot head, which can obtain trajectories in emotion space through the implementation of a face detection system [6]. This system enables real-time face tracking, so the robot head is able to respond adequately to the emotion shown by the user. In [7], the construction of the robot head covered only the neck and eye regions. The dimensions and weight of a real human head were taken into account in the i-RoK construction; that paper provides the head dimensions for males and females to consider in robot head design.
This research project concentrates on the development of a robot head design with a friendlier appearance. The appearance of the robot head is one of the main factors that influence the effectiveness of learning for children with autism. The robot head is able to express the six basic emotions (joy, sadness, anger, disgust, surprise and fear) under the control of the user, and the name of the corresponding expression is shown on the LCD display. Fig. 1 shows the block diagram of the robot head expression control. The main input to the robot head system is the user's selection on the IR remote keypad controller. The robot head expresses the emotion requested by the user, and the LCD display shows the name of that expression. The whole process is controlled by an Arduino Leonardo with preset programming and an expression database.
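The flow in Fig. 1 can be sketched as a dispatch from keypad selection to the expression name shown on the LCD. This is a minimal sketch: the key codes 0-6 are hypothetical placeholders, since the paper does not list the actual IR codes of the remote:

```cpp
#include <string>

// Hypothetical mapping from IR keypad codes 0..6 to the seven selectable
// expressions (six emotions plus neutral). The real IR codes depend on the
// remote used; 0..6 here are placeholders for illustration.
std::string expressionName(int keyCode) {
    static const std::string kNames[7] = {
        "neutral", "joy", "sadness", "anger", "disgust", "surprise", "fear"};
    if (keyCode < 0 || keyCode >= 7) return "unknown";
    return kNames[keyCode];
}
```

On the robot, the same lookup would select both the servo pose to drive and the string printed on the LCD, keeping the two outputs consistent.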

International Integrated Engineering Summit 2014

eyebrows, eyelids and lips are taken into consideration to avoid any mechanical constraint arising during the construction of the robot head. This robot head mechanism consists of 11-DoFs to express the facial expressions for the 6 basic emotions. The manufacturing process is the most difficult part of the whole construction of the robot head. The combination of the mechanisms of each part on the robot head is important to perform more understandable facial expressions.

Eyeballs. To perform human-like eye movement, an ideal design would give the eyes three degrees of freedom of motion, using the 3-axis gimbal system shown in Fig. 2. However, the gimbal system used in this project has only two axes of free movement, which allow the eyes to move left-right and up-down simultaneously. A human eye can also rotate slightly about the direction of gaze, but this DoF is neglected here, as the pitch and yaw axes are sufficient to cover the visual space. Each eyeball of the robot has a diameter of 40 mm, and the separation between the two eyes is 60 mm. The complete construction of one eye weighs 250 g, including the motors and the gimbal mechanism. Table 1 shows the details of the eyeball motion mechanism.

Eyelids. The eyelids consist of the upper eyelid only, with one DoF for the open and close motion. The up-down motion of the eyelid is controlled by a link mechanism driven by an RC micro motor. The linkage rotates with the motor in either the clockwise or anticlockwise direction to adjust the eyelid motion. The arrows in Fig. 3 indicate the direction of motion of the eyelid.
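Whatever pose a gimbal axis is commanded to reach must stay inside its mechanical range. A minimal clamp for this, assuming a 0-120° yaw range (the eye sweep quoted later for the servo test) and a hypothetical pitch range, could look like:

```cpp
#include <algorithm>

// Mechanical limit of one gimbal axis, in degrees.
struct AxisLimit {
    double minDeg;
    double maxDeg;
};

// Clamp a commanded angle to the axis range before driving the servo.
// The 0-120 degree yaw range follows the paper's note that the eye sweeps
// 120 degrees from left to right; any pitch range here would be an assumption.
double clampAngle(double commandDeg, AxisLimit limit) {
    return std::min(limit.maxDeg, std::max(limit.minDeg, commandDeg));
}
```

Clamping every command in software protects the linkage from being driven against its mechanical stops.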
Eyebrows. This robot has four control points for the eyebrows, placed at the ends of each eyebrow: one control point at the inner end and one at the outer end. Each control point is driven directly by a servo motor. The eyebrow has limited motion along the x-axis, while the outer and inner control points have a vertical motion range of ±40°. The eyebrows are molded from Septon; the elasticity of this flexible Septon film allows the eyebrow to move over the required area and form different shapes. Each eyebrow has a length of 37.5 mm and a width of 7.5 mm.
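One convenient way to drive such a control point is to express the brow position as a normalized parameter and map it into the ±40° range. This is a sketch of an assumed linear mapping, not the paper's actual control law:

```cpp
// Map a normalized eyebrow position u in [-1, 1] to a servo command within
// the +/-40 degree vertical range reported for the inner and outer brow
// control points. The linear mapping is an assumption; the real linkage
// between servo horn and Septon film may be nonlinear.
double eyebrowAngleDeg(double u) {
    if (u > 1.0) u = 1.0;    // saturate out-of-range commands
    if (u < -1.0) u = -1.0;
    return 40.0 * u;
}
```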
Mouth. This robot has 2-DoFs at the mouth (Z-axis motion at the centre of the upper and lower lip). The lips change shape and form the expression through the combined motion of the upper and lower lip. The corners of the upper and lower lips are two fixed points that prevent motion of the lips at those parts. A control point is placed at the centre of the upper lip and of the lower lip for up and down motion. The servo motors are connected to the upper and lower lips through short wire linkages, as shown in Fig. 4.

Control system. The control system of the robot head is built around an Arduino Leonardo circuit board. The database used to generate each expression is loaded into the Arduino controller during the design phase. The robot facial expressions are programmed and can be selected with the IR keypad controller. The keypad controller sends a signal to the Arduino controller, which maps it to the corresponding motion program downloaded to the controller. The actuators are activated when the button for the selected facial expression is pressed on the IR keypad controller. A total of seven facial expressions, including the neutral expression, can be selected on the keypad controller. The LCD display shows the name of the expression when the related button is pressed.
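The preset database mentioned above can be pictured as a table of servo poses, one row per expression. The angle values below are hypothetical placeholders, as the paper does not publish the stored pose angles; only the structure (11 actuator targets per expression, with neutral as fallback) reflects the described system:

```cpp
#include <array>
#include <map>
#include <string>

// One pose = target angles (degrees) for the robot head's 11 actuators
// (eyes, eyelids, eyebrows, lips). The values below are hypothetical
// placeholders; the real poses live in the Arduino Leonardo's preset database.
using Pose = std::array<int, 11>;

const std::map<std::string, Pose> kPoses = {
    {"neutral", {90, 90, 90, 90, 90, 90, 90, 90, 90, 90, 90}},
    {"joy",     {90, 90, 60, 60, 120, 120, 90, 90, 70, 110, 90}},
};

// Look up the pose for an expression; fall back to neutral if not found,
// so an unrecognized keypad selection never leaves the servos uncommanded.
Pose poseFor(const std::string& expression) {
    auto it = kPoses.find(expression);
    return it == kPoses.end() ? kPoses.at("neutral") : it->second;
}
```

On the actual hardware, each element of the returned pose would be written to its servo (e.g., via the Arduino Servo library's `write()`), and the expression name would be printed to the LCD at the same time.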

Result and discussion
Servo motor selection accuracy test. Initially, three models of servo motor were available for use: the SG90 micro servo, the HD1800A micro servo and the C40R RC hobby servo. This experiment tested the accuracy of these three servo models. The objective was to measure the performance of the actuators and decide which model provides the most reliable and accurate angle values for the robot head motions that express the emotions. The experiment covered commanded angles from 0° to 180° and was repeated three times to obtain the average measured angle for each servo model. The mean, error and relative percent error were calculated to analyze which servo model has better accuracy and is more suitable for use in the robot head construction. The results are shown in Fig. 5. From Fig. 5, it is found that among the three servo models, the HD1800A micro servo has the most precise angle motion, especially in the range from 0° to 90°. It is also the most suitable actuator for large angle motions above 90°; this can be applied to the eye part of the robot head, which requires an angle motion of 120° from left to right. The SG90 micro servo has slightly lower accuracy than the HD1800A, but it still produces accurate angle values in the range of 0° to 70°. This motor is suitable for the lips, as the lip motion does not exceed 60°, so it can still provide high accuracy for the robot head expressions. The C40R RC hobby servo produces accurate angle values only at 60° and below.
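The statistics used in this test (mean of the repeated measurements, absolute error against the commanded angle, and relative percent error) can be computed as follows; this is a generic sketch of those standard formulas, not code from the experiment itself:

```cpp
#include <cmath>
#include <vector>

// Accuracy statistics for one commanded angle of one servo model.
struct AccuracyResult {
    double mean;             // average of the repeated measured angles
    double error;            // |mean - commanded| in degrees
    double relPercentError;  // 100 * error / commanded
};

AccuracyResult evaluate(double commandedDeg, const std::vector<double>& measuredDeg) {
    double sum = 0.0;
    for (double m : measuredDeg) sum += m;
    double mean = sum / static_cast<double>(measuredDeg.size());
    double error = std::fabs(mean - commandedDeg);
    // Relative percent error is undefined at a commanded angle of 0 degrees;
    // report 0 there rather than dividing by zero.
    double rel = (commandedDeg != 0.0) ? 100.0 * error / commandedDeg : 0.0;
    return {mean, error, rel};
}
```

For example, three measurements of 89°, 90° and 88° against a 90° command give a mean of 89°, an absolute error of 1° and a relative percent error of about 1.1%, comfortably inside the sub-10% threshold reported for the robot head parts.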


Conclusion and future works
The design of a robot head able to perform the 6 basic expressions through the combination of 13 action units is presented in this paper. The robot head is made up of eyes, nose, mouth and ears, which are the major elements of a normal human head. An experiment was also carried out to determine the performance of the servo motors at the different actuation points. In the future, an accuracy test for each part (eyes, eyebrows and mouth) will be done. It is also intended to integrate the robot head with a real humanoid robot and analyse the overall performance of the system.