Emotion Expression
Biped Humanoid Robot
KOBIAN-RII

  1. Objective
  2. Hardware Overview
  3. Emotional Expression
  4. Research of KOBIAN
  5. Previous Studies
  6. Acknowledgment

1. Objective

This research aims to provide RT (Robot Technology) for ADL (Activities of Daily Living) support, and to clarify the influence and effectiveness of physicality and expressivity in interaction between human beings and robots.

Fig. 1.1 illustrates RT services supporting ADL in existing real environments such as homes and public facilities. General users have no professional knowledge of robotics, so interaction with the robot must be intuitive for them. We therefore propose providing such services with a biped humanoid robot capable of emotional expression: biped walking lets the robot operate in real environments without special adjustment, and emotional expression similar to a human's makes the robot easy for users to understand intuitively.

Fig. 1.1 RT supporting ADL.

We have been developing the WE series of head robots in our laboratory since 1995. The humanoid robot WE-4RII (Fig. 1.2), which has an upper body and the ability to express emotions, was developed in 2004. We have also been developing the WABIAN series of humanoid robots since 1996. In 2007 we developed WABIAN-2R (Fig. 1.3), whose dimensions and proportions are based on those of an adult female; it can simulate human movements such as walking and dancing.

Furthermore, also in 2007, we developed KOBIAN, which integrates the WE series' upper body capable of emotional expression with the WABIAN series' lower body capable of biped walking. In 2010, we developed small motor controller modules and used them to build a new KOBIAN head with many DoFs. In 2013, we developed KOBIAN-RII, whose head is equipped with mechanisms that can display cartoon marks.

Fig. 1.2 WE-4RII.

Fig. 1.3 WABIAN-2R.

2. Hardware Overview

In 2013, we developed the Emotion Expression Biped Humanoid Robot KOBIAN-RII. It has 68 DoFs and many sensors that serve as sense organs, and its body houses the control computer and batteries. The total weight is 62 kg.

Fig. 2.1 KOBIAN-R (2011).

Fig. 2.2 DoF configuration.

 Degrees of Freedom (DoF)
   Head      27
   Neck      4
   Waist     2
   Trunk     1
   Arms      7×2
   Hands     4×2
   Legs      6×2
   Total     68
 Sensors     CMOS camera, condenser microphone, force sensor, gas sensor, 6-axis force/torque sensor
 Actuators   DC servo motor, ultrasonic actuator
 Batteries   Li-ion battery

Related papers
[1] Nobutsuna Endo and Atsuo Takanishi, "Development of Whole-body Emotional Expression Humanoid Robot for ADL-assistive RT services," Journal of Robotics and Mechatronics, Vol. 23, No. 6, Fuji Technology Press, December 2011.

2.1 Head

The head is equipped with 27 DoFs, including 8 for the eyebrows, 8 for the eyes and eyelids, 7 for the lips, and 1 for the jaw, to perform facial expressions. The head is the size of an adult Japanese female's, and all the mechanisms needed to move the facial parts (eyebrows, eyelids, etc.) are housed inside the head. To show cartoon-style expressions, KOBIAN-RII also has full-color LED displays and dedicated mechanisms, which allow it to display the cartoon marks used in Japanese comics. These marks help Japanese users recognize the robot's emotions.

Fig. 2.3 Isometric view.

Fig. 2.4 DoF configuration.

Fig. 2.5 Mechanism.

Fig. 2.6 Mechanism of cartoon expression.

Related papers
[1] T. Kishi, et al., “Development of Expressive Robotic Head for Bipedal Humanoid Robot," Proceedings of the 2012 IEEE/RSJ International Conference on Intelligent Robots and Systems, pp. 4584-4589, Vilamoura, Algarve, Portugal, October, 2012.
[2] T. Kishi et al., "Development of Expressive Robotic Head for Bipedal Humanoid Robot with Wide Moveable Range of Facial Parts and Facial Color," Proceedings of the 19th CISM-IFToMM Symposium on Robot Design, Dynamics and Control (ROMANSY2012), pp. 151-158, Paris, France, June, 2012.

(1) Eyebrow, Forehead

The eyebrows are molded from Septon, produced by KURARAY CO., LTD. Each eyebrow is controlled at four equally spaced points, and each control point is driven by a magnet through the cover.

The facial color is controlled with a blue EL (electroluminescence) sheet placed on the forehead under the cover.

Fig. 2.7 Mechanism of eyebrow.

Fig. 2.8 Motion of eyebrow.

(2) Eyelid

The eyelids consist of two independent upper eyelids and a joined lower eyelid. Each upper eyelid has 1 DoF for opening and closing and 1 DoF for rotation; the lower eyelid has 1 DoF for opening and closing.

The rotation of the upper eyelids is realized by a wire-driven mechanism with an ultrasonic motor.

Fig. 2.9 Mechanism of eyelid.

Fig. 2.10 Motion of eyelid.

(3) Eye

The eyes of KOBIAN-RII have 3 DoFs (a common pitch axis and an independent yaw axis for each eye), realized by a gimbal structure.

Fig. 2.11 Mechanism of eyes.

(4) Mouth

The lips are also molded from Septon, produced by KURARAY CO., LTD. The lip shape is changed by pulling from five directions: one fixed point on the cheek cover, two active control points on the upper lip, and two active control points on the lower lip. The active control points can move in the Y and Z directions.
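As a rough illustration of this control-point scheme, the sketch below represents each active lip control point as a (Y, Z) displacement from the neutral pose. The point names and millimeter values are hypothetical, not KOBIAN-RII's actual control parameters.

```python
# Minimal sketch of lip-shape control via four active control points.
# Point names and millimeter values are illustrative assumptions only.

NEUTRAL = {"upper_left": (0.0, 0.0), "upper_right": (0.0, 0.0),
           "lower_left": (0.0, 0.0), "lower_right": (0.0, 0.0)}

# Target (Y, Z) displacements [mm] for an example "smile" shape:
# corners pulled outward (Y) and upward (Z).
SMILE = {"upper_left": (-3.0, 2.0), "upper_right": (3.0, 2.0),
         "lower_left": (-3.0, 1.0), "lower_right": (3.0, 1.0)}

def interpolate(pose_a, pose_b, t):
    """Linearly blend two lip poses, t in [0, 1]."""
    return {name: tuple(a + t * (b - a)
                        for a, b in zip(pose_a[name], pose_b[name]))
            for name in pose_a}

if __name__ == "__main__":
    half_smile = interpolate(NEUTRAL, SMILE, 0.5)
    for name, (y, z) in half_smile.items():
        print(f"{name}: Y={y:+.1f} mm, Z={z:+.1f} mm")
```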

Fig. 2.12 Mechanism of mouth.

Fig. 2.13 Motion of mouth.

(5) Cartoon expression mechanisms (sheets)

Black lines and black wrinkles are revealed by moving flexible sheets behind the white cover (Fig. 2.14). The forehead has a 7-layer structure: a light-diffusing sheet, an LED display for cartoon marks, and an EL sheet for facial color expression, sandwiched between the outer and inner layers of the cover (Fig. 2.15).

Fig. 2.14 Mechanism of cartoon expression

Fig. 2.15 Layer structure of forehead

(6) Cartoon expression mechanisms (LED display)

Five LED displays are installed (Fig. 2.16). Thanks to flexible display boards, they can be curved along the outer cover of the face. In addition, thanks to full-color LEDs produced by ROHM that are among the smallest in the world, cartoon marks can be displayed at high resolution (Fig. 2.17).
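As a loose illustration of how such a display might be driven, the sketch below renders a simple cartoon-style mark into an RGB frame. The 16×16 resolution and the frame layout are assumptions; the actual display hardware and its driver are not described here.

```python
# Sketch: preparing a cartoon-mark frame for a small full-color LED matrix.
# The 16x16 resolution and frame format are assumptions, not the real
# display interface.

RED, OFF = (255, 0, 0), (0, 0, 0)

def cross_mark(size=16, thickness=2):
    """Build an RGB frame with a simple diagonal cross mark."""
    frame = [[OFF for _ in range(size)] for _ in range(size)]
    for y in range(size):
        for x in range(size):
            if abs(x - y) < thickness or abs(x + y - (size - 1)) < thickness:
                frame[y][x] = RED
    return frame

if __name__ == "__main__":
    # Text preview of the frame that would be pushed to the display.
    for row in cross_mark():
        print("".join("#" if px == RED else "." for px in row))
```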

Fig. 2.16 Placement of LED display

Fig. 2.17 LED display

2.2 Neck

The joint between the neck and the head has pitch and yaw axes, while the joint between the torso and the neck has pitch and roll axes. As in WE-4RII, pitch axes at both the top and bottom of the neck enable the robot to stick its neck out and pull it back while keeping the face orientation fixed.
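A minimal sketch of this compensation, under the simplifying assumption that the face pitch is the sum of the two neck pitch angles (a planar approximation of the real kinematics):

```python
import math

# Sketch: stick the neck out while keeping the face orientation fixed.
# Assumes face pitch = lower_pitch + upper_pitch (planar approximation).

def neck_pitches_for_protrusion(lower_pitch_deg):
    """Tilt the lower neck pitch joint forward and cancel it at the
    upper (head-side) pitch joint so the face stays level."""
    upper_pitch_deg = -lower_pitch_deg
    return lower_pitch_deg, upper_pitch_deg

def forward_offset(neck_length_m, lower_pitch_deg):
    """Approximate horizontal displacement of the head for a given tilt."""
    return neck_length_m * math.sin(math.radians(lower_pitch_deg))

if __name__ == "__main__":
    lower, upper = neck_pitches_for_protrusion(20.0)
    print(f"lower pitch {lower:+.0f} deg, upper pitch {upper:+.0f} deg")
    print(f"head moves forward ~{forward_offset(0.12, lower)*1000:.0f} mm "
          "(assuming a 120 mm neck link)")
```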

Fig. 2.18 Neck.

2.3 Arms

KOBIAN-RII has anthropomorphic arms with 7 DoFs each. Each arm consists of a shoulder (pitch, yaw, and roll axes), an elbow (pitch axis), and a wrist (pitch, yaw, and roll axes). With this human-like configuration, the robot can avoid obstacles by exploiting its redundant DoF.

Fig. 2.19 Anthropomorphic arms.

2.4 Hands

Besides grasping, the hand is used for communication with humans, such as handshaking and pointing. Since KOBIAN-RII is intended to support ADL, we emphasized safety and the feeling of the hand when touched. Based on these concepts, we developed the hand WSH-1RII (Waseda Soft Hand No. 1 Refined II), made of soft materials (Fig. 2.20). Furthermore, so that it can shake hands with appropriate force, WSH-1RII is equipped with 14 sheet-like force sensors (Fig. 2.21, Fig. 2.22).
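As a rough sketch of how such sensors could be used for a handshake, the example below adjusts a grip command so the summed contact force approaches a target. The target force, gain, and toy contact model are illustrative assumptions, not WSH-1RII's actual controller.

```python
# Sketch: handshake grip regulation from sheet-like force sensors.
# The sensor count (14) follows the text; thresholds, gains, and the
# contact model are illustrative assumptions.

TARGET_FORCE_N = 8.0   # assumed comfortable total handshake force
GAIN = 0.05            # assumed proportional gain [command units per N]

def grip_command(sensor_forces_n, current_cmd):
    """Proportional adjustment of the wire-pull command so the summed
    contact force approaches the target."""
    error = TARGET_FORCE_N - sum(sensor_forces_n)
    return max(0.0, current_cmd + GAIN * error)

if __name__ == "__main__":
    cmd = 0.2
    for step in range(5):
        readings = [0.6 * cmd] * 14   # toy contact model: force grows with cmd
        cmd = grip_command(readings, cmd)
        print(f"step {step}: total={sum(readings):.2f} N -> cmd={cmd:.3f}")
```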

Fig. 2.20 Hands(WSH-1RII).

Fig. 2.21 Sheet-like force sensor.

Fig. 2.22 Position of sensors.

The palm is made of polyurethane rubber with metal pipes mimicking bones. The fingers are made of silicone rubber and covered with a skin of polyurethane rubber (Fig. 2.23).

Fig. 2.23 Palm internal structure.

Related papers
[1] N. Endo et al., "Development of Anthropomorphic Soft Robotic Hand WSH-1RII," Proceedings of the 19th CISM-IFToMM Symposium on Robot Design, Dynamics and Control (ROMANSY2012), pp. 175-182, Paris, France, June, 2012.

(1) Finger Mechanism

As shown in Fig. 2.24, each finger is bent by an underactuated wire-driven mechanism. As shown in Fig. 2.25, the bending of the forefinger, the bending of the middle finger, and the internal/external rotation of the thumb are each configured as an independent 1-DoF motion, while the thumb, ring finger, and little finger are bent together by a single shared wire. When the wire is pulled, the fingers conform to the shape of the object, as shown in Fig. 2.26. Each finger is extended by an elastic material attached to its inner side.
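The following sketch illustrates the underactuation idea: a shared wire pull is distributed across the joints, and a joint that contacts the object stops while the remaining joints keep bending. The pull-to-angle ratio, joint limits, and contact model are placeholders, not WSH-1RII's real parameters.

```python
# Sketch: underactuated wire drive for one finger with three joints.
# When the wire is pulled, bending is distributed to the joints in order;
# a joint that hits the object stops and the remaining joints keep
# curling, so the finger conforms to the object's shape.

def flex_finger(pull_mm, contact_angles_deg, joint_limit_deg=90.0,
                deg_per_mm=10.0):
    """Return joint angles after pulling the shared wire by pull_mm.

    contact_angles_deg[i] is the angle at which joint i touches the
    object (use joint_limit_deg if it never touches)."""
    budget = pull_mm * deg_per_mm            # total bending to distribute
    angles = []
    for stop in contact_angles_deg:
        angle = min(budget, stop, joint_limit_deg)
        angles.append(angle)
        budget -= angle
    return angles

if __name__ == "__main__":
    # Joint 1 meets the object at 30 deg; the distal joints keep curling.
    print(flex_finger(12.0, contact_angles_deg=[30.0, 90.0, 90.0]))
```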

Fig. 2.24 Finger mechanism.

Fig. 2.25 Hands DoF configuration.

Fig. 2.26 Conforming grasp.

(2) Abduction/Adduction Mechanism

To grasp a spherical object, humans abduct the thumb so that it opposes the little finger; for a cylindrical grasp, we adduct the thumb so that it opposes the middle finger. The internal/external rotation of the thumb, implemented with a 1-DoF wire-driven mechanism, therefore enables stable grasping of such objects (Fig. 2.27, Fig. 2.28).
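As a small illustration of this rule, the sketch below selects a thumb rotation target from the object shape; the angle values are hypothetical.

```python
# Sketch: choosing the 1-DoF thumb rotation target from the object shape,
# following the rule described above. Angle values are placeholders.

THUMB_ROTATION_DEG = {"spherical": 60.0,    # abduct: oppose little finger
                      "cylindrical": 25.0}  # adduct: oppose middle finger

def thumb_target(object_shape):
    return THUMB_ROTATION_DEG[object_shape]

if __name__ == "__main__":
    for shape in ("spherical", "cylindrical"):
        print(f"{shape} object -> thumb rotation {thumb_target(shape):.0f} deg")
```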

Fig. 2.27 Abduction/adduction mechanism.

Fig. 2.28 Abduction/adduction.

2.5 System configuration

The system configuration of KOBIAN-RII is presented in Fig. 2.29. The control system is a hybrid: a centralized control system in the body, run by a main PC (CPU: Pentium M 1.6 GHz, RAM: 2 GB, OS: QNX Neutrino 6.3.0), and a distributed control system in the head, consisting of the main PC and seven new motor controller units.
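The sketch below illustrates the main-PC side of such a hybrid scheme: head-joint targets are routed to motor controller units, each handling a few motor channels. The message format, the unit/channel assignment, and the bus interface are hypothetical, not KOBIAN-RII's actual protocol.

```python
# Sketch: routing head-joint targets to distributed motor controller
# units. Seven units with four channels each (as in the hardware) can
# cover the head joints; the command structure itself is an assumption.

from dataclasses import dataclass

@dataclass
class HeadCommand:
    unit_id: int        # which of the 7 motor controller units
    channel: int        # motor channel on that unit (0-3)
    position: float     # target position [deg]

def route_head_targets(targets):
    """Map flat head-joint targets onto (unit, channel) commands,
    assuming joints are assigned to units in groups of four."""
    return [HeadCommand(i // 4, i % 4, pos) for i, pos in enumerate(targets)]

if __name__ == "__main__":
    head_targets = [0.0] * 24      # e.g. a neutral face pose
    for cmd in route_head_targets(head_targets)[:5]:
        print(cmd)
```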

Fig. 2.29 System configuration of KOBIAN-RII.

(1) Motor controller unit

We developed a motor controller unit composed of a controller module and motor driver modules. Exchanging the motor driver modules allows various components to be controlled and reduces the debugging time of the communication and control subsystems. Each unit can control four brushed DC or ultrasonic motors and read eight extra analog sensor channels. Its dimensions are 46 × 30 × 18 mm, which is very small considering its many functions; these compact units are what allow KOBIAN-RII's head to match the size of an adult female's head.

Fig. 2.30 Motor controller unit

Fig. 2.31 Schematic view of motor controller unit

 CPU            STM32F103VG
 Motors         4 DC / ultrasonic motors
 Voltage [V]    8–52
 Control modes  Current / velocity / position
 Sensors [ch]   4 encoders, 4 photosensors, 4 current sensors, 4 A/D converters
 Size [mm]      46 × 30 × 18
 Weight [g]     20
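As a rough illustration of the three control modes listed above, the sketch below chains them as cascaded loops in one discrete-time update; the gains and limits are placeholders, not the firmware's actual implementation.

```python
# Sketch of the cascaded position -> velocity -> current control modes,
# as one discrete-time update. Gains and limits are illustrative
# assumptions only.

def cascaded_step(pos_ref, pos, vel, current,
                  kp_pos=5.0, kp_vel=0.8, kp_cur=2.0, i_max=3.0):
    """Each outer loop produces the reference for the next inner loop."""
    vel_ref = kp_pos * (pos_ref - pos)          # position loop -> velocity ref
    cur_ref = kp_vel * (vel_ref - vel)          # velocity loop -> current ref
    cur_ref = max(-i_max, min(i_max, cur_ref))  # current limit for safety
    voltage = kp_cur * (cur_ref - current)      # current loop -> motor voltage
    return voltage

if __name__ == "__main__":
    print(f"command voltage: {cascaded_step(10.0, 0.0, 0.0, 0.0):.2f} V")
```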

Related papers
[1] T. Otani, et al., “Development of Distributed Control System and Modularized Motor Controller for Expressive Robotic Head," Proceedings of the 19th CISM-IFToMM Symposium on Robot Design, Dynamics and Control (ROMANSY2012), pp. 183-190, Paris, France, June, 2012.

3. Emotional Expression

We use Ekman's six basic facial expressions for the robot's facial control, and have defined seven facial patterns: "Happiness", "Anger", "Disgust", "Fear", "Sadness", "Surprise", and "Neutral". KOBIAN-RII can express these with its whole body.
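One simple way to realize such predefined patterns is as named presets of normalized joint targets, blended from the neutral pose by an intensity value. The sketch below follows that idea; the per-part values are placeholders, not the patterns tuned on the robot.

```python
# Sketch: the seven expression patterns as named presets of normalized
# facial-part targets. All numeric values are placeholders.

PATTERNS = {
    "Neutral":   {"eyebrow": 0.0,  "eyelid": 0.6, "lip_corner": 0.0},
    "Happiness": {"eyebrow": 0.2,  "eyelid": 0.5, "lip_corner": 0.9},
    "Anger":     {"eyebrow": -0.8, "eyelid": 0.4, "lip_corner": -0.5},
    "Disgust":   {"eyebrow": -0.4, "eyelid": 0.5, "lip_corner": -0.7},
    "Fear":      {"eyebrow": 0.7,  "eyelid": 0.9, "lip_corner": -0.3},
    "Sadness":   {"eyebrow": 0.5,  "eyelid": 0.3, "lip_corner": -0.8},
    "Surprise":  {"eyebrow": 0.9,  "eyelid": 1.0, "lip_corner": 0.1},
}

def expression_targets(emotion, intensity=1.0):
    """Blend from Neutral toward the selected pattern by intensity."""
    base, goal = PATTERNS["Neutral"], PATTERNS[emotion]
    return {k: base[k] + intensity * (goal[k] - base[k]) for k in base}

if __name__ == "__main__":
    print(expression_targets("Happiness", intensity=0.7))
```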

(a) Happiness (b) Fear
(c) Surprise (d) Sadness
(e) Anger (f) Disgust
(g) Neutral

Fig. 3.1 Emotion expression of KOBIAN / facial expressions of KOBIAN-RII

Movies of emotion expression

 『Emotion expression』
 (KOBIAN-R Head)
 (.MPEG 12[s] 4.40[MB])

 KOBIAN-R expresses emotions using facial expression.

 『Emotion expression』
 (KOBIAN whole-body)
 (.MPEG 113[s] 34.8[MB])

 KOBIAN expresses emotions using its whole body.

Related papers
[1] M. Zecca et al., “Design of the humanoid robot KOBIAN - preliminary analysis of facial and whole body emotion expression capabilities," Proceedings of the 8th IEEE-RAS International Conference on Humanoid Robots (Humanoids 2008), pp. 487-492, Daejeon, S. Korea, December, 2008.

4. Research of KOBIAN

4.1 Visual tracking based on the vestibulo-ocular reflex with a biped walking humanoid robot (2010)

Personal robots will become more and more popular in the future and will be required to actively collaborate and live with their human partners. Such robots must recognize a changing environment and act appropriately, as humans do. Object tracking is a fundamental function from the viewpoint of environmental sensing and of reflex reactions to it. We developed an object-tracking motion algorithm using the upper body, then integrated it with an online walking-pattern generator to obtain an object-tracking biped locomotion algorithm. Finally, we conducted an experimental evaluation and confirmed its effectiveness.
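The core idea can be sketched as an eye-velocity command combining a visual tracking term with a term canceling the measured head rotation. The gains below are illustrative, and this is not the published controller itself:

```python
# Sketch of vestibulo-ocular-reflex-style gaze stabilization: the eye
# command tracks the target while a VOR term counter-rotates the eyes
# against measured head motion. Gains are illustrative assumptions.

def eye_velocity_cmd(target_err_deg, head_vel_deg_s,
                     k_track=2.0, k_vor=1.0):
    """Track the target while compensating head motion (VOR term)."""
    return k_track * target_err_deg - k_vor * head_vel_deg_s

if __name__ == "__main__":
    # Target 3 deg off-center while the head swings at 20 deg/s
    # (e.g. from walking sway): the VOR term counter-rotates the eyes.
    print(f"eye velocity: {eye_velocity_cmd(3.0, 20.0):+.1f} deg/s")
```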

Fig. 4.1 Visual tracking

Related papers
[1] N. Endo et al., “Integration of Emotion Expression and Visual Tracking Locomotion based on Vestibulo-Ocular Reflex," Proceedings of the 19th IEEE International Symposium on Robot and Human Interactive Communication, pp. 593-598, Viareggio, Italy, September, 2010.

4.2 Method to select a comfortable walking trajectory (2013)

For a robot to navigate unknown environments, it must actively scan the environment with its sensors; this is important both for minimizing perception uncertainty and for reducing the probability of collisions. We proposed a gazing strategy that minimizes the uncertainty of the 3D reconstruction of the environment along the robot's planned trajectory. We also proposed a new 3D reconstruction approach for stereo camera systems that better integrates measurements over time, via a novel formulation of occupancy grids that includes all information retrieved from stereo.
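The occupancy-grid machinery underlying this can be sketched with the standard log-odds update, fusing per-measurement occupancy probabilities over time. The simple measurement probabilities below stand in for the paper's whole-cost-curve formulation:

```python
# Sketch: log-odds occupancy-grid update for one cell. Each cell stores
# log-odds of occupancy and fuses per-measurement probabilities; the
# placeholder measurement model is not the paper's stereo formulation.

import math

def logit(p):
    return math.log(p / (1.0 - p))

def update_cell(log_odds, p_meas):
    """Fuse one stereo-derived occupancy probability into a cell."""
    return log_odds + logit(p_meas)

def occupancy(log_odds):
    return 1.0 / (1.0 + math.exp(-log_odds))

if __name__ == "__main__":
    cell = 0.0                       # prior: 0.5 occupancy
    for p in (0.7, 0.8, 0.6):        # three consistent stereo measurements
        cell = update_cell(cell, p)
    print(f"fused occupancy: {occupancy(cell):.2f}")
```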

Fig. 4.2 Method to select a comfortable walking trajectory

Related papers
[1] M. Brandao et al., “Active Gaze Strategy for Reducing Map Uncertainty along a Path," Proceedings of the 3rd IFToMM International Symposium on Robotics and Mechatronics (ISRM 2013), pp. 455-466, Singapore, October, 2013.
[2] M. Brandao et al., “Integrating the whole cost-curve of stereo into occupancy grids," Proceedings of the 2013 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS 2013), pp. 4681-4686, Tokyo, Japan, November, 2013.

4.3 Facial expression generation (2012)

There is a need to extend the rigid concept of patterns based on only six emotions. We developed a facial expression generation system for the humanoid robot KOBIAN-R, based on an extension of Plutchik's model of emotions and on polynomial classifiers. It can produce thousands of combinations of facial cues to represent expressions of composite emotions and communicative acts, including asymmetrical expressions.
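The mixing idea can be sketched as a weighted combination of basic-emotion cue vectors with an asymmetry factor per face side. The cue vectors below are placeholders, and this linear blend only stands in for the actual polynomial-classifier system:

```python
# Sketch: composite-emotion expression as a weighted mixture of basic
# emotion cue vectors, with an asymmetry factor applied per face side.
# Cue values are placeholders; the real system is classifier-based.

BASIC_CUES = {            # placeholder cue vectors: (brow, eyelid, lip)
    "joy":      (0.2, -0.1, 0.9),
    "trust":    (0.1, -0.2, 0.4),
    "surprise": (0.9, 0.8, 0.1),
    "sadness":  (0.5, -0.6, -0.8),
}

def mix(weights):
    """Weighted sum of basic cue vectors (weights should sum to ~1)."""
    cues = [0.0, 0.0, 0.0]
    for emotion, w in weights.items():
        for i, v in enumerate(BASIC_CUES[emotion]):
            cues[i] += w * v
    return tuple(cues)

def asymmetric(cues, left_scale=1.0, right_scale=0.5):
    """Apply different intensities to each side of the face."""
    return ([c * left_scale for c in cues], [c * right_scale for c in cues])

if __name__ == "__main__":
    love = mix({"joy": 0.6, "trust": 0.4})   # a composite-emotion example
    print("composite cues:", love)
    print("asymmetric (left, right):", asymmetric(love))
```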

Fig. 4.3 Process of facial expression generation

Related papers
[1] G. Trovato et al., “Generation of Humanoid Robot's Facial Expressions for Context-Aware Communication," International Journal of Humanoid Robotics, Vol. 10, Issue 01, 23 pages, March, 2013.
[2] G. Trovato et al., “Development of Facial Expressions Generator for Emotion Expressive Humanoid Robot," Proceedings of the 2012 IEEE-RAS International Conference on Humanoid Robots, pp. 303-308, Osaka, Japan, November, 2012.
[3] G. Trovato et al., “Evaluation Study on Asymmetrical Facial Expressions Generation for Humanoid Robot," Proceedings of the 1st International Conference on Innovative Engineering Systems, pp. 129-134, Egypt, December, 2012.

4.4 Cultural differences of emotion recognition with cartoon marks (2012)

A cultural gap in the recognition of facial expressions was found. A further study examined culture-dependent generation of facial expressions and introduced the innovative use of Japanese comic symbols displayed on the face. Symbols displayed on the face are an additional channel of communication: they can improve the recognition rate for Japanese viewers, but they may not always work for Westerners.

Fig. 4.4 Facial expression with cartoon marks

Related papers
[1] G. Trovato et al., “A Cross-Cultural Study on Generation of Culture Dependent Facial Expressions of Humanoid Social Robot," Proceedings of the 4th International Conference on Social Robotics, pp. 35-44, Chengdu, China, October, 2012.
[2] G. Trovato et al., “Cross-Cultural Perspectives on Emotion Expressive Humanoid Head: Recognition of Facial Expressions and Symbols," International Journal of Social Robotics, Vol. 5, Issue 4, pp. 515-527, November, 2013.

4.5 Differences of impression in cross-cultural greeting using a humanoid robot (2013)

How are robots perceived when they speak or use gestures that belong to a certain national culture? We studied a greeting interaction between Egyptians and Japanese with two versions of KOBIAN (one speaking Japanese, one speaking Arabic). Culture-dependent acceptance and discomfort were found: Egyptians preferred the “Arabic robot” and reported discomfort when interacting with the “Japanese robot”, while the opposite held for Japanese subjects.

Greeting of Japan

Greeting of Egypt

Fig. 4.5 Greeting KOBIAN

Related papers
[1] G. Trovato et al., “Cross-cultural study on human-robot greeting interaction: acceptance and discomfort by Egyptians and Japanese," Journal of Behavioral Robotics, 11 pages, October, 2013.
[2] G. Trovato et al., “Towards Culture-specific Robot Customisation: A Study on Greeting Interaction with Egyptians," Proceedings of the 22nd IEEE International Symposium on Robot and Human Interactive Communication, pp. 447-452, Gyeongju, Korea, August, 2013.

4.6 Dynamic emotional expression based on a mental model (2013)

In this research we implemented a mental model in a walking humanoid robot, allowing the emotional state of the robot to change dynamically in response to external stimuli; the emotional state affects the robot's decisions and behavior and is expressed with both facial and whole-body patterns. To evaluate the importance of the proposed system for human-robot interaction and communication, we conducted a survey in which subjects watched videos of the robot's behaviors. The results show that integrating dynamic emotion expression with locomotion makes the humanoid robot more appealing to humans: it is perceived as more "favorable" and "useful", and less "robot-like."
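A minimal sketch of such a mental model: an emotion state vector decays toward neutral, is pushed by external stimuli, and its dominant component selects the expressed pattern. The state dimensions, gains, and thresholds are illustrative assumptions:

```python
# Sketch of a mental-model update: the emotion state decays toward
# neutral and is pushed by stimuli; the dominant component selects the
# expressed pattern. Dimensions, gains, and decay are placeholders.

def update_emotion(state, stimulus, decay=0.9, gain=0.5):
    """One time step: decay toward neutral, then add the stimulus."""
    return {k: decay * state[k] + gain * stimulus.get(k, 0.0) for k in state}

def dominant(state, threshold=0.3):
    """Name of the strongest emotion, or Neutral if all are weak."""
    k, v = max(state.items(), key=lambda kv: kv[1])
    return k if v > threshold else "Neutral"

if __name__ == "__main__":
    state = {"Happiness": 0.0, "Surprise": 0.0, "Sadness": 0.0}
    # A sudden visual stimulus (e.g. a fast-moving tracked object):
    state = update_emotion(state, {"Surprise": 1.0})
    print(state, "->", dominant(state))
```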

Front

Overview

Fig. 4.6 Emotional expression during visual tracking (surprise)

Movies of emotion expression during visual tracking

 『Emotion expression during visual tracking』
 (.MPEG 16[s] 9.20[MB])

 KOBIAN expresses emotion during visual tracking

Related papers
[1] T. Kishi et al., “Impression Survey of the Emotion Expression Humanoid Robot with Mental Model based Dynamic Emotions," Proceedings of the 2013 IEEE International Conference on Robotics and Automation, pp. 1655-1660, Karlsruhe, Germany, May, 2013.

4.7 Emotional walking (2013)

Emotions are expressed not only through facial expressions or voice (tone and speech); nonverbal behavior such as gait also greatly influences how emotions are perceived. For this research we captured motions with different emotions and emotional intensities from professional actors and non-actor subjects. We extracted emotional gait parameters from the captured data and modeled them to work with our Emotion Mental Model. We assessed the result both in simulation and on the real robot: subjects recognized the expressed emotions well and appreciated the expressiveness of the robot.
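The parameterization idea can be sketched as scaling per-emotion gait-parameter offsets by an intensity value. The baseline gait and the offsets below are placeholders, not the parameters extracted from the motion-capture data:

```python
# Sketch: modulating walking-pattern parameters by emotion and intensity.
# Baseline values and per-emotion offsets are illustrative placeholders.

BASE = {"step_length_m": 0.20, "cadence_spm": 90.0, "torso_pitch_deg": 0.0}

OFFSETS = {   # full-intensity offsets from the neutral gait
    "happiness": {"step_length_m": 0.05, "cadence_spm": 15.0,
                  "torso_pitch_deg": -3.0},
    "sadness":   {"step_length_m": -0.06, "cadence_spm": -20.0,
                  "torso_pitch_deg": 8.0},
}

def emotional_gait(emotion, intensity):
    """Scale the emotion's offsets by intensity in [0, 1]."""
    return {k: BASE[k] + intensity * OFFSETS[emotion][k] for k in BASE}

if __name__ == "__main__":
    print(emotional_gait("sadness", 0.8))
```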

Happiness walking

Sadness walking

Fig. 4.7 Emotional walking

Related papers
[1] M. Destephe et al., “Emotional Gait Generation Method based on Emotion Mental Model - Preliminary experiment with Happiness and Sadness -," Proceedings of the 10th International Conference on Ubiquitous Robots and Ambient Intelligence (URAI2013), pp. 86-89, Jeju, Korea, October, 2013.
[2] M. Destephe et al., “Conveying Emotion Intensity with Bio-inspired Expressive Walking -Experiments with Sadness and Happiness-," Proceedings of the 22nd IEEE International Symposium on Robot and Human Interactive Communication, pp. 161-166, Gyeongju, Korea, August, 2013.
[3] M. Destephe et al., “The Influences of Emotional Intensity for Happiness and Sadness on Walking," Proceedings of the 35th Annual International Conference of the IEEE Engineering in Medicine and Biology Society, pp. 7452-7455, Osaka, Japan, July, 2013.
[4] M. Destephe et al., "Improving the Human-Robot Interaction through Emotive Movements - A Special Case: Walking -," Proceedings of the 8th ACM/IEEE International Conference on Human-Robot Interaction, pp. 115-116, Tokyo, Japan, March, 2013.

4.8 Laughter induction of biped humanoid robot (2013)

Human-robot interaction requires each partner to have an idea of the other's psychological state (theory of mind). The objective of this research is to influence the robot's human partner by making him or her laugh with KOBIAN. Experimental results show that KOBIAN can induce human laughter. In addition, a decrease in "Depression-Dejection" and "Anger-Hostility" (POMS parameters) in the subjects' state of mind was confirmed.

Fig. 4.8 A short skit of KOBIAN

Skit movie of KOBIAN

 『Skit Movies』
 (.MPEG 60[s] 74[MB])

 KOBIAN plays a skit

5. Previous Studies

KOBIAN/HABIAN (2009)
  • The whole-body emotional expression robot KOBIAN was developed.
  • The emotional expression recognition rates of WE-4RII and KOBIAN were compared.
  • The recognition rate of emotional expression using the whole body was examined.
  • The hands WSH-1 and WSH-1R, made of soft materials, were developed.
  • The wheeled emotional expression robot HABIAN was developed,
    and emotional expression recognition rates were compared between the biped and wheeled versions.
  • Detail


6. Acknowledgment

This work was conducted under NEDO's strategic project for developing advanced robot technology (research project on service robots: communication RT system for the elderly).

We gratefully acknowledge the support for this work from the Humanoid Robotics Institute (HRI) at Waseda University, RoboCasa, the Global COE Program "Global Robot Academia", and the RoboSoM project of the European FP7 program (ICT-2009-4).

Finally, we also thank SolidWorks Japan K.K., tmsuk Co., Ltd., DYDEN Corporation, NikkiFron Co., Kuraray Co., Ltd., Chukoh Chemical Industries, STMicroelectronics, and the Waseda Research Institute for Science and Engineering.

Humanoid Robotics Institute, Waseda University
WABOT-HOUSE Laboratory, Waseda University
The New Energy and Industrial Technology Development Organization (NEDO)
SolidWorks Japan K.K.
tmsuk Co., Ltd.
DYDEN Corporation
NikkiFron Co.
Kuraray Co., Ltd.
Chukoh Chemical Industries
STMicroelectronics Co.