Emotion Expression
Biped Humanoid Robot
KOBIAN-RIII

  1. Objective
  2. Hardware Overview
  3. Research on KOBIAN
  4. Previous Studies
  5. Acknowledgment

1. Objective

This research aims to provide RT (Robot Technology) for ADL (Activities of Daily Living) support, as well as to clarify the influence and effectiveness of physicality and expressiveness during interactions between human beings and robots.

Fig. 1.1 shows the application of RT services supporting ADL in existing real environments such as households and public facilities. However, as general users usually do not have professional knowledge of robotics, it is necessary to achieve intuitive interaction with the robots. Consequently, we propose to provide RT services through biped humanoid robots capable of emotional expression, mainly for two reasons: a biped humanoid can move and work in environments built for humans, and emotional expressiveness makes the interaction intuitive even for users without technical knowledge.

Fig. 1.1 RT supporting ADL

We have been developing the WE series of head robots in our laboratory since 1995. The humanoid robot WE-4RII (Fig. 1.2), equipped with an upper body and the ability to express emotions, was developed in 2004. In parallel, we have been developing the WABIAN series of humanoid robots since 1996; in 2007 we developed WABIAN-2R (Fig. 1.3), whose dimensions and proportions are based on an average Japanese adult female and which can simulate human movements such as walking and dancing.

In the same year, 2007, we developed KOBIAN, which integrates WE's upper body, capable of emotional expression, with WABIAN's lower body, capable of biped walking. In 2011, the development of small motor controller modules enabled us to add degrees of freedom (DoF) to the head, leading to a new KOBIAN head with an increased capability to express emotions. In 2013, we developed KOBIAN-RII, an emotion expression robot whose head is equipped with a mechanism that expresses emotions using cartoon marks. In 2014, we developed KOBIAN-RIII, which adds 2 DoFs to the shoulder base and improves the speed and movable range of the arms; these improvements were confirmed to be effective for inducing human laughter.

Fig. 1.2 WE-4RII

Fig. 1.3 WABIAN-2R

2. Hardware Overview

In 2014, we developed the Emotion Expression Biped Humanoid Robot KOBIAN-RIII. It has 68 DoFs and numerous sensors that serve as its sense organs. The computer that controls its motions and the batteries that power it are both mounted on its body. The total weight of KOBIAN-RIII is 67 kg.

Fig. 2.1 KOBIAN-R (2011)

Fig. 2.2 DoF configuration.

Degrees of Freedom (DoF)
  Head: 27
  Neck: 4
  Waist: 2
  Trunk: 1
  Arms: 6 × 2
  Legs: 6 × 2
  Total: 58
Sensors: CMOS camera, condenser microphone, force sensor, gas sensor, 6-axis force/torque sensor
Actuators: DC servo motor, ultrasonic actuator
Batteries: Li-ion battery
Weight [kg]: 67

Related papers
[1] Nobutsuna Endo and Atsuo Takanishi, "Development of Whole-body Emotional Expression Humanoid Robot for ADL-assistive RT services," Journal of Robotics and Mechatronics, Vol. 23, No. 6, Fuji Press, December 20, 2011.

The head is equipped with 27 DoFs (8 for the eyebrows, 8 for the eyes and eyelids, 7 for the lips, and 1 for the jaw) to perform facial expressions. The head is the same size as that of an average adult Japanese female, and all the mechanisms needed to move the head parts (eyebrows, eyelids, etc.) are placed inside the head. To express emotions using cartoon marks, KOBIAN-RII uses five full-color LED displays. Thanks to these, it can display cartoon marks similar to those used in Japanese comics. These marks were shown to improve recognition of the robot's emotions by Japanese users.

Fig. 2.3 Isometric projection.

Fig. 2.4 DoF configuration.

Fig. 2.5 Mechanism location.

Fig. 2.6 Mechanisms for cartoon mark expression.

Related papers
[1] T. Kishi et al., "Development of Expressive Robotic Head for Bipedal Humanoid Robot," Proceedings of the 2012 IEEE/RSJ International Conference on Intelligent Robots and Systems, pp. 4584-4589, Vilamoura, Algarve, Portugal, October, 2012.
[2] T. Kishi et al., "Development of a comic mark based expressive robotic head adapted to Japanese cultural background," Proceedings of the 2014 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS 2014), pp. 2608-2613, Chicago, USA, September 2014.
[3] T. Kishi et al., "Development of a bipedal humanoid robot head capable of rich facial expressions through wide movable ranges of facial parts and facial color," Journal of the Robotics Society of Japan, Vol. 31, No. 4, pp. 106-116, May 2013 (in Japanese).

(1) Eyebrow, Forehead

The eyebrows are molded from Septon®, produced by KURARAY Co., Ltd. Each eyebrow is controlled at four equally spaced points. The control points are placed on the inner side of the cover and are driven by magnets through the cover.

The facial color is controlled using a blue EL (electroluminescence) sheet placed on the forehead under the cover.

Fig. 2.7 Mechanism of eyebrow.

Fig. 2.8 Motion of eyebrow.

(2) Eyelid

The eyelids consist of independent upper and lower eyelids. On each side, the upper eyelid has 1 DoF for opening and closing and 1 DoF for rotation, while the lower eyelid has 1 DoF for opening and closing.

The rotating motion of the upper eyelid is performed by a wire-driven mechanism with an ultrasonic motor.

Fig. 2.9 Mechanism of eyelid.

Fig. 2.10 Motion of eyelid.

(3) Eye

The eyes of KOBIAN-RII have 3 DoFs (a common pitch axis for both eyes and an individual yaw axis for each eye), driven through a gimbal structure.

Fig. 2.11 Mechanism of eyes.

(4) Mouth

The lips are also molded from Septon®, produced by KURARAY Co., Ltd. The lip changes its shape by being pulled from five directions: one fixed point on the cheek cover, two active control points on the upper lip, and two active control points on the lower lip. The active control points can move in the Y and Z directions.

Fig. 2.12 Mechanism of mouth.

Fig. 2.13 Motion of mouth.

(5) Cartoon expression mechanism (sheet)

Black lines and black wrinkles are revealed by moving flexible sheets behind the white cover (Fig. 2.14). The forehead has a 7-layer structure, which includes an LED display, a light diffusion sheet, an LED display for cartoon marks, and an EL sheet for facial color expression. These layers are sandwiched between the outer and inner layers of the cover (Fig. 2.15).

Fig. 2.14 Mechanism of cartoon expression.

Fig. 2.15 Layer structure of forehead.

(6) Cartoon expression mechanisms (LED display)

Five LED displays are installed (Fig. 2.16). Thanks to flexible display boards, they can be deformed to follow the outer cover of the face. In addition, thanks to full-color LEDs of one of the world's smallest classes, cartoon marks can be displayed in high resolution (Fig. 2.17).

Fig. 2.16 Placement of LED displays.




Fig. 2.17 LED display.

2.2 Neck

The joint between the neck and the head has pitch and yaw axes, while the joint between the torso and the neck has pitch and roll axes. As in WE-4RII, the pitch axes are placed at the upper and lower ends of the neck, enabling the motion of sticking out and pulling in the neck while keeping the face orientation fixed.
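As a rough illustration of this coupled motion, the following Python sketch computes the pair of pitch angles that protrudes the head while the opposite pitch at the other end cancels the tilt, keeping the face level. The single-link geometry and the link length are assumptions for illustration, not KOBIAN-RIII's real dimensions.

    import math

    def neck_pitch_commands(protrusion, link_len=0.12):
        """Pitch angles [rad] that stick the head out by `protrusion` [m]
        while keeping the face level. Assumes one rigid neck link of
        length `link_len` [m] between the lower and upper pitch joints
        (hypothetical dimensions)."""
        # The lower pitch tilts the neck link forward; the head moves
        # forward by link_len * sin(theta_lower).
        ratio = max(-1.0, min(1.0, protrusion / link_len))
        theta_lower = math.asin(ratio)
        # The upper pitch cancels the tilt so the face stays level.
        theta_upper = -theta_lower
        return theta_lower, theta_upper

    print(neck_pitch_commands(0.06))  # stick the head out by 6 cm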

Fig. 2.18 Neck.

2.3 Arm

We developed arms that can reproduce the extremely fast, wide motions performed by human comedians. We compared the arm motion speed with that of KOBIAN, whose arms have standard humanoid performance; the results showed an average speed increase of at least 10 times for each joint. An evaluation experiment confirmed the effectiveness of high-speed arm motions for inducing human laughter. Because a higher power output was needed for the arms, we employed a mechanism that drives two motors in parallel, as well as a mechanism that uses flexible shafts for weight reduction. Each is detailed below.

Fig. 2.19 Arm.

Fig. 2.20 DoF configuration.

(1) High output with a mechanism for driving two motors in parallel

The shoulder roll joint, which supports the whole arm, requires a large torque. By driving two small motors in parallel, we realized a mechanism that saves space compared with driving the joint with a single large motor.

Fig. 2.21 Structure of the joint for driving two motors in parallel.

(2) Weight reduction of the arms using flexible shafts

There was a need to reduce the weight of the upper-arm yaw and elbow pitch joints. We placed the motors, the heaviest of the components that make up these joints, in the trunk, and adopted a mechanism that transmits power to the joints through flexible shafts, achieving much lighter arms. To suppress the loss of positioning accuracy caused by torsion of the flexible shafts, we placed the reduction gears on the output side of the shafts.
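A rough numerical sketch of why this placement helps (all values are hypothetical, not measured properties of the actual arms): with a reduction ratio N on the output side, the shaft carries only 1/N of the joint torque, and the resulting twist is divided by N again when reflected to the joint, so the positioning error shrinks by roughly a factor of N squared compared with motor-side reduction.

    # Illustrative comparison of joint positioning error caused by
    # flexible-shaft torsion (all values hypothetical).
    N = 100.0        # reduction ratio
    k_shaft = 0.5    # shaft torsional stiffness [N*m/rad]
    T_joint = 10.0   # torque required at the joint [N*m]

    # Reduction gear on the OUTPUT side of the shaft: the shaft carries
    # T_joint / N, and its twist is divided by N at the joint.
    err_output_side = (T_joint / N) / k_shaft / N   # -> 0.002 rad

    # Reduction gear on the MOTOR side: the shaft carries the full joint
    # torque and its twist appears 1:1 at the joint.
    err_motor_side = T_joint / k_shaft              # -> 20 rad (unusable)

    print(err_output_side, err_motor_side)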

Fig. 2.22 Structure of the joint using a flexible shaft.

Motion of the flexible shaft


2.4 System configuration

The system configuration of KOBIAN-RIII is presented in Fig. 2.23. Its control system is a hybrid of a centralized control system in the body, built around a main PC (CPU: Pentium M 1.6 GHz, RAM: 2 GB, OS: QNX Neutrino 6.3.0), and a distributed control system in the head, consisting of the main PC and 7 new motor controller units.

Fig. 2.23 System configuration

(1) Motor controller unit

We developed a motor controller unit composed of a controller module and motor driver modules. Exchanging the motor driver modules allows various components to be controlled and reduces the debugging time of the communication and control subsystems. The motor controller can control 4 brushed DC or ultrasonic motors and read analog sensors on 8 extra channels. Its dimensions are 46 × 30 × 18 mm, which is very small considering all of its functions. It is this small size of the motor controller units that allows KOBIAN-RIII's head to match the size of an adult female head.
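The three control modes in the table below (current, velocity, position) are typically realized as nested loops. The following Python sketch shows such a cascade with assumed proportional gains; it illustrates the idea only and is not taken from the unit's actual firmware.

    def cascade_step(pos_ref, pos, vel, cur,
                     kp_pos=20.0, kp_vel=5.0, kp_cur=2.0):
        """One cycle of a position -> velocity -> current cascade.
        Gains and interfaces are assumed for illustration; velocity or
        current mode is obtained by entering the cascade lower down."""
        vel_ref = kp_pos * (pos_ref - pos)   # position loop -> velocity ref
        cur_ref = kp_vel * (vel_ref - vel)   # velocity loop -> current ref
        voltage = kp_cur * (cur_ref - cur)   # current loop -> motor voltage
        return voltage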

Fig. 2.24 Motor controller unit.

Fig. 2.25 Schematic view of motor controller unit.

CPU: STM32F103VG
Motors: 4 DC / ultrasonic motors
Voltage [V]: 8-52
Control modes: current / velocity / position
Sensors [ch]: 4 encoders, 4 photosensors, 4 current sensors, 4 A/D converters
Size [mm]: 46 × 30 × 18
Weight [g]: 20

Related papers
[1] T. Otani, et al., “Development of Distributed Control System and Modularized Motor Controller for Expressive Robotic Head," Proceedings of the 19th CISM-IFToMM Symposium on Robot Design, Dynamics and Control (ROMANSY2012), pp. 183-190, Paris, France, June, 2012.


3. Research on KOBIAN

3.1 Research on laughter induction

We focus on laughter as a good example of human-robot interaction, because laughter is an involuntary behavior that can be detected physiologically and is said to have health benefits. So far, we have been working on inducing laughter in humans through whole-body motions of KOBIAN.

(1) Laughter induction with a biped humanoid robot (2013)

Human-robot interaction requires each partner to interact with an idea of the other's psychological state (theory of mind). The objective of this research is to influence the robot's human partner by making him or her laugh with KOBIAN. Experimental results show that KOBIAN can induce human laughter. In addition, a decrease in the "Depression-Dejection" and "Anger-Hostility" POMS scores of the subjects' state of mind was confirmed.

Fig. 3.1 A short skit of KOBIAN

Skit movies of KOBIAN.

 KOBIAN plays a skit

Related papers
[1] T. Kishi et al., "Bipedal humanoid robot that makes humans laugh with use of the method of comedy and affects their psychological state actively," Proceedings of the 2014 IEEE International Conference on Robotics and Automation, pp. 1965-1970, Hong Kong, May-June 2014.

(2) Effect of the speed and movable range of the arms on provoking human laughter (2014)

We examined the impact of the speed and movable range of the robot's arms on how funny its laughter-inducing motions are. The evaluation experiments showed that faster and wider arm motions increase the perceived funniness.

Movie of KOBIAN-RIII

KOBIAN-RIII can move its arms at high speed.

Related papers
[1] Shimomura et al., "A study on human-robot interaction through laughter (1st report: development of a biped humanoid robot upper arm with high speed and a wide movable range)," Proceedings of the 15th SICE System Integration Division Annual Conference, 3H2-3, Tokyo, December 2014 (in Japanese).

(3) Development of a 1-DoF robot hand that tickles by rubbing the side

People can be made to laugh not only through visual means but also through tactile means such as tickling. The mechanism by which tickling makes people laugh is still largely unexplained, so we developed a robot that rubs the side of the torso, providing a reproducible form of tickling. In evaluation experiments, we were able to make humans laugh using this tickling robot.

Fig. 3.2 Tickling robot.


Fig. 3.3 Mechanism of tickling robot.


Movie of tickling robot

Related papers
[1] Kishi et al., "Development of a 1-DoF robot hand that induces human laughter by tickling with a rubbing motion," Proceedings of the 32nd Annual Conference of the Robotics Society of Japan, 3E2-05, Fukuoka, September 2014 (in Japanese).

3.2 Research on emotion

(1) Emotional Expression

We use Ekman's six basic facial expressions for the robot's facial control, and have defined seven facial patterns: "Happiness", "Anger", "Disgust", "Fear", "Sadness", "Surprise", and "Neutral". KOBIAN-RII can express these with its whole body.

(a) Happiness (b) Fear
(c) Surprise (d) Sadness
(e) Anger (f) Disgust
(g) Neutral

Fig. 3.4 Emotion expressions of KOBIAN and facial expressions of KOBIAN-RII.

Movie of emotion expression


 KOBIAN-R expresses emotions using facial expressions.



KOBIAN expresses emotions using its whole body.


Related papers
[1] M. Zecca et al., "Design of the humanoid robot KOBIAN - preliminary analysis of facial and whole body emotion expression capabilities," Proceedings of the 8th IEEE-RAS International Conference on Humanoid Robots (Humanoids 2008), pp. 487-492, Daejeon, S. Korea, December, 2008.

(2) Facial expression generation system (2012)

There is a need to extend the rigid concept of patterns based on only six emotions. We developed a facial expression generation system for the humanoid robot KOBIAN-R, based on an extension of Plutchik's model of emotions and on polynomial classifiers. It can produce thousands of combinations of facial cues to represent expressions of composite emotions and communicative acts, including asymmetrical expressions.
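A minimal sketch of the idea, not the actual system: an emotion-intensity vector is expanded into polynomial features, and learned weights map those features to continuous facial cue commands. The dimensions and the random weights below are placeholders for illustration only.

    import numpy as np

    def poly_features(e):
        """Degree-2 polynomial expansion of an emotion-intensity vector."""
        e = np.asarray(e, dtype=float)
        cross = np.outer(e, e)[np.triu_indices(len(e))]
        return np.concatenate(([1.0], e, cross))

    n_emotions, n_cues = 8, 24   # e.g. Plutchik's 8 basic emotions
    n_feat = 1 + n_emotions + n_emotions * (n_emotions + 1) // 2
    rng = np.random.default_rng(0)
    W = rng.normal(size=(n_cues, n_feat))   # placeholder for learned weights

    def facial_cues(emotion_vec):
        """Map an emotion vector to facial cue commands
        (eyebrow height, eyelid opening, lip-corner pull, ...)."""
        return W @ poly_features(emotion_vec)

    # A composite emotion, e.g. a blend of joy and surprise:
    cues = facial_cues([0.8, 0.0, 0.0, 0.0, 0.0, 0.5, 0.0, 0.0])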

Fig. 3.5 Process of facial expression generation.

Related papers
[1] G. Trovato et al., “Generation of Humanoid Robot's Facial Expressions for Context-Aware Communication," International Journal of Humanoid Robotics, Vol. 10, Issue 01, 23 pages, March, 2013.
[2] G. Trovato et al., “Development of Facial Expressions Generator for Emotion Expressive Humanoid Robot," Proceedings of the 2012 IEEE-RAS International Conference on Humanoid Robots, pp. 303-308, Osaka, Japan, November, 2012.
[3] G. Trovato et al., “Evaluation Study on Asymmetrical Facial Expressions Generation for Humanoid Robot," Proceedings of the 1st International Conference on Innovative Engineering Systems, pp. 129-134, Egypt, December, 2012.

(3) Cultural differences in emotion recognition with cartoon marks (2012)

A cultural gap in the recognition of facial expressions was found. A further study on culture-dependent generation of facial expressions was conducted, introducing the use of Japanese comic symbols displayed on the face. Symbols displayed on the face provide an additional channel of communication: they can improve the recognition rate for Japanese viewers, but they may not always work for Westerners.

Fig. 3.6 Facial expressions using comic marks.

Related papers
[1] G. Trovato et al., “A Cross-Cultural Study on Generation of Culture Dependent Facial Expressions of Humanoid Social Robot," Proceedings of the 4th International Conference on Social Robotics, pp. 35-44, Chengdu, China, October, 2012.
[2] G. Trovato et al., “Cross-Cultural Perspectives on Emotion Expressive Humanoid Head: Recognition of Facial Expressions and Symbols," International Journal of Social Robotics, Vol. 5, Issue 4, pp. 515-527, November, 2013.

(4) Emotional walking (2013)

Emotions are expressed not only through facial expressions or voice (tone and speech); nonverbal behavior such as gait also greatly influences the perception of emotions. For this research, we captured motions with different emotions and emotional intensities from professional actors and non-actor subjects. We extracted emotional gait parameters from the captured data and modeled those parameters to drive our Emotion Mental Model. We assessed the result both in simulation and on the real robot. Subjects recognized the expressed emotions quite well and appreciated the robot's emotional expressiveness.
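A minimal sketch of one way such a parameterization can work (the parameters, offsets, and numbers below are illustrative assumptions, not the captured data or the actual model): neutral gait parameters are blended with per-emotion style offsets scaled by the desired intensity.

    NEUTRAL = {"step_length": 0.30, "step_time": 0.70, "torso_pitch_deg": 0.0}

    # Style offsets per unit intensity, loosely following the intuition that
    # happy gaits are faster and larger while sad gaits are slower and smaller.
    STYLE = {
        "happiness": {"step_length": +0.05, "step_time": -0.10, "torso_pitch_deg": -2.0},
        "sadness":   {"step_length": -0.08, "step_time": +0.15, "torso_pitch_deg": +8.0},
    }

    def emotional_gait(emotion, intensity):
        """Blend neutral gait parameters with an emotion style, intensity in [0, 1]."""
        offsets = STYLE[emotion]
        return {k: NEUTRAL[k] + intensity * offsets[k] for k in NEUTRAL}

    print(emotional_gait("sadness", 0.8))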

Happiness walking.

Sadness walking.

Fig. 3.7 Emotional walking.

Related papers
[1] M. Destephe et al., “Emotional Gait Generation Method based on Emotion Mental Model - Preliminary experiment with Happiness and Sadness -," Proceedings of the 10th International Conference on Ubiquitous Robots and Ambient Intelligence (URAI2013), pp. 86-89, Jeju, Korea, October, 2013.
[2] M. Destephe et al., “Conveying Emotion Intensity with Bio-inspired Expressive Walking -Experiments with Sadness and Happiness-," Proceedings of the 22nd IEEE International Symposium on Robot and Human Interactive Communication, pp. 161-166, Gyeongju, Korea, August, 2013.
[3] M. Destephe et al., “The Influences of Emotional Intensity for Happiness and Sadness on Walking," Proceedings of the 35th Annual International Conference of the IEEE Engineering in Medicine and Biology Society, pp. 7452-7455, Osaka, Japan, July, 2013.
[4] M. Destephe et al., "Improving the Human-Robot Interaction through Emotive Movements - A Special Case: Walking -," Proceedings of the 8th ACM/IEEE International Conference on Human-Robot Interaction, pp. 115-116, Tokyo, Japan, March, 2013.

(5) Differences in impression of cross-cultural greetings using a humanoid robot (2013)

How are robots perceived when they speak or use gestures that belong to a certain national culture? We studied a greeting interaction between Egyptians and Japanese with two versions of KOBIAN (one speaking Japanese, one speaking Arabic). Culture-dependent acceptance and discomfort were found: Egyptian subjects preferred the "Arabic robot" and reported discomfort when interacting with the "Japanese robot", and the reverse held for the Japanese subjects.

Japanese greeting

Egyptian greeting

Fig. 3.8 Greeting of KOBIAN

Related papers
[1] G. Trovato et al., "Cross-cultural study on human-robot greeting interaction: acceptance and discomfort by Egyptians and Japanese," Journal of Behavioral Robotics, 11 pages, October, 2013.
[2] G. Trovato et al., "Towards Culture-specific Robot Customisation: A Study on Greeting Interaction with Egyptians," Proceedings of the 22nd IEEE International Symposium on Robot and Human Interactive Communication, pp. 447-452, Gyeongju, Korea, August, 2013.

(6) Dynamic emotional expression based on a mental model (2013)

In this research we implemented a mental model in a walking humanoid robot, allowing the robot's emotional state to change dynamically in response to external stimuli; the emotional state affects the robot's decisions and behavior and is expressed with both facial and whole-body patterns. To evaluate the importance of the proposed system for human-robot interaction and communication, we conducted a survey in which videos of the robot's behaviors were shown to subjects. The results show that the integration of dynamic emotion expression and locomotion makes the humanoid robot more appealing to humans: it is perceived as more "favorable" and "useful", and less "robot-like".
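A minimal sketch of such dynamics under simple assumptions (a linear decay-plus-stimulus update, which is not the implemented mental model): the emotional state decays toward neutral, external stimuli push individual components, and the dominant component selects the expressed behavior.

    import numpy as np

    EMOTIONS = ["happiness", "surprise", "sadness", "anger"]

    def update_emotion(state, stimulus, decay=0.9, gain=0.5):
        """One update step of the emotion vector; values clipped to [0, 1]."""
        state = decay * np.asarray(state) + gain * np.asarray(stimulus)
        return np.clip(state, 0.0, 1.0)

    state = np.zeros(len(EMOTIONS))
    state = update_emotion(state, [0, 1, 0, 0])   # a sudden visual stimulus
    print(EMOTIONS[int(np.argmax(state))])        # -> "surprise"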

Front

Overview

Fig. 3.9 Emotional expression during visual tracking (surprise).

Movies of emotion expression during visual tracking


 KOBIAN expresses emotion during visual tracking.


Related papers
[1] T. Kishi et al., "Impression Survey of the Emotion Expression Humanoid Robot with Mental Model based Dynamic Emotions," Proceedings of the 2013 IEEE International Conference on Robotics and Automation, pp. 1655-1660, Karlsruhe, Germany, May, 2013.

3.3 Research on visual sensing

(1) Visual tracking based on the vestibulo-ocular reflex with a biped walking humanoid robot (2010)

Personal robots will become more and more popular in the future and will be required to actively collaborate and live with their human partners. These personal robots must recognize changing environments and take appropriate actions, as humans do. Object tracking is a fundamental function from the viewpoint of environmental sensing and of reflexive reaction to the environment. We developed an object tracking motion algorithm that uses the upper body, integrated it with an online walking pattern generator to obtain an object tracking biped locomotion algorithm, and confirmed its effectiveness in an experimental evaluation.
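As a minimal sketch of the vestibulo-ocular reflex idea (the gains and interfaces are assumptions, not the paper's implementation): the eye command combines smooth pursuit of the visual target error with a term that counter-rotates against the measured head motion, stabilizing the gaze while the robot walks.

    def eye_rate_command(target_err_rad, head_rate_rad_s, k_track=2.0, k_vor=1.0):
        """Eye angular-velocity command [rad/s]: visual pursuit of the target
        error plus VOR compensation of the head rotation measured, e.g.,
        by a gyroscope (all gains hypothetical)."""
        return k_track * target_err_rad - k_vor * head_rate_rad_s

    # During walking, head sway of 0.3 rad/s is cancelled even when the
    # target is centered (zero visual error):
    print(eye_rate_command(0.0, 0.3))   # -> -0.3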

Fig. 3.10 Visual tracking.

Related papers
[1] N. Endo et al., “Integration of Emotion Expression and Visual Tracking Locomotion based on Vestibulo-Ocular Reflex," Proceedings of the 19th IEEE International Symposium on Robot and Human Interactive Communication, pp. 593-598, Viareggio, Italy, September, 2010.

(2) Method for selecting a comfortable walking path (2013)

For a robot to navigate unknown environments, it must actively scan its surroundings with its sensors. This is important both for minimizing perception uncertainty and for reducing the probability of collisions. We proposed a gazing strategy whose objective is to minimize the uncertainty of the 3D reconstruction of the environment along the robot's planned trajectory. We also proposed a new 3D reconstruction approach for stereo camera systems that better integrates measurements over time, using a novel formulation of occupancy grids that incorporates all the information retrieved from stereo.
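As background, a minimal sketch of the standard log-odds occupancy update that such a formulation builds on; in the proposed approach the per-cell probability would be derived from the whole stereo cost curve rather than a single triangulated depth, and the simple interface below is an assumption for illustration.

    import math

    def logodds_update(cell_logodds, p_occ):
        """Fuse one occupancy-probability observation into a grid cell."""
        return cell_logodds + math.log(p_occ / (1.0 - p_occ))

    l = 0.0                        # prior log-odds: unknown (p = 0.5)
    for p in (0.7, 0.65, 0.8):     # stereo-derived observations over time
        l = logodds_update(l, p)
    p_cell = 1.0 - 1.0 / (1.0 + math.exp(l))
    print(round(p_cell, 3))        # confidence grows with consistent evidence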

Fig. 3.11 Method for selecting a comfortable walking path.

Related papers
[1] M. Brandao et al., “Active Gaze Strategy for Reducing Map Uncertainty along a Path," Proceedings of the 3rd IFToMM International Symposium on Robotics and Mechatronics (ISRM 2013), pp. 455-466, Singapore, October, 2013.
[2] M. Brandao et al., “Integrating the whole cost-curve of stereo into occupancy grids," Proceedings of the 2013 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS 2013), pp. 4681-4686, Tokyo, Japan, November, 2013.

4. Previous Studies

KOBIAN/HABIAN (2009)
  • Whole body emotional expression robot KOBIAN was developed.
  • Emotional expression recognition rates of WE-4RII and KOBIAN were compared.
  • Emotional expression recognition rates with whole body expression were checked.
  • Hands with soft material, WSH-1 and WSH-1R, were developed.
  • The emotional expression robot on wheels, HABIAN, was developed,
    and emotional expression recognition rates were compared between walking and wheeled locomotion.
  • Details

All activity

5. Acknowledgment

This work was conducted under NEDO's strategic project for developing advanced robotics technology (research project on service robots (2): communication RT system for the elderly).

We gratefully acknowledge the support for this work from the Humanoid Robotics Institute (HRI) at Waseda University, RoboCasa, the Global COE Program (Global Robot Academia), and the RoboSoM project of the European FP7 program (ICT-2009-4).

Finally, we also thank SolidWorks Japan K.K., tmsuk Co., Ltd., DYDEN Corporation, NikkiFron Co., Kuraray Co., Ltd., Chukoh Chemical Industries, STMicroelectronics Co., and the Waseda Research Institute for Science and Engineering.

Humanoid Robotics Institute, Waseda University
WABOT-HOUSE Laboratory, Waseda University
SolidWorks Japan K.K.
The New Energy and Industrial Technology Development Organization (NEDO)
tmsuk Co., Ltd.
DYDEN Corporation
NikkiFron Co.
Kuraray Co., Ltd.
Chukoh Chemical Industries
STMicroelectronics Co.
Last Update: 2015-6-24
Copyright(C) 2013 Takanishi Laboratory
All Rights Reserved.