Whole Body Emotion Expression
Bipedal Humanoid Robot
KOBIAN-RIV

  1. Research Objective
  2. Hardware Overview
  3. Research on human-robot interaction with KOBIAN
  4. Our Previous Hardware
  5. Acknowledgements

1. Research Objective

This research aims to provide RT (Robot Technology) for ADL (Activities of Daily Living) support. In particular, the main target is to evaluate the effect of the embodiment and expressiveness of a whole-body humanoid robot during interaction with humans.

Fig. 1.1 shows the application of RT services supporting ADL in a human-living environment. Users generally do not have professional knowledge about robots, so intuitive interaction with the robots is necessary. Therefore, we propose a whole-body humanoid robot that enables such intuitive interaction.

Fig. 1.1 RT supporting ADL

The WE series of robots with facial expression capability has been developed in our laboratory since 1995. The humanoid robot WE-4RII (Fig. 1.2), with an upper body and facial emotion expression capability, was completed by 2004. In addition, the WABIAN series has been developed since 1996. By 2007, WABIAN-2R (Fig. 1.3) was developed; its limb dimensions and proportions are based on the average adult Japanese female, and it can simulate human movements such as walking and dancing.

In 2007, KOBIAN was developed by integrating WE's upper body and WABIAN's lower body. In 2011, the development of small motor controller modules enabled an increase in the number of degrees of freedom (DoFs) in the head: KOBIAN-R, with an expressive robotic head that has 24 DoFs, was developed in cooperation with cartoonists. In 2013, KOBIAN-RII was developed with head mechanisms enabling facial expressions with cartoon marks. In 2014, KOBIAN-RIII was developed with new arm mechanisms that add 2 DoFs at the shoulder base and enable fast, wide motion. In 2015, KOBIAN-RIV was developed with new wrist mechanisms capable of fast, wide motion.

Fig. 1.2 WE-4RII

Fig. 1.3 WABIAN-2R

2. Hardware Overview

In 2015, the whole-body emotion expression bipedal humanoid robot KOBIAN-RIV was developed. It has 64 DoFs and four sensors that serve as sense organs for the robot. It also has a PC and batteries mounted in its body, and a display on its chest to show the robot's internal state. The total weight of KOBIAN-RIV is 63 kg.

Fig. 2.1 KOBIAN-RIV (2015)

Fig. 2.2 DoF configuration.

 Degrees of Freedom (DoFs)
   Head: 27
   Neck: 4
   Waist: 2
   Trunk: 1
   Arms: 6 × 2
   Wrists: 3 × 2
   Legs: 6 × 2
   Total: 64
 Sensors
   CMOS camera
   Condenser microphone
   Force sensor
   Gas sensor
   6-axis force/torque sensors
   Photo sensors
   Magnetic encoders
 Display devices
   Display
 Batteries
   Li-ion battery
 Weight
   63 kg
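As a quick consistency check of the DoF budget listed above (simple arithmetic, with the per-side counts doubled as in the table):

    # DoF budget of KOBIAN-RIV, as listed in the table above.
    dofs = {"head": 27, "neck": 4, "waist": 2, "trunk": 1,
            "arms": 6 * 2, "wrists": 3 * 2, "legs": 6 * 2}
    assert sum(dofs.values()) == 64  # matches the stated total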

Related papers
[1] N. Endo and A. Takanishi, "Development of Whole-body Emotional Expression Humanoid Robot for ADL-assistive RT services," Journal of Robotics and Mechatronics, Vol. 23, No. 6, Fuji Technology Press, December 2011.

2.1 Head

The head is equipped with 27 DoFs (8 for the eyebrows, 8 for the eyes and eyelids, 7 for the lips, and 1 for the jaw) to perform facial expressions. The head is about the size of an adult Japanese female's, and all the mechanisms needed to move the facial parts (eyebrows, eyelids, etc.) are placed inside the head. To express emotions using cartoon marks, KOBIAN-RII uses five full-color LED displays, which allow it to display cartoon marks similar to those used in Japanese comics. These marks were shown to increase the recognition rate of the robot's emotions by Japanese users.

Fig. 2.3 Isometric projection.

Fig. 2.4 DoF configuration.

Fig. 2.5 Mechanism location.

Fig. 2.6 Mechanisms for cartoon mark expression.

Related papers
[1] T. Kishi et al., "Development of Expressive Robotic Head for Bipedal Humanoid Robot," Proceedings of the 2012 IEEE/RSJ International Conference on Intelligent Robots and Systems, pp. 4584-4589, Vilamoura, Algarve, Portugal, October 2012.
[2] T. Kishi et al., "Development of a Comic Mark Based Expressive Robotic Head Adapted to Japanese Cultural Background," Proceedings of the 2014 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS 2014), pp. 2608-2613, Chicago, USA, September 2014.
[3] Kishi et al., "Development of a Biped Humanoid Robot Head Capable of Rich Facial Expressions through a Wide Movable Range of Facial Parts and Facial Color," Journal of the Robotics Society of Japan, Vol. 31, No. 4, pp. 106-116, May 2013. (in Japanese)

(1) Eyebrow, Forehead

The eyebrows are molded from Septon® produced by KURARAY CO., LTD. Each eyebrow is controlled at four equally spaced points. The control points are placed on the inner side of the cover and are driven by magnets through the cover.

The facial color is controlled using a blue EL (electroluminescence) sheet placed on the forehead under the cover.

Fig. 2.7 Mechanism of eyebrow.

Fig. 2.8 Motion of eyebrow.

(2) Eyelid

The eyelid consists of independent upper and lower eyelids. The upper eyelid has 1 DoF for opening and closing on each side and 1 DoF for rotation, while the lower eyelid also has 1 DoF for opening and closing on each side.

The rotating motion of the upper eyelid is achieved by a wire-driven mechanism with an ultrasonic motor.

Fig. 2.9 Mechanism of eyelid.

Fig. 2.10 Motion of eyelid.

(3) Eye

The eyes of KOBIAN-RII have 3 DoFs (a common pitch axis and an individual yaw axis for each eye), driven by a gimbal structure.

Fig. 2.11 Mechanism of eyes.

(4) Mouth

The lips are also molded from Septon® produced by KURARAY CO., LTD. The lip shape is changed by pulling from five directions: one fixed point on the cheek cover, two active control points on the upper lip, and two on the lower lip. The active control points can move in the Y and Z directions.

Fig. 2.12 Mechanism of mouth.

Fig. 2.13 Motion of mouth.

(5) Cartoon mark expression mechanism (sheet)

Black lines and wrinkles are exposed by moving flexible sheets behind the white cover (Fig. 2.14). The forehead is a 7-layer structure that includes an LED display for cartoon marks, a light diffusion sheet, and an EL sheet for complexion expression; these layers are sandwiched between the outer and inner layers of the cover (Fig. 2.15).

Fig. 2.14 Mechanism of cartoon expression.

Fig. 2.15 Layer structure of forehead.

(6) Cartoon mark expression mechanism (LED display)

Five LED displays are installed (Fig. 2.16). Because the display boards are flexible, they can be deformed to follow the outer cover of the face. In addition, thanks to full-color LEDs among the smallest in the world, cartoon marks can be displayed at high resolution (Fig. 2.17).

Fig. 2.16 Placement of LED displays.




Fig. 2.17 LED display.

2.2 Neck

The joint between the neck and the head has pitch and yaw axes, while the joint between the torso and the neck has pitch and roll axes. As in WE-4RII, pitch axes are placed at the upper and lower ends of the neck so that the robot can stick its neck out and pull it in while keeping its face orientation fixed.

Fig. 2.18 Neck.

2.3 Arm

KOBIAN-RIV has a "Shoulder Base Roll" joint that moves the shoulder up and down and a "Shoulder Base Yaw" joint that moves the shoulder back and forth. In total, a 9-DoF arm (including the 3-DoF wrist) with fast and wide motion capability was developed (Fig. 2.19).

Fig. 2.19 Arm of KOBIAN-RIV

(1) Wrist driven by a parallel link mechanism with rotary joints

A wrist joint driven by a parallel link mechanism with rotary joints was developed. Parallel link mechanisms have higher output than serial link mechanisms, and a parallel link mechanism with rotary joints has a wider movable range than parallel link mechanisms with other joint types. With this mechanism, the CoG (Center of Gravity) of the forearm can be placed near the elbow, which reduces the power required of the upper arm, as the sketch below illustrates.
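As a rough illustration of the CoG benefit (a minimal sketch with made-up mass and distances, not the authors' analysis), the static torque the upper-arm actuators must hold grows linearly with the distance of the forearm CoG from the elbow:

    # Static torque needed at the elbow to hold the forearm horizontal:
    # tau = m * g * d, where d is the distance of the forearm CoG from
    # the elbow. All values below are assumed, for illustration only.
    G = 9.81         # gravitational acceleration [m/s^2]
    M_FOREARM = 1.5  # forearm + wrist mass [kg] (assumed)

    def holding_torque(cog_dist_m: float) -> float:
        """Torque [Nm] to hold the forearm horizontal."""
        return M_FOREARM * G * cog_dist_m

    print(holding_torque(0.20))  # actuators at the far end: ~2.9 Nm
    print(holding_torque(0.08))  # CoG pulled toward the elbow: ~1.2 Nm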

Fig. 2.20 Parallel link mechanism with rotary joints

Fig. 2.21 Wrist mechanism

Wrist motion

The wrist joint can move fast.

2.4 System configuration

The system configuration of KOBIAN-RIV is presented in Fig. 2.22. KOBIAN-RIV's control system is a hybrid: a centralized control system drives the legs and most of the body and arm joints, while a distributed control system drives the head, the wrists, and several arm joints. A main PC (CPU: Pentium M 1.6 GHz, RAM: 2 GB, OS: QNX) is mounted on its back. The distributed control system is configured with motor controller units developed in our laboratory: eight units drive the head and two units drive the wrist joints.

Fig. 2.22 System configuration

(1) Motor controller unit

We developed a motor controller unit composed of a controller module and motor driver modules. Exchanging the motor driver modules allows various components to be controlled and shortens the debugging time of the communication and control subsystems. Each unit can control 4 brushed DC or ultrasonic motors and read analog sensors on 8 extra channels. Its dimensions are 46 x 30 x 18 mm, which is very small considering all of its functions. Thanks to the small size of the motor controller units, the head could be kept to the size of an adult female's head.
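A minimal sketch of how a unit offering current/velocity/position modes might cascade its control loops (hypothetical code; the unit's actual firmware is not published here):

    # Hypothetical cascaded controller: position -> velocity -> current.
    # The gains and the command scaling are illustrative assumptions.
    class JointController:
        def __init__(self, kp_pos=8.0, kp_vel=2.0, kp_cur=1.5):
            self.kp_pos, self.kp_vel, self.kp_cur = kp_pos, kp_vel, kp_cur
            self.mode = "position"  # "current" | "velocity" | "position"

        def step(self, target, meas_pos, meas_vel, meas_cur):
            """One control tick; returns a PWM-like motor command."""
            if self.mode == "current":   # current mode bypasses outer loops
                return self.kp_cur * (target - meas_cur)
            if self.mode == "position":  # position error -> velocity setpoint
                vel_sp = self.kp_pos * (target - meas_pos)
            else:                        # velocity mode
                vel_sp = target
            cur_sp = self.kp_vel * (vel_sp - meas_vel)  # velocity -> current
            return self.kp_cur * (cur_sp - meas_cur)    # current -> command

    jc = JointController()
    cmd = jc.step(target=1.0, meas_pos=0.8, meas_vel=0.1, meas_cur=0.0)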

Fig. 2.23 Motor controller unit.

Fig. 2.24 Schematic view of motor controller unit.

 CPU: STM32F103VG
 Motors: 4 DC / ultrasonic motors
 Voltage [V]: 8-52
 Control modes: Current / velocity / position
 Sensors [ch]: 4 encoders, 4 photosensors, 4 current sensors, 4 A/D converters
 Size [mm]: 46 x 30 x 18
 Weight [g]: 20

Related papers
[1] T. Otani, et al., “Development of Distributed Control System and Modularized Motor Controller for Expressive Robotic Head," Proceedings of the 19th CISM-IFToMM Symposium on Robot Design, Dynamics and Control (ROMANSY2012), pp. 183-190, Paris, France, June, 2012.


3. Research on human-robot interaction with KOBIAN

3.1 Research on making humans laugh

We focus on laughter as a good example of human-robot interaction, because laughter is an involuntary behavior that can be detected physiologically and is said to have health benefits. So far, we have been working on inducing laughter in humans through whole-body motions of KOBIAN.

(1) Laughter induction of biped humanoid robot (2013)

Human-robot interaction requires each partner to interact with an idea of the other's psychological state (theory of mind). The objective of this research is to influence the robot's human partner by making him or her laugh with KOBIAN. Experimental results show that KOBIAN can make humans laugh. In addition, a decrease in the subjects' "Depression-Dejection" and "Anger-Hostility" POMS scores was confirmed.

Fig. 3.1 A short skit of KOBIAN

Skit movies of KOBIAN.

 KOBIAN plays a skit

Related papers
[1] Kishi et al., "Bipedal humanoid robot that makes humans laugh with use of the method of comedy and affects their psychological state actively," Proceedings of the 2014 IEEE International Conference on Robotics and Automation, pp. 1965-1970, Hong Kong, May-June 2014.

(2) Effect of speed and movable range of arm to provoke human laugh (2014)

We examined the impact of the speed and range of motion of the robot's arm on the funniness of laughter-inducing motions. The evaluation experiments showed that faster and wider arm motion increases the funniness.

Motion of KOBIAN-RIII

KOBIAN-RIII can move its arms at high speed.

Related papers
[1] Kishi et al., "Development of a Humorous Humanoid Robot Capable of Quick-and-Wide Arm Motion," IEEE Robotics and Automation Letters, Vol. 1, Issue 2, pp. 1081-1088, July 2016.

(3) The relationship between motion speed and funniness (2015)

We examined the relationship between the speed of the robot's motion in skits and subjective funniness. An experiment compared the subjective funniness of comedy skits performed by the robot (originally performed by human comedians) played back at different speeds: ×1/3, ×1/2, ×1 (normal speed), ×2, and ×3. The results show that the skits were funniest when played back at twice the speed of the original skits performed by human comedians. KOBIAN-RIV can actually move at this speed.

Motion of KOBIAN-RIV

We achieved fast wrist motion

The robot motions streamed in different speeds

Motion that was too fast was not funny.

Related papers
[1] Yanagino et al., "Research on Human-Robot Interaction through Laughter (2nd Report: Development of Arms Capable of High-Speed Motion and the Relationship between Motion Speed and Funniness)," Proceedings of the 33rd Annual Conference of the Robotics Society of Japan, 3J1-06, Tokyo, September 2015. (in Japanese)

(4) Robots' behavior based on temperamental traits (2015)

Four characters were defined based on human temperamental traits: "Phlegmatic," "Sanguine," "Choleric," and "Melancholic." A "What's in the box" skit was selected for performing them. The behavioral features of these four characters were extracted from performances by actors; posture, speed and range of motion, and facial expressions turned out to be especially essential for portraying the characters. Robot performances based on the temperamental traits were created considering these features. The evaluation of funniness shows that contradictory performances, in which the "Phlegmatic" and "Sanguine" characters react positively to negative situations, give the subjects a funny impression.

"Melancholic" personality

“Choleric” personality

“Phlegmatic” personality

“Sanguine” personality

Related papers
[1] Kishi et al., "Research on Human-Robot Interaction through Laughter (3rd Report: Robot Character Expression Based on Temperament)," Proceedings of the 33rd Annual Conference of the Robotics Society of Japan, 3J1-07, Tokyo, September 2015. (in Japanese)

(5) Automatic generation of exaggerated motion trajectory (2016)

Referring to a method proposed in the CG animation field, we developed a novel algorithm that automatically generates an exaggerated trajectory from a reference trajectory. The algorithm adds "take back" and "follow through" phases to the reference trajectory. Experimental evaluation shows that trajectories generated by this algorithm look faster, wider, and significantly funnier than the reference trajectories.
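A minimal sketch of this idea for a 1-D joint trajectory (an illustrative re-implementation, not the published algorithm; the amplitudes and phase durations are assumed values):

    import numpy as np

    def exaggerate(traj, dt, takeback=0.15, follow=0.25, phase_s=0.1):
        """Add 'take back' (a pull opposite to the motion before it
        starts) and 'follow through' (overshoot past the end point,
        then settle) to a 1-D reference trajectory."""
        start, end = traj[0], traj[-1]
        disp = end - start
        n = max(int(phase_s / dt), 1)
        back, over = start - takeback * disp, end + follow * disp
        pre = np.concatenate([np.linspace(start, back, n),   # pull back...
                              np.linspace(back, start, n)])  # ...and return
        post = np.concatenate([np.linspace(end, over, n),    # overshoot...
                               np.linspace(over, end, n)])   # ...and settle
        return np.concatenate([pre, traj, post])

    # Usage: a 0.5 s reach from 0 to 1 rad sampled at 100 Hz.
    ref = np.linspace(0.0, 1.0, 50)
    exaggerated = exaggerate(ref, dt=0.01)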


Fig. 3.2 Reference hand trajectory without "exaggeration"

Fig. 3.3 Exaggerated hand trajectory generated by proposed method

Comparison of the impression between exaggerated and reference motion.

Compared with the reference trajectory (left), the exaggerated trajectory (right) looks more dynamic.

Related papers
[1] Kishi et al., "Research on Human-Robot Interaction through Laughter (7th Report: Generation of Exaggerated Robot Reactions to Input Stimuli)," Proceedings of the 34th Annual Conference of the Robotics Society of Japan, 1W2-05, Yamagata, September 2016. (in Japanese)

(6) Expression of the robot's real feelings with the chest display (2017)

The real feelings of a human do not always agree with his or her facial expression. We expressed the robot's real feelings on the display in its chest. The style of expression was inspired by the shapes of the dialogue balloons and backgrounds used to convey emotions and feelings in Japanese comics and cartoons. Based on Russell's emotional model, the real feeling was defined by "activation" and "pleasantness": activation was expressed by the speed of the objects moving around the display, and pleasantness by their shapes and colors.
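A minimal sketch of such a mapping (the parameter names, value ranges, and colors below are hypothetical; the actual rendering used in the study is not reproduced here):

    def display_params(activation: float, pleasantness: float) -> dict:
        """Map a point on Russell's model (both axes in [-1, 1]) to
        illustrative rendering parameters for the chest display."""
        speed = 0.2 + 0.8 * (activation + 1) / 2  # faster = more activated
        if pleasantness >= 0:
            shape, rgb = "rounded", (255, 200, 60)  # warm, soft shapes
        else:
            shape, rgb = "jagged", (60, 60, 200)    # cold, sharp shapes
        return {"speed": speed, "shape": shape, "rgb": rgb}

    print(display_params(0.8, 0.7))    # excited and pleasant
    print(display_params(-0.6, -0.8))  # deactivated and unpleasant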

Emotional expression with the display and combinations of the display and the facial expressions

When the emotional expression on the display is opposite to the facial expression, the robot gives a fresh impression.

Related papers
[1] Kato et al., "Research on Human-Robot Interaction through Laughter (8th Report: Expression of the Robot's Real Feelings through a Chest Display)," Proceedings of the 35th Annual Conference of the Robotics Society of Japan, 1I2-05, Saitama, September 2017. (in Japanese)

(7) Development of a 1-DoF robot hand that rubs the side (2016)

People can be made to laugh not only visually but also tactilely, for example by tickling. The mechanism by which tickling makes people laugh still has many unexplained aspects, so we developed a robot that rubs the side of the torso, a reproducible form of tickling. In the evaluation experiments, we were able to make humans laugh with this tickling robot.



Fig. 3.4 Overview of robotic hand.

Fig. 3.5 Mechanism of robotic hand

Motion of robotic hand

Related papers
[1] Kishi et al., "One DoF Robotic Hand That Makes Human Laugh by Tickling Through Rubbing Underarm," Proceedings of the 2016 IEEE/RSJ International Conference on Intelligent Robots and Systems, pp. 404-409, Daejeon, Korea, October 2016.

(8) Development of a rib-tickling robot (2016)

To make humans laugh more efficiently by robot tickling, it is important to diversify the types of tickling stimuli. We therefore developed a robot that tickles humans by stroking the surface of the ribs. The output force at the fingertip can be adjusted by changing the initial length of the spring installed in the finger. In addition, magnets are installed in the finger; when an unexpectedly large force acts on the fingertip, the magnets detach and the force is not transmitted, which works as a safety mechanism.
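The two stated behaviors reduce to elementary force relations (a sketch with an assumed spring constant and magnet holding force, not the measured values):

    # Fingertip force model of the rib-tickling finger (illustrative).
    K_SPRING = 300.0  # spring constant [N/m] (assumed)
    F_MAGNET = 8.0    # magnetic holding force [N] (assumed)

    def fingertip_force(preload_m: float, deflection_m: float) -> float:
        """Force transmitted to the skin. The preload (initial spring
        compression) sets the baseline output; the magnets act as a
        mechanical fuse that releases above F_MAGNET."""
        f = K_SPRING * (preload_m + deflection_m)
        return f if f < F_MAGNET else 0.0  # magnets detach: nothing passes

    print(fingertip_force(0.010, 0.005))  # 4.5 N -> transmitted
    print(fingertip_force(0.010, 0.030))  # 12 N -> magnets detach, 0.0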


Fig. 3.6 Overview of rib-tickling robot

Fig. 3.7 Mechanism of rib-tickling robot

Motion of rib-tickling robot

Related papers
[1] Kishi et al., "Research on Human-Robot Interaction through Laughter (4th Report: Development of a Tickling Robot that Strokes the Ribs)," Proceedings of the 22nd Japan IFToMM Congress Symposium, pp. 43-50, Tokyo, September 2016. (in Japanese)

(9) Development of Sole-tickling robot (2016)

To make humans laugh more efficiently by robot tickling, it is important to diversify the tickled positions on the body. We therefore developed a robot that tickles humans by stroking the surface of the sole. The robot can tickle the whole sole by moving its roller forward and backward, and it can press on the sole with a constant force even when the orientation of the sole changes, because the roller is pushed against the sole by constant-force springs.


Fig. 3.8 Overview of sole-tickling robot

Fig. 3.9 Mechanism of sole-tickling robot

Motion of sole-tickling robot

Related papers
[1] Kishi et al., "Research on Human-Robot Interaction through Laughter (5th Report: Development of a Tickling Robot that Strokes the Sole of the Foot)," LIFE2016, pp. 111-114, Miyagi, September 2016. (in Japanese)

3.2 Research on emotion

(1) Emotional Expression

We use Ekman's six basic facial expressions for the robot's facial control and have defined seven facial patterns: "Happiness," "Anger," "Disgust," "Fear," "Sadness," "Surprise," and "Neutral." KOBIAN-RII can express these emotions with its whole body.
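As a minimal sketch of how such fixed patterns might be stored and looked up (the joint names and values below are hypothetical, not the robot's actual pattern data):

    # Each pattern maps actuator groups to normalized targets in [-1, 1].
    PATTERNS = {
        "happiness": {"brow_raise": 0.4, "lip_corner": 0.9, "eyelid_open": 0.6},
        "anger":     {"brow_raise": -0.8, "lip_corner": -0.5, "eyelid_open": 0.8},
        "sadness":   {"brow_raise": -0.3, "lip_corner": -0.7, "eyelid_open": 0.2},
        "neutral":   {"brow_raise": 0.0, "lip_corner": 0.0, "eyelid_open": 0.5},
        # "disgust", "fear", and "surprise" would be defined the same way.
    }

    def expression_targets(name: str) -> dict:
        """Return the actuator targets for a named expression pattern."""
        return PATTERNS.get(name, PATTERNS["neutral"])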

(a) Happiness (b) Fear
(c) Surprise (d) Sadness
(e) Anger (f) Disgust
(g) Neutral

Fig. 3.10 Whole-body emotion expression of KOBIAN and facial expressions of KOBIAN-RII.

Movie of emotion expression


KOBIAN-R expresses emotions using facial expressions.



KOBIAN expresses emotions using its whole body.


Related papers
[1] M. Zecca et al., “Design of the humanoid robot KOBIAN - preliminary analysis of facial and whole body emotion expression capabilities-” Proceedings of the 8th IEEE-RAS International Conference on Humanoid Robots (Humanoids 2008), pp. 487-492, Daejeon, S. Korea, December, 2008.

(2) Facial expression generation system (2012)

There is a need to extend the rigid concept of patterns based on only six emotions. A facial expression generation system for the humanoid robot KOBIAN-R, based on an extension of Plutchik's model of emotions and on polynomial classifiers, was developed. It can produce thousands of combinations of facial cues to represent expressions of composite emotions and communication acts, including asymmetrical expressions.
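A schematic sketch of that data flow (illustrative only; the real features, classifier weights, and cue set are described in the papers below, and the weights here are random stand-ins):

    import numpy as np

    def poly_features(e: np.ndarray) -> np.ndarray:
        """Degree-2 polynomial expansion of an emotion-intensity vector,
        e.g. e = [joy, trust, fear, surprise] from Plutchik's model."""
        cross = np.outer(e, e)[np.triu_indices(len(e))]
        return np.concatenate([[1.0], e, cross])

    # A trained polynomial classifier would supply real weights for each
    # facial cue; random stand-ins just show the pipeline shape.
    rng = np.random.default_rng(0)
    e = np.array([0.7, 0.2, 0.0, 0.4])             # composite emotion input
    w_eyebrow = rng.normal(size=poly_features(e).size)
    eyebrow_cue = float(w_eyebrow @ poly_features(e))  # one cue's activation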

Fig. 3.11 Process of facial expression generation.

Related papers
[1] G. Trovato et al., “Generation of Humanoid Robot's Facial Expressions for Context-Aware Communication," International Journal of Humanoid Robotics, Vol. 10, Issue 01, 23 pages, March, 2013.
[2] G. Trovato et al., “Development of Facial Expressions Generator for Emotion Expressive Humanoid Robot," Proceedings of the 2012 IEEE-RAS International Conference on Humanoid Robots, pp. 303-308, Osaka, Japan, November, 2012.
[3] G. Trovato et al., “Evaluation Study on Asymmetrical Facial Expressions Generation for Humanoid Robot," Proceedings of the 1st International Conference on Innovative Engineering Systems, pp. 129-134, Egypt, December, 2012.

(3) Cultural differences in emotion recognition with cartoon marks (2012)

A cultural gap in the recognition of facial expressions was found. A further study on culture-dependent generation of facial expressions was conducted, introducing the novel use of Japanese comic symbols displayed on the face. Symbols displayed on the face are an additional channel of communication: they can enhance the recognition rate for Japanese subjects, but they may not always work for Westerners.

Fig. 3.12 Facial expressions with cartoon marks

Related papers
[1] G. Trovato et al., “A Cross-Cultural Study on Generation of Culture Dependent Facial Expressions of Humanoid Social Robot," Proceedings of the 4th International Conference on Social Robotics, pp. 35-44, Chengdu, China, October, 2012.
[2] G. Trovato et al., “Cross-Cultural Perspectives on Emotion Expressive Humanoid Head: Recognition of Facial Expressions and Symbols," International Journal of Social Robotics, Vol. 5, Issue 4, pp. 515-527, November, 2013.

(4) Emotional walking (2013)

Emotions are expressed not only through facial expressions and voice (tone and speech); nonverbal behavior such as gait also greatly influences the perception of emotions. For this research, we captured motions with different emotions and emotional intensities from professional actors and non-actor subjects. We extracted emotional gait parameters from the captured data and modeled those parameters in our Emotion Mental Model. We assessed the result both in simulation and on the real robot. Subjects recognized the expressed emotions quite well and appreciated the emotional expressiveness of the robot.
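A minimal sketch of gait parameterization by emotion and intensity (the modifier values are illustrative assumptions, not the parameters extracted from the captured data):

    # Emotional gait modifiers relative to a neutral gait (assumed values).
    GAIT_MODIFIERS = {
        # (step_length_scale, cadence_scale, torso_pitch_deg)
        "happiness": (1.15, 1.10, -2.0),  # longer, quicker steps, upright
        "sadness":   (0.80, 0.85, +8.0),  # shorter, slower steps, slouched
    }

    def emotional_gait(emotion, intensity, step_m=0.30, cadence_hz=1.0):
        """Blend a neutral gait toward the emotion's extreme; intensity in [0, 1]."""
        ls, cs, pitch = GAIT_MODIFIERS[emotion]
        return {"step_length": step_m * (1 + (ls - 1) * intensity),
                "cadence": cadence_hz * (1 + (cs - 1) * intensity),
                "torso_pitch_deg": pitch * intensity}

    print(emotional_gait("sadness", 0.8))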

Happiness walking.

Sadness walking.

Fig. 3.13 Emotional walking.

Related papers
[1] M. Destephe et al., “Emotional Gait Generation Method based on Emotion Mental Model - Preliminary experiment with Happiness and Sadness -," Proceedings of the 10th International Conference on Ubiquitous Robots and Ambient Intelligence (URAI2013), pp. 86-89, Jeju, Korea, October, 2013.
[2] M. Destephe et al., “Conveying Emotion Intensity with Bio-inspired Expressive Walking -Experiments with Sadness and Happiness-," Proceedings of the 22nd IEEE International Symposium on Robot and Human Interactive Communication, pp. 161-166, Gyeongju, Korea, August, 2013.
[3] M. Destephe et al., “The Influences of Emotional Intensity for Happiness and Sadness on Walking," Proceedings of the 35th Annual International Conference of the IEEE Engineering in Medicine and Biology Society, pp. 7452-7455, Osaka, Japan, July, 2013.
[4] M. Destephe et al., "Improving the Human-Robot Interaction through Emotive Movements - A Special Case: Walking -," Proceedings of the 8th ACM/IEEE International Conference on Human-Robot Interaction, pp. 115-116, Tokyo, Japan, March, 2013.

(5) Differences in impression in cross-cultural greetings using a humanoid robot (2013)

How are robots perceived when they speak and gesture in the manner of a certain national culture? We studied greeting interactions between Egyptian and Japanese subjects and two versions of KOBIAN (one Japanese-speaking, one Arabic-speaking). Culture-dependent acceptance and discomfort were found: Egyptians prefer the "Arabic robot" and report discomfort when interacting with the "Japanese robot," and the other way around for Japanese subjects.

Japanese greeting

Egyptian greeting

Fig. 3.14 Greeting of KOBIAN

Related papers
[1] G. Trovato et al., "Cross-cultural study on human-robot greeting interaction: acceptance and discomfort by Egyptians and Japanese," Journal of Behavioral Robotics, 11 pages, October 2013.
[2] G. Trovato et al., "Towards Culture-specific Robot Customisation: A Study on Greeting Interaction with Egyptians," Proceedings of the 22nd IEEE International Symposium on Robot and Human Interactive Communication, pp. 447-452, Gyeongju, Korea, August 2013.

(6) Dynamic emotional expression based on a mental model (2013)

In this research we implemented a mental model in a walking humanoid robot, allowing the emotional state of the robot to change dynamically in response to external stimuli; the emotional state affects the robot's decisions and behavior and is expressed with both facial and whole-body patterns. To evaluate the importance of the proposed system for human-robot interaction and communication, we conducted a survey in which subjects watched videos of the robot's behaviors. The results show that integrating dynamic emotion expression with locomotion makes the humanoid robot more appealing to humans: it is perceived as more "favorable" and "useful," and less "robot-like."
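A minimal sketch of such a dynamically changing emotional state (a generic leaky-integrator formulation assumed for illustration; the implemented mental model is the one described in the paper below):

    import numpy as np

    class MentalModel:
        """Emotional state pushed by stimuli and decaying back to neutral."""
        def __init__(self, decay=0.95):
            self.state = np.zeros(2)  # (pleasantness, activation)
            self.decay = decay

        def update(self, stimulus):
            """Leaky integration of an external stimulus vector."""
            self.state = self.decay * self.state + (1 - self.decay) * stimulus
            return np.clip(self.state, -1.0, 1.0)

    # A sustained visual stimulus (assumed values) drives activation up:
    mm = MentalModel()
    for _ in range(30):
        state = mm.update(np.array([0.2, 1.0]))
    print(state)  # the state has drifted toward high-activation surprise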

Front

Overview

Fig. 3.15 Emotional expression during visual tracking (surprise).

Movies of emotion expression during visual tracking


 KOBIAN expresses emotion during visual tracking.


Related papers
[1] T. Kishi et al., "Impression Survey of the Emotion Expression Humanoid Robot with Mental Model based Dynamic Emotions," Proceedings of the 2013 IEEE International Conference on Robotics and Automation, pp. 1655-1660, Karlsruhe, Germany, May 2013.

3.3 Research on visual sensing

(1) Visual tracking based on the vestibulo-ocular reflex with a biped walking humanoid robot (2010)

Personal robots will become more and more popular and will be required to actively collaborate and live with their human partners. Such robots must recognize changing environments and take adequate actions, as humans do. Object tracking is a fundamental function from the viewpoint of environmental sensing and reflexive reaction. We developed an object tracking motion algorithm using the upper body, integrated it with an online walking pattern generator into an object tracking biped locomotion algorithm, and confirmed its effectiveness through experimental evaluation.
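At its core, the vestibulo-ocular reflex is a compensatory rotation: the eyes counter-rotate against the measured head motion so the gaze stays on the target. A minimal sketch with a 2-axis eye and unit VOR gain (names and gains are assumptions):

    def vor_eye_command(target_dir, head_yaw_rate, head_pitch_rate, dt,
                        eye_yaw, eye_pitch, k_track=2.0):
        """One VOR-style update: counter-rotate the eyes by the head's
        angular velocity, plus a slower visual tracking term toward the
        target direction (angles in rad, rates in rad/s)."""
        tgt_yaw, tgt_pitch = target_dir
        eye_yaw += (-head_yaw_rate + k_track * (tgt_yaw - eye_yaw)) * dt
        eye_pitch += (-head_pitch_rate + k_track * (tgt_pitch - eye_pitch)) * dt
        return eye_yaw, eye_pitch

    # Head yawing at 0.5 rad/s while fixating a target at (0.3, -0.1) rad:
    eye_yaw, eye_pitch = vor_eye_command((0.3, -0.1), 0.5, 0.0, 0.01, 0.0, 0.0)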

Fig. 3.16 Visual tracking.

Related papers
[1] N. Endo et al., “Integration of Emotion Expression and Visual Tracking Locomotion based on Vestibulo-Ocular Reflex," Proceedings of the 19th IEEE International Symposium on Robot and Human Interactive Communication, pp. 593-598, Viareggio, Italy, September, 2010.

(2) Method to select a comfortable walking path (2013)

For a robot to navigate unknown environments, it must actively scan the environment with its sensors. This is important both for minimizing perception uncertainty and for reducing the probability of collisions. We proposed a gazing strategy whose objective is to minimize the uncertainty of the 3D reconstruction of the environment along the robot's planned trajectory. We also proposed a new 3D reconstruction approach for stereo camera systems that better integrates measurements over time, using a novel formulation of occupancy grids that includes all the information retrieved from stereo.
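A schematic sketch of uncertainty-driven gaze selection (an entropy heuristic over a toy occupancy grid, assumed here for illustration; the published method is formulated over stereo occupancy grids along the planned path):

    import numpy as np

    def cell_entropy(p):
        """Binary entropy of occupancy probabilities (uncertainty per cell)."""
        p = np.clip(p, 1e-6, 1 - 1e-6)
        return -(p * np.log2(p) + (1 - p) * np.log2(1 - p))

    def best_gaze(grid_p, candidate_views):
        """Pick the gaze whose visible cells carry the most uncertainty.
        candidate_views: boolean visibility masks over the grid (assumed)."""
        scores = [cell_entropy(grid_p)[m].sum() for m in candidate_views]
        return int(np.argmax(scores))

    grid = np.array([0.5, 0.5, 0.9, 0.1])  # 0.5 = completely unknown cell
    views = [np.array([1, 1, 0, 0], bool), np.array([0, 0, 1, 1], bool)]
    print(best_gaze(grid, views))          # -> 0 (looks at the unknowns)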

Fig. 3.17 Method to select a comfortable walking path

Related papers
[1] M. Brandao et al., “Active Gaze Strategy for Reducing Map Uncertainty along a Path," Proceedings of the 3rd IFToMM International Symposium on Robotics and Mechatronics (ISRM 2013), pp. 455-466, Singapore, October, 2013.
[2] M. Brandao et al., “Integrating the whole cost-curve of stereo into occupancy grids," Proceedings of the 2013 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS 2013), pp. 4681-4686, Tokyo, Japan, November, 2013.

4. Our Previous Hardware

KOBIAN/HABIAN (2009)
  • The whole-body emotional expression robot KOBIAN was developed.
  • The emotional expression recognition rates of WE-4RII and KOBIAN were compared.
  • Emotional expression recognition rates with whole-body expression were investigated.
  • Hands made of soft material, WSH-1 and WSH-1R, were developed.
  • The emotional expression robot on wheels, HABIAN, was developed, and its emotional
    expression recognition rate was compared between the legged and wheeled versions.
  • Details

KOBIAN-RIII (2014)
  • The robot KOBIAN-RIII, capable of fast arm motion like human comedians, was developed.
  • Higher arm output was realized by installing joint mechanisms driven by two parallel motors.
  • Lighter arms were realized by a joint mechanism driven by a flexible shaft.
  • Experimental evaluation showed that faster and wider arm motion contributes to the funniness.
  • Details

All activity

5. Acknowledgements

This work was conducted under NEDO's strategic project for the development of advanced robot technology (research project on service robots (2): communication RT system for the elderly).

We gratefully acknowledge the support for this work from the Humanoid Robotics Institute (HRI) at Waseda University, RoboCasa, the Global COE Program "Global Robot Academia," and the RoboSoM project of the European FP7 program (ICT-2009-4).

Finally, we also thank SolidWorks Japan K.K., tmsuk Co., Ltd., DYDEN Corporation, NikkiFron Co., Kuraray Co., Ltd., Chukoh Chemical Industries, STMicroelectronics Co., and the Waseda Research Institute for Science and Engineering.

Humanoid Robotics Institute, Waseda University
WABOT-HOUSE Laboratory, Waseda University
SolidWorks Japan K.K.
The New Energy and Industrial Technology Development Organization (NEDO)
tmsuk Co., Ltd.
DYDEN Corporation
NikkiFron Co.
Kuraray Co., Ltd.
Chukoh Chemical Industries
STMicroelectronics Co.
Last Update: 2016-10-24
Copyright(C) 2016 Takanishi Laboratory
All Rights Reserved.