Motivation and Research Objective
We propose a Musical-based Interaction System (MbIS) that allows human musicians to interact with the Flutist Robot WF-4RV:
- Focus on musical interaction with the purpose of creating novel ways of musical expression.
- The flutist robot WF-4RV is not merely used as an instrument-playing machine; it is enabled to act as a musical partner (Figure 1).
Figure 1 - Active performance with a human band leads to a positive impression on the audience.
This is achieved by processing acoustic as well as visual sensor input and by mapping those inputs to musical outputs. The system has specific novelties that set it apart from other musical interaction systems:
- Interaction through multiple perception channels (multi-modal interaction): a visual and an acoustic recognition system.
- Musical performance by an anthropomorphic robot that closely emulates the human way of playing the flute.
- The physical constraints of the flutist robot feed back into the interaction.
- The interaction is adjustable to the skill and experience level of the human partner.
Musical-based Interaction System
The system consists of several modules with different tasks:
- For each task there is one specific module that analyses the output of the robot's camera and microphone and maps the extracted information to parameters that modify the musical performance.
- From these parameters, MIDI data is generated and sent to the robot.
- The robot itself has a motor control module that receives the MIDI information and adjusts the movement of the motors accordingly.
- The system is separated into two skill / experience level stages: the basic and the extended level interaction system (Figures 2 and 3).
Figure 2 - Overview of the Musical-based Interaction System (MbIS).
Figure 3 - The hardware setup underlying the interaction system.
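The mapping stage described above, from extracted interaction parameters to MIDI data for the robot's motor control module, could look roughly like the following sketch. The parameter names, value ranges, and message layout are assumptions made for illustration, not the actual WF-4RV interface; standard 3-byte MIDI messages (note-on and control change) are used.

```python
# Hypothetical sketch of the MbIS mapping stage: interaction parameters
# extracted from camera/microphone input are converted into raw 3-byte
# MIDI messages. Parameter names and ranges are assumptions, not the
# lab's actual API.

def clamp(value, lo=0, hi=127):
    """Limit a value to the 7-bit range used by MIDI data bytes."""
    return max(lo, min(hi, int(value)))

def parameters_to_midi(note, volume, vibrato_depth, channel=0):
    """Map performance parameters to MIDI messages (status, data1, data2).

    note          -- pitch the robot should play (0-127)
    volume        -- loudness derived from the interaction module (0-127)
    vibrato_depth -- mapped onto MIDI CC 1 (modulation wheel)
    """
    note_on = (0x90 | channel, clamp(note), clamp(volume))
    vibrato = (0xB0 | channel, 1, clamp(vibrato_depth))  # CC 1 = modulation
    return [note_on, vibrato]

# One interaction frame's parameters become two MIDI messages.
messages = parameters_to_midi(note=74, volume=100, vibrato_depth=40)
```

In a real setup these tuples would be sent to the robot over a MIDI connection; here they are only returned for inspection.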
Basic Interaction Level
The basic level of interaction has the following characteristics:
- Virtual faders and buttons are used to sense the instrument movements of the robot's partner musician.
- The data sent from the input sensor processing to the robot control module is modulated by the physical state of the robot.
- We use histogram-based audio analysis to verify whether the robot's performance output matches the intended result (Figure 4).
Figure 4 - Principle of operation of the basic interaction level.
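The histogram-based verification step could be sketched as follows. The page does not specify which audio feature is histogrammed or how histograms are compared, so this example makes two assumptions: it histograms raw sample amplitudes and scores similarity with histogram intersection; the acceptance threshold is likewise an illustrative guess.

```python
# Minimal sketch of histogram-based audio verification: compare a
# histogram of the robot's recorded output against the intended result.
# Feature choice (amplitude histogram), similarity measure (histogram
# intersection), and threshold are assumptions for illustration.
import math

def amplitude_histogram(samples, bins=16):
    """Normalised histogram of sample amplitudes in [-1, 1]."""
    counts = [0] * bins
    for s in samples:
        idx = min(bins - 1, int((s + 1.0) / 2.0 * bins))
        counts[idx] += 1
    total = len(samples)
    return [c / total for c in counts]

def histogram_intersection(h1, h2):
    """Similarity in [0, 1]; 1.0 means identical distributions."""
    return sum(min(a, b) for a, b in zip(h1, h2))

# Intended output: a pure 440 Hz tone; recorded output: slightly quieter.
intended = [math.sin(2 * math.pi * 440 * t / 8000) for t in range(8000)]
recorded = [0.9 * s for s in intended]

score = histogram_intersection(amplitude_histogram(intended),
                               amplitude_histogram(recorded))
matches = score > 0.7  # accept if the distributions are close enough
```

A production system would more likely histogram spectral features (e.g. per-band energy) than raw amplitudes, but the comparison principle is the same.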
Extended Interaction Level
In the extended level interaction interface, our goal is to let the user interact with the robot more freely than at the basic level:
- We propose a teaching system that allows the user to link instrument gestures with musical patterns.
- We allow for more degrees of freedom in the instrument movements of the user; as a result, this level is more suitable for advanced players.
- For this task we use a particle filter-based instrument gesture detection system.
- A Bayesian mapping algorithm ensures that, even if the teaching musician does not account for all combinations of instrument orientation and musical output in the teaching phase, the robot automatically plays the most closely matching answer modulation for a given instrument state in the performance phase.
- In both interaction levels (basic and extended), the proposed controllers can be used to control the onset of the musical performance (Figure 5).
Figure 5 - Principle of operation of the extended interaction level.
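The idea of answering an untaught instrument state with the most closely matching taught pattern can be sketched with a simple probabilistic mapping. This is not the lab's actual algorithm: the orientation representation (a single angle), the Gaussian likelihood, its width, and the pattern names are all illustrative assumptions.

```python
# Sketch (not the actual MbIS algorithm) of Bayesian mapping from an
# observed instrument orientation to the most closely matching taught
# musical pattern. A Gaussian likelihood around each taught orientation
# lets the robot answer sensibly for orientations never seen in teaching.
import math

def gaussian(x, mean, sigma):
    """Unnormalised Gaussian likelihood of x around a taught mean."""
    return math.exp(-((x - mean) ** 2) / (2 * sigma ** 2))

def most_likely_pattern(observed_angle, taught, sigma=15.0):
    """Return the pattern with the highest posterior (uniform prior)."""
    scores = {}
    for angle, pattern in taught:
        scores[pattern] = scores.get(pattern, 0.0) + gaussian(
            observed_angle, angle, sigma)
    return max(scores, key=scores.get)

# Teaching phase: the musician links instrument orientations (degrees,
# hypothetical values) to answer patterns.
taught = [(0.0, "slow_vibrato"), (45.0, "fast_vibrato"), (90.0, "trill")]

# Performance phase: an orientation between taught examples still gets
# the closest matching answer modulation.
answer = most_likely_pattern(30.0, taught)
```

With a full orientation state the same scheme would use a multivariate likelihood, but the selection rule (maximise posterior over taught patterns) is unchanged.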
Performance Movies
- Basic level interaction performance movie (right-click to download movie)
- Extended level interaction performance movie (right-click to download movie)
Special Thanks
This research was conducted at the Humanoid Robotics Institute (HRI), Waseda
University. It has been, and continues to be, supported by a Grant-in-Aid for
the WABOT-HOUSE Project by Gifu Prefecture and by the RT-GCOE Global Robot
Academia of Waseda University.
WF-4RIV has been designed with the 3D CAD software "SolidWorks". Special thanks
to SolidWorks Japan K.K. for the software contribution.
Humanoid Robotics Institute, Waseda University
Wabot-House Laboratory, Waseda University
Global Robot Academia RT-GCOE of Waseda University
SolidWorks Japan K.K.
(c) 2010 Takanishi Laboratory