
Up to now, the Waseda Flutist Robot has acted mainly as a passive player. While it was able to play a static score together with a human player, it could not adjust its performance to that of its partner musician in real time. Recently we have been working on an interaction system that allows the robot to detect visual cues from the other members of an ensemble. So far, this research has concentrated mainly on detecting such visual cues. We examined the case of the robot being part of a jazz band, a setting in which the band's performance of a song relies in large part on improvisation.

Visual Tracking

We experimented with different video analysis methods to find out which approach is best suited for visually tracking the instrument movements of human musicians. As a result, we now primarily use the two methods that proved most efficient for our case.
The first is motion tracking, which detects the image regions where movement occurs between consecutive frames; a minimal frame-differencing sketch is shown below.
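The page does not spell out the robot's actual implementation, so the following is only a minimal sketch of frame differencing with OpenCV in Python, assuming OpenCV 4.x and a webcam at index 0 (both assumptions), to illustrate how moving regions such as a swaying instrument can be located.

```python
import cv2

# Minimal frame-differencing motion detector (illustrative only, not the
# WF-4RIV code). It marks the largest moving region in each camera frame.
cap = cv2.VideoCapture(0)                # camera index 0 is an assumption
ok, prev = cap.read()
if not ok:
    raise RuntimeError("no camera frame available")
prev_gray = cv2.cvtColor(prev, cv2.COLOR_BGR2GRAY)

while True:
    ok, frame = cap.read()
    if not ok:
        break
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)

    # The absolute difference between consecutive frames highlights motion.
    diff = cv2.absdiff(gray, prev_gray)
    _, mask = cv2.threshold(diff, 25, 255, cv2.THRESH_BINARY)
    mask = cv2.dilate(mask, None, iterations=2)

    # Take the largest moving blob as the region of interest.
    # (OpenCV 4.x: findContours returns (contours, hierarchy).)
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL,
                                   cv2.CHAIN_APPROX_SIMPLE)
    if contours:
        largest = max(contours, key=cv2.contourArea)
        x, y, w, h = cv2.boundingRect(largest)
        cv2.rectangle(frame, (x, y), (x + w, y + h), (0, 255, 0), 2)

    cv2.imshow("motion", frame)
    prev_gray = gray
    if cv2.waitKey(1) & 0xFF == ord("q"):
        break

cap.release()
cv2.destroyAllWindows()
```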
The second is object tracking using a particle filter, which represents the tracked object's position by a set of weighted hypotheses (particles) that are updated with every new frame; a generic sketch follows.
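Again as an illustrative sketch rather than the WF-4RIV's actual tracker: a particle filter predicts the motion of its hypotheses, re-weights them against the current image, and resamples. The state model (2-D position with random-walk noise), the particle count N = 500, and the placeholder likelihood function below are all assumptions made for illustration.

```python
import numpy as np

# Generic particle-filter skeleton for 2-D position tracking (illustrative
# only; the observation and motion models of the WF-4RIV system are not
# reproduced here).

N = 500                                  # number of particles (assumed)
rng = np.random.default_rng(0)

def init_particles(frame_shape):
    """Spread particles uniformly over the image."""
    h, w = frame_shape[:2]
    particles = np.column_stack((rng.uniform(0, w, N),
                                 rng.uniform(0, h, N)))
    weights = np.full(N, 1.0 / N)
    return particles, weights

def predict(particles, motion_std=10.0):
    """Diffuse particles with Gaussian noise (simple random-walk model)."""
    return particles + rng.normal(0.0, motion_std, particles.shape)

def update(particles, weights, likelihood):
    """Re-weight particles with an image-based likelihood function.

    `likelihood(x, y)` is a placeholder for any observation model,
    e.g. colour-histogram similarity around the particle position.
    """
    weights = weights * np.array([likelihood(x, y) for x, y in particles])
    weights += 1e-12                     # avoid an all-zero weight vector
    return weights / weights.sum()

def resample(particles, weights):
    """Duplicate likely particles and drop unlikely ones."""
    idx = rng.choice(N, size=N, p=weights)
    return particles[idx], np.full(N, 1.0 / N)

def estimate(particles, weights):
    """Weighted mean of particle positions = current object estimate."""
    return np.average(particles, axis=0, weights=weights)
```

In a tracking loop one would call predict, then update with a likelihood computed from the current camera frame, resample when the weights degenerate, and read off estimate as the current position of the tracked instrument.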

Video: Interaction with the WF-4RIV (MPEG / 1:36 / 16MB)

Part of this research was conducted at the Humanoid Robotics Institute (HRI), Waseda University, and part was supported by a Grant-in-Aid for the WABOT-HOUSE Project by Gifu Prefecture.
Humanoid Robotics Institute, Waseda University
WASEDA UNIVERSITY WABOT-HOUSE LABORATORY