Sunday, November 8, 2009
Robot Drummer Jams Live with Human Musicians
Credit: Georgia Tech
Georgia Tech’s music technology program is pushing the envelope in artificial intelligence and human-robot interaction research. Director of Music Technology Gil Weinberg has created a robot that is not only changing the way we think about music, but also advancing research on the human mind and on how we may interact with robots in the future.
The robot, Haile, analyzes music based on models of human perception and improvises algorithmically, combining cognitive and mechanical abilities that no human player possesses. Haile can therefore support a novel kind of human-machine interaction, one that can lead to new insights about music perception, improvisation, and collaboration.
Haile's uniqueness lies in its ability to play acoustically, with a vibrant sound, while running perceptual and improvisational algorithms. The robot listens to live players, analyzes their music in real time, and uses that analysis to play back improvised responses. It is designed to combine the benefits of computational power with the richness, visual interactivity, and expression of acoustic playing.
See Haile in Action (4.9M .mov)
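The listen-analyze-respond loop described above can be illustrated with a toy sketch. Haile's actual perception and improvisation algorithms are not published in this article, so everything below is a hypothetical simplification: a crude amplitude-threshold onset detector stands in for "listening," and a jittered echo of the detected rhythm stands in for "improvisation." All function names and parameters are invented for illustration.

```python
import random

def detect_onsets(samples, threshold=0.5):
    # Crude stand-in for onset detection: record each index where the
    # amplitude first crosses the threshold on its way up.
    onsets = []
    above = False
    for i, s in enumerate(samples):
        if not above and abs(s) >= threshold:
            onsets.append(i)
            above = True
        elif above and abs(s) < threshold:
            above = False
    return onsets

def improvise(onsets, jitter=0.25, seed=0):
    # Stand-in for improvisation: echo the human player's inter-onset
    # rhythm, with small random timing variations (up to +/- 25%).
    rng = random.Random(seed)
    intervals = [b - a for a, b in zip(onsets, onsets[1:])]
    return [max(1, round(iv * (1 + rng.uniform(-jitter, jitter))))
            for iv in intervals]

# Toy "live input": four drum hits at sample indices 0, 10, 20, 35.
signal = [0.0] * 40
for i in (0, 10, 20, 35):
    signal[i] = 1.0

hits = detect_onsets(signal)
print(hits)              # [0, 10, 20, 35]
print(improvise(hits))   # similar rhythm, slightly varied timing
```

A real system would of course work on streaming audio and use far richer perceptual features (pitch, timbre, beat tracking), but the call-and-response structure, analyze the human's playing, then generate a related but varied reply, is the core idea the article describes.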