Many of us begin and end our days with music. Our alarm clocks play music in the morning to get us moving, and our iPods play music in the evening to help us relax. We listen to music on our way to work and on our way home. The bottom line: Music is everywhere.
Some of the best musicians have mastered the art of improvisation, and often the music we love best is performed by a group of improvisational masters playing off one another.
Two artists working with, and in some cases against, each other to create something new is an old tradition. While opinions differ, many find some of the best improv in jazz. Jazz artists don't just listen; they watch. They communicate visually, giving subtle cues and leading their co-conspirators through the act. This deep connection between two people who love the same art and communicate through that art seems to be a uniquely human experience.
Some roboticists, however, think that teaching robots to improvise musically with humans can lead to interesting new kinds of music, as well as improve human-robot collaboration more generally.
In order to teach robots to be collaborative partners in music, or any other activity, roboticists like WPI's Dr. Scott Barton use a variety of artificial intelligence techniques. Barton's research involves a number of music-playing robots designed to work in harmony with other musicians and their collaborative environments. The intelligence built into these machines relies on principles of statistical inference to make sense of the sights and sounds around them.
Statistical inference is a primary technique in many areas of artificial intelligence. It makes sense that A.I. systems borrow ideas from statistics once you recognize that scientists rely on statistics to draw conclusions from sets of information far too large and complex to understand directly.
In the same way that scientists use thousands of measurements of air and water temperatures to determine the health of our climate, robots take visual and auditory measurements across thousands of examples of musical collaboration in order to make inferences, or educated guesses, about what their musical partners are trying to do. Through these statistical methods, robots, in effect, learn to be partners in the human-robot musical collaboration.
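To make that counting-and-guessing idea concrete, here is a minimal sketch in Python of one simple form of statistical inference: tallying which note a partner has tended to play next in past performances, then using those tallies to predict what comes next. This is not Dr. Barton's actual system; the function names and the training melodies below are hypothetical, and a real musical robot would work with far richer audio and visual features.

```python
# A minimal, hypothetical sketch of statistical inference for music:
# count note-to-note transitions in past performances, then infer the
# most probable next note. Not Dr. Barton's actual system.

from collections import Counter, defaultdict

def train(melodies):
    """Count how often each pitch follows each other pitch
    across many example performances."""
    transitions = defaultdict(Counter)
    for melody in melodies:
        for current, nxt in zip(melody, melody[1:]):
            transitions[current][nxt] += 1
    return transitions

def predict_next(transitions, current_pitch):
    """Infer the most probable next pitch, plus a rough confidence."""
    counts = transitions.get(current_pitch)
    if not counts:
        return None, 0.0  # this pitch was never observed before
    pitch, count = counts.most_common(1)[0]
    return pitch, count / sum(counts.values())

# Hypothetical training data: MIDI pitch sequences from past sessions.
examples = [
    [60, 62, 64, 65, 64, 62, 60],  # a rising-and-falling phrase in C
    [60, 62, 64, 62, 60],
    [64, 65, 67, 65, 64],
]

model = train(examples)
pitch, confidence = predict_next(model, 64)
print(f"After pitch 64, guess {pitch} (confidence {confidence:.0%})")
```

The same principle scales up: replace single pitches with longer contexts, rhythms, and visual cues, and the tallies become the probability distributions a collaborative robot uses to anticipate its human partner.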
These collaborations are not yet perfect, and robots are far from being jazz masters, but engineers continue to refine the techniques for teaching robots to work with people. Perhaps in the near future, robotic musicians will be able to use statistics to predict how best to interact with their human counterparts with the same skill as physicians who use statistics to determine how best to treat their patients.
In the meantime, head over to Dr. Barton’s website (scottbarton.info) and hear what kinds of music these early musical robots are playing.
By R.J. Linton