COMPUTE! ISSUE 136 / DECEMBER 1991 / PAGE 114

The Infinite Crescendo
by Chantelle Oligschlaeger

Sir Edward Marsh, an English art patron, had Georgian poetry in mind when he penned the words "the infinite crescendo," but you could easily say the same about music. Since Leon Theremin invented one of the first electronic instruments in 1920, music has changed in ways never before imagined. And the movement that computers began is still playing its overture.

The electronic music revolution is turning science fiction into science reality in the labs of Stanford University's Hugh Lusted and San Jose State University's Ben Knapp. In 1987, the two researchers brought together their visions of a device that would produce music from the electrical activity in muscles. Their creation: the Biomuse, an apparatus that uses a headband and a muscleband to sense bioelectrical activity, which, in turn, is sent to a computer to produce musical signals.

The headband eye controller detects the direction in which your eyes are moving, and a muscleband controller, which you wrap around your arm or leg, detects muscle tension. For example, if you're playing a keyboard with both hands, you can send additional commands to the computer or create other instrumental sounds with the headband by moving your eyes or by flexing the banded muscle. The computer is programmed to interpret the information and send a command to the synthesizer, which results in music.
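
The article doesn't detail the Biomuse's internal software, but the mapping it describes--a biosignal level in, a MIDI command out--can be sketched. In the Python sketch below, the controller and program numbers, the clamping, and the function names are illustrative assumptions, not the actual product's code.

```python
# Hypothetical sketch of a Biomuse-style mapping from biosignals to MIDI.
# The controller and program numbers, clamping, and function names are
# illustrative assumptions, not taken from the actual Biomuse software.

def muscle_to_midi(tension, channel=0):
    """Map a normalized muscle-tension reading (0.0-1.0) to a MIDI
    control-change message, returned as three raw MIDI bytes."""
    value = max(0, min(127, int(tension * 127)))   # clamp to MIDI's 7-bit range
    return bytes([0xB0 | channel, 1, value])       # CC #1 (mod wheel) on the channel

def gaze_to_midi(direction, channel=0):
    """Map an eye-movement direction from the headband to a program change,
    switching the synthesizer to a different instrumental sound."""
    programs = {"left": 40, "right": 56, "up": 73, "down": 0}  # violin, trumpet, flute, piano
    return bytes([0xC0 | channel, programs.get(direction, 0)])

if __name__ == "__main__":
    print(muscle_to_midi(0.8).hex(" "))    # b0 01 65
    print(gaze_to_midi("left").hex(" "))   # c0 28
```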

The Biomuse could be a godsend for physically disabled musicians. Lusted and Knapp introduced it to patients last March at Loma Linda Hospital in Southern California. One man, a former computer programmer without full use of his arms or legs, was able to play a synthesized violin using what control remained in his upper arm.

Virtual Biomime

Musician Galen R. Brandt sees other possibilities for the Biomuse. She plans to use it and Vivid Effects' virtual reality software, Mandala, in her multimedia show, Let Us Consider the Rising of Dreams. During a performance, a mime might wear a Biomuse band to trigger changes in light and sound while Mandala allows the performers to interact with preprogrammed graphics.

A video camera records the performers' images against a solid background, and those images are sent to a computer via Mandala where they're separated from the background and digitized. The images are then projected into the field of graphics, allowing the performers to "interact" with, say, images of drums, beating them as though they were real.
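
Mandala's internals aren't spelled out here, but isolating a performer shot against a solid background is essentially chroma keying. A minimal sketch, assuming each video frame arrives as an RGB array; the background color, tolerance, and function names are assumptions for illustration:

```python
import numpy as np

# Rough chroma-key sketch of the kind of separation described above: pixels
# close to the known solid background color are dropped, leaving only the
# performer's image to project into the computer graphics. The background
# color, tolerance, and function names are illustrative assumptions.

def key_out_background(frame, bg_color=(0, 0, 255), tolerance=60):
    """frame: an H x W x 3 uint8 RGB image shot against a solid background.
    Returns (mask, cutout), where mask is True on performer pixels."""
    diff = frame.astype(np.int32) - np.array(bg_color, dtype=np.int32)
    distance = np.sqrt((diff ** 2).sum(axis=-1))   # per-pixel distance from the key color
    mask = distance > tolerance                    # True wherever it isn't background
    cutout = frame * mask[..., None]               # zero out the background pixels
    return mask, cutout

def composite(cutout, mask, graphics):
    """Project the keyed-out performer into the field of graphics."""
    return np.where(mask[..., None], cutout, graphics)
```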

To pull it all together, Brandt uses multiple MidiTaps controlled by Virtual Studio software. The MidiTaps use MediaLink, a multimedia LAN protocol that provides a new way of transmitting digital information, like MIDI, over a high-speed, high-bandwidth, bidirectional network. Each MidiTap, an interface between MIDI and MediaLink, acknowledges that it has received a message and translates the MIDI information for the electronic instrument.
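
MediaLink itself is a proprietary protocol, so the sketch below shows only the general idea the paragraph describes--MIDI messages carried over a network, with each one acknowledged on receipt. The port, the one-byte length prefix, the ACK byte, and the names are assumptions, not MediaLink's actual format.

```python
import socket

# Generic sketch of acknowledged MIDI delivery over a network, in the spirit
# of (but not identical to) a MidiTap on a MediaLink network.

ACK = b"\x06"   # the receiver replies with this byte once a message is accepted

def send_midi(sock, midi_bytes):
    """Send one MIDI message, then block until the far end acknowledges it."""
    sock.sendall(len(midi_bytes).to_bytes(1, "big") + midi_bytes)
    if sock.recv(1) != ACK:
        raise IOError("MIDI message was not acknowledged")

def serve_midi(port=9000):
    """Receive length-prefixed MIDI messages, acknowledge each one, and hand
    the raw bytes to the attached instrument (here, simply printed)."""
    with socket.create_server(("", port)) as server:
        conn, _ = server.accept()
        with conn:
            while (length := conn.recv(1)):
                message = conn.recv(length[0])
                print("to instrument:", message.hex(" "))
                conn.sendall(ACK)
```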

Brandt's show will also be an experiment in audience interaction. You might see yourself on the screen interacting with performers in a story line. Or you could interrupt an infrared beam, triggering an audible response, or you might interact with the show from your wired seat with your own Biomuse device. Brandt hopes audiences will be able to experience and interact with her show in one to two years.

Super Conductor

While the Biomuse interprets a muscle flinch that can help the physically disabled create music or allow an audience to interact with a show's performers, the Radio Baton, developed by Max Mathews of Stanford University, monitors arm gestures and uses them to conduct music electronically. As Mathews waves a Radio Baton, it sends radio signals to a flat receiving surface below, which senses the location of his hand in space. Once the computer receives those messages, it sends commands to a synthesizer, Roland's Sound Canvas SC-55, "telling" it how he wants the melody to sound.

Cellist Ami Radunskaya, who worked with Mathews at Stanford and is now at Rice University, has written four compositions for the Radio Baton. With her right hand controlling the tempo and her left hand influencing the expressive qualities of the music, Radunskaya can perform her music with all the drama and passion of a symphony conductor. But instead of a live orchestra, her music is preprogrammed using Mathews's Conductor program.
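
Mathews's Conductor program isn't reproduced here, but the division of labor Radunskaya describes--right-hand beats setting the tempo, left-hand height shaping the dynamics of a preprogrammed score--can be sketched. The default tempo, the height-to-velocity mapping, and the names below are assumptions for illustration.

```python
# Illustrative sketch of a Conductor-style mapping, not Mathews's actual code:
# right-hand beat gestures set the tempo of a preprogrammed score, while the
# left-hand baton's height above the surface scales the loudness of each note.

def tempo_from_beats(beat_times):
    """Estimate beats per minute from the last two right-hand beat gestures."""
    if len(beat_times) < 2:
        return 120.0                         # assumed default tempo
    return 60.0 / (beat_times[-1] - beat_times[-2])

def velocity_from_height(z, z_min=0.0, z_max=1.0):
    """Map the left-hand baton's height above the sensing surface to a MIDI
    velocity: the higher the hand, the louder the note."""
    span = max(z_max - z_min, 1e-9)
    return max(1, min(127, int(127 * (z - z_min) / span)))

# A preprogrammed score: (beat position, MIDI note number) pairs.
score = [(0, 60), (1, 62), (2, 64), (3, 65)]

beats = [0.0, 0.5, 1.0]                      # right-hand beats half a second apart
print(tempo_from_beats(beats))               # 120.0 beats per minute
print(velocity_from_height(0.75))            # 95
```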

"I like the generality and freedom of it--the theatrical action and expressive capability," she says. Faculty and students at ten colleges, including Radunskaya's Rice, are experimenting and composing music with Radio Batons.

Electronic Session Man

Computers may make capable conductors, but they traditionally haven't been the best accompanists. However, researchers like Roger Dannenberg, senior research computer scientist at Carnegie Mellon University, are developing software that lets the computer play second fiddle. You can either improvise or play exactly what's on your sheet music instead of trying to play along with tape-recorded electronic music. The computer listens and synchronizes itself with your playing, regardless of your artistic whims. "It is a more live experience if everything is live," Dannenberg says.

The computer uses pattern-matching algorithms to compare your performance against its copy of the score, considering all possible matches of pitch. It listens to you through a device that converts your instrument's output from pitch to MIDI and applies strategies to maintain synchrony. It can anticipate what you'll do next and then send commands to the synthesizer to play any chosen instrumental sound.
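
Dannenberg's actual score follower is more sophisticated, but the core idea--count how many of the notes heard so far can be matched against the score, then take the earliest score position that achieves that match--can be shown in a few lines. The function names and example notes are illustrative assumptions.

```python
# Simplified sketch of score following by pattern matching, in the spirit of
# Dannenberg's approach but not his implementation: a dynamic program counts
# how many of the notes heard so far can be matched against the score, then
# reports the earliest score position that achieves that best match.

def follow(score, performed):
    """score, performed: lists of MIDI pitch numbers.
    Returns the score index the accompaniment should be at."""
    n, m = len(score), len(performed)
    # best[i][j] = most performed notes matched using score[:i] and performed[:j]
    best = [[0] * (m + 1) for _ in range(n + 1)]
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            match = best[i - 1][j - 1] + (score[i - 1] == performed[j - 1])
            best[i][j] = max(best[i - 1][j], best[i][j - 1], match)
    # The earliest index with the best match, so wrong or extra notes
    # don't push the accompaniment ahead of the soloist.
    for i in range(n + 1):
        if best[i][m] == best[n][m]:
            return i
    return n

melody = [60, 62, 64, 65, 67]           # C D E F G
print(follow(melody, [60, 62, 63]))     # 2: two notes matched, poised to play the E
```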

Dannenberg is enhancing the program so it can handle more arbitrary musical patterns and stay synchronized with each member of an ensemble. He hopes to market his program soon.

Hypermelodies

One man who knows what it's like to perform with computers is Tod Machover, associate professor of music and media at MIT. Machover brought his hyperinstruments to the studio with the recording Flora and even to the Paris stage with the opera VALIS. You may have seen him with a conducting glove--a mess of wires called Dexterous HandMaster that looks more like a prop for Terminator 2.

Hyperinstruments enhance your performance by adding "color" to your music. The computer listens to MIDI and other information from your traditional instrument or conducting glove and decides what to do with the sound. It sends the information to a synthesizer to add more notes, for example, or to make the rhythm very precise. The louder you play, the more a hyperinstrument will add. You might be playing one yourself before too long. Machover expects hyperinstruments to hit the market within two years.
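
Machover's software isn't described in any more detail than that, so the sketch below illustrates only the last rule--the harder you play, the more the computer adds. The harmony intervals, velocity thresholds, and names are assumptions for illustration.

```python
# Hypothetical hyperinstrument-style "coloring," not Machover's actual software:
# given a note you played and how hard you played it, the computer adds harmony
# notes for the synthesizer -- the louder the playing, the more notes it adds.

def color_note(pitch, velocity):
    """pitch, velocity: incoming MIDI note data (0-127).
    Returns the list of (pitch, velocity) events to send to the synthesizer."""
    intervals = [0, 7, 12, 16, 19]            # unison, fifth, octave, tenth, twelfth
    extra = velocity // 32                    # 0-3 added notes, scaling with loudness
    events = [(pitch, velocity)]
    for interval in intervals[1:1 + extra]:
        if pitch + interval <= 127:
            events.append((pitch + interval, max(1, velocity - 20)))  # added notes a bit softer
    return events

print(color_note(60, 40))     # soft:  [(60, 40), (67, 20)]
print(color_note(60, 110))    # loud:  [(60, 110), (67, 90), (72, 90), (76, 90)]
```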

Music Above All

One concern about computer-generated music has been that less skillful artists might use it as a crutch. But the idea behind hyperinstruments is to put the musician in control, not to make up for shortcomings. "Computers should take what you do very well and make that special," Machover says.

Others say that while a computer's precision helps musicians manage complex rhythms, creating music still requires an ear and imagination. "[Computers] are good for a person who doesn't have years and years of practice and technique, but who still wants to enjoy playing," Radunskaya says.

Playing a duet with a computer or conducting with a robotic glove may seem just as alien to us as the electric guitar did to our parents 40 years ago. But with the strides computer music is making, future generations may not be able to imagine a concert without seeing the sweep of Radio Batons, without interacting with the performers, and without expressing themselves with a Biomuse.

Indeed, the ear and the instrument change with the generations.