* The Hyperinstrument system is based on musical instruments that provide various ways for musicians to play music into our computers. The simplest method is via an instrument similar to an existing traditional instrument, such as a MIDI [Musical Instrument Digital Interface] keyboard or percussion controller. More and more, however, we're using extremely sophisticated controllers that monitor hand gestures.
The output of those instruments goes to an Apple Macintosh II computer, the ``brain'' of the Hyperinstrument. There, a special artificial intelligence software environment analyzes and interprets real-time performance data. This environment, ``Hyperlisp,'' is built on Allegro Common LISP [a dialect of LISP, the basic computer language for artificial intelligence], and was developed by Media Lab graduate student and software engineer Joe Chung (my principal collaborator in Hyperinstrument design). All musical data coming from the live instruments are analyzed and interpreted in real time in the Mac's LISP environment, then converted back into MIDI data that is output to a bank of sound-producing devices: MIDI synthesizers, samplers, or more complicated signal-processing devices.
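The essential data flow (raw performance events in, interpreted musical events out) can be sketched in a few lines. The example below is purely illustrative and written in Python rather than LISP; the event fields and the ``interpretation'' rule (an emphatic note triggers an octave doubling) are invented for this sketch and are not drawn from the actual Hyperlisp software.

```python
# Hypothetical sketch of the Hyperinstrument pipeline: one raw MIDI
# note-on event comes in, is analyzed, and one or more enriched events
# go back out to the synthesizers.

def interpret_event(note, velocity):
    """Turn one raw MIDI note-on into a list of (note, velocity) outputs.

    A real Hyperinstrument applies far richer analysis; here the
    "interpretation" is simply: a loud gesture adds a softer octave
    doubling above the played note.
    """
    out = [(note, velocity)]            # always pass the player's note through
    if velocity > 96:                   # an emphatic gesture (MIDI velocity 0-127)...
        out.append((note + 12, velocity - 32))  # ...adds an octave double, quieter
    return out
```

For example, a forcefully played middle C (`interpret_event(60, 100)`) yields both the original note and a doubled C an octave higher, while a gentle one passes through unchanged. The actual system performs this kind of analysis continuously, on streams of gestural data far richer than single note events.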
One motivation behind Hyperinstrument development concerns the potential of live performance. Music is a performance art. You can achieve magical results in a recording studio, where you have the chance to redo and overlay parts, but you should be able to accomplish things on stage that are just as wonderful while retaining the dimension of direct human expressivity, communication, and spontaneity. To achieve that while performing live, onstage in a concert setting, we need the power of ``smart'' computers following the gestures and intentions of fine performers.