Dept. of Music, Brown Univ., Box 1924, Providence, RI 02912
The proliferation of reliable interactive computer music systems has created opportunities for performers to influence computer music processes directly. Performing musicians are highly valued for their unique sense of expression and taste: the ever-changing subtleties of tempo, dynamics, timbre, and articulation, set within the larger framework of musical gesture and phrasing. How can musicians communicate these highly refined skills to a computer and elicit similarly musical results? This paper describes techniques whereby performers influence highly flexible compositional algorithms that respond subtly to musical nuance. These algorithms generate MIDI data used to control synthesizers and signal processors. Musical examples are demonstrated using FollowPlay, a computer program for interactive music and a real-time environment for music composition. The program consists of a large collection of software modules organized into three functional types: Listener Objects analyze and record aspects of a musician's performance, Composition Objects respond by generating MIDI data, and Interpreter Objects unify the entire collection with a graphical user interface that handles timing and intermodular communication.
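The three module types described above can be sketched in miniature as follows. This is a hypothetical illustration only: all class names, method names, and the trivial "echo a fifth higher" response are assumptions for the sake of example, not FollowPlay's actual design or API.

```python
# Illustrative sketch of a listener/composer/interpreter architecture.
# All names and behaviors here are hypothetical, not FollowPlay's real code.
from dataclasses import dataclass
from typing import List


@dataclass
class NoteEvent:
    pitch: int     # MIDI note number
    velocity: int  # MIDI velocity (dynamics)
    time: float    # onset time in seconds


class ListenerObject:
    """Analyzes and records aspects of a performance (here: dynamics only)."""

    def __init__(self) -> None:
        self.events: List[NoteEvent] = []

    def hear(self, event: NoteEvent) -> None:
        self.events.append(event)

    def mean_velocity(self) -> float:
        if not self.events:
            return 0.0
        return sum(e.velocity for e in self.events) / len(self.events)


class CompositionObject:
    """Responds to the listener's analysis by generating new MIDI data."""

    def respond(self, listener: ListenerObject) -> List[NoteEvent]:
        # Toy response: echo each heard pitch a fifth higher, half a second
        # later, at the performance's average dynamic level.
        vel = int(listener.mean_velocity())
        return [NoteEvent(e.pitch + 7, vel, e.time + 0.5)
                for e in listener.events]


class InterpreterObject:
    """Routes events between modules; timing and GUI are omitted here."""

    def __init__(self) -> None:
        self.listener = ListenerObject()
        self.composer = CompositionObject()

    def perform(self, incoming: List[NoteEvent]) -> List[NoteEvent]:
        for event in incoming:
            self.listener.hear(event)
        return self.composer.respond(self.listener)
```

In this reduced form, the interpreter simply forwards performance input to the listener and returns the composer's response; in an interactive system these exchanges would be driven continuously by a scheduler tied to real-time MIDI input.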