eScholarship
Open Access Publications from the University of California

UC San Diego Electronic Theses and Dissertations

Automatic Improvisation: A Study in Human/Machine Collaboration

Abstract

Described herein is a practical experiment in the spontaneous composition of music through real-time interaction between a human composer-performer and a semi-autonomous network of computer processes. A software system called The Duoquadragintapus was devised for this purpose and used in performance to create and record musical variants of the same name, bounded by the resources and behaviors of generative processes allocated to the system by its author, who also occupies the role of composer-performer when the system is active. The Duoquadragintapus includes pre-composed music that can be incorporated at the whim of its human collaborator, as well as a number of independent processes designed to transform material derived from live instrumental performance and to navigate relationships between live input and pre-rendered output. The human performer has two responsibilities: first, as an improvising instrumentalist, providing raw material for real-time software processes whose functioning remains independent of his control during performance; second, as a software engineer and composer, determining the boundaries of the system prior to performance and selecting when and how many of the computer system's various process classes are allowed to manifest instances during performance. In this version of The Duoquadragintapus, the performer uses a fretless electric guitar as an input instrument, and music is output through a battery of 32 percussion-playing robots and a 5.1 surround-sound speaker array (though the system can easily be adapted to accommodate other types of input and output). High-level system control originates in MIDI information from the guitar and from a footswitch controller. The major components of the system include signal processing of audio from the guitar, factor oracle generation of accompaniment harmonies, factor oracle selection of pre-composed audio snippets generated via sonification of L-systems, phrase detection and rhythmic interpolation of phrases, a "lockstep" mode in which percussion robots mirror the rhythms played by the guitarist, and pre-scored control of performance variables such as changes in the probabilities governing factor oracle traversal, choices of accompaniment voicing, and shifts in ownership of parameter control from human to computer.
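The factor oracle mentioned above is a standard structure for this kind of recombinant improvisation: it is built incrementally over an input sequence, and its suffix links connect positions that share a repeated context, so a traversal can either continue the original material or jump back to a similar earlier point. The sketch below is not the dissertation's implementation; it is a minimal Python illustration of the general technique, assuming a toy alphabet of pitch names and an illustrative jump probability (the function names and the p_jump parameter are hypothetical).

```python
import random


def build_factor_oracle(sequence):
    """Incrementally build a factor oracle over `sequence`.

    Returns (transitions, sfx): transitions[i] maps a symbol to a target
    state, and sfx[i] is the suffix link of state i (sfx[0] == -1).
    """
    n = len(sequence)
    transitions = [dict() for _ in range(n + 1)]
    sfx = [-1] * (n + 1)
    for i, symbol in enumerate(sequence):
        new_state = i + 1
        transitions[i][symbol] = new_state            # linear transition along the input
        k = sfx[i]
        while k > -1 and symbol not in transitions[k]:
            transitions[k][symbol] = new_state        # shortcut transition from a repeated context
            k = sfx[k]
        sfx[new_state] = 0 if k == -1 else transitions[k][symbol]
    return transitions, sfx


def navigate(sequence, sfx, length, p_jump=0.3, seed=None):
    """Recombine `sequence` by stepping forward and occasionally following
    a suffix link back to an earlier state with a shared context."""
    if not sequence:
        return []
    rng = random.Random(seed)
    n = len(sequence)
    state, output = 0, []
    while len(output) < length:
        if state >= n or (sfx[state] > 0 and rng.random() < p_jump):
            state = max(sfx[state], 0)                # jump via a suffix link
        output.append(sequence[state])                # emit the symbol on the linear transition
        state += 1
    return output


if __name__ == "__main__":
    phrase = ["C", "E", "G", "E", "C", "A", "G", "E"]
    transitions, sfx = build_factor_oracle(phrase)
    print(navigate(phrase, sfx, length=16, p_jump=0.4, seed=1))
```

In the system described, the same traversal idea would operate on accompaniment harmonies or on indices of pre-composed audio snippets rather than on pitch names, and the jump probability corresponds to the pre-scored performance variables said to govern factor oracle traversal.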
