eScholarship
Open Access Publications from the University of California

UC Riverside Electronic Theses and Dissertations

Gameplay as Discrete Form: Leveraging Procedural Audio for Greater Adaptability in Video Game Music

Abstract

Video game music has finally begun to be taught in academic multimedia composition programs. This is an exciting development, but video game music is then consistently lumped into the same category as film and television music. This is an egregious error. The types of music needed for the game industry are more extensive than the traditional orchestrations that currently dominate linear media - video game music can range from death metal, to electronic dance music, to synth pop. Nor is classification the only problem. Video game design creates an intrinsic challenge in the music-making process: since game designers can never know when a player will perform an action or move to a new section, the composer cannot write pieces with the discrete temporal form of the silver screen. Instead, they must leave certain musical elements open to an inherent level of indeterminacy. Fixed audio recordings have made this flexibility nearly impossible throughout the past two decades. As game engines' visual capabilities have advanced exponentially and raised the bar for immersion, adaptive video game music has begun to plateau due to the lack of comparable improvement in audio engines over the last decade. This is where procedural audio becomes significant. This dissertation develops and programs new methods that allow future video game composers to procedurally generate their musical vision inside of game engines, so that elements such as timbre, melodic density, and rhythmic intensity (among others) can be driven and altered by player choices rather than relying on the stiff nature of pre-recorded audio.
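To make the core idea concrete: one of the simplest forms of the parameter mapping the abstract describes is letting a continuous gameplay value drive note density. The sketch below is a hypothetical, minimal illustration (not taken from the dissertation): it assumes a single 0.0-1.0 "intensity" value supplied by the game engine and generates a one-bar rhythmic pattern whose density scales with that value.

```python
import random

def rhythmic_pattern(intensity, subdivisions=8, seed=0):
    """Generate a one-bar rhythmic pattern whose note density scales
    with a hypothetical gameplay 'intensity' value (0 = calm, 1 = frantic).

    Returns a list of booleans, one per subdivision:
    True = sound a note, False = rest.
    """
    rng = random.Random(seed)  # seeded for reproducible illustration
    # Clamp intensity, then map it to a probability that any given
    # subdivision sounds: 20% at rest, up to 100% in full combat.
    clamped = max(0.0, min(1.0, intensity))
    density = 0.2 + 0.8 * clamped
    return [rng.random() < density for _ in range(subdivisions)]

# As the player moves from exploration into combat, the same bar fills in:
calm = rhythmic_pattern(0.1, seed=42)
frantic = rhythmic_pattern(0.9, seed=42)
```

In a real game-engine context this function would be re-evaluated each bar with the live intensity value, so the music thickens and thins continuously with play rather than hard-cutting between pre-recorded cues.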
