
Control parameters for musical instruments: a foundation for new mappings of gesture to sound

Published online by Cambridge University Press:  17 January 2003

Daniel J. Levitin
Affiliation:
Departments of Psychology and Music Theory, and Centre for Interdisciplinary Research in Music, Media and Technology (CIRMMT), McGill University, Montreal, Canada E-mail: levitin@psych.mcgill.ca
Stephen McAdams
Affiliation:
Institut de Recherche et Coordination Acoustique/Musique (IRCAM-CNRS), Paris, France
Robert L. Adams
Affiliation:
University of California at Davis, USA

Extract

In this paper we describe a new way of thinking about musical tones, specifically in the context of how features of a sound might be controlled by computer musicians, and how those features might be most appropriately mapped onto musical controllers. Our approach is the consequence of one bias that we should reveal at the outset: we believe that electronically controlled (and this includes computer-controlled) musical instruments need to be emancipated from the keyboard metaphor; although piano-like keyboards are convenient and familiar, they limit the musician's expressiveness (Mathews 1991, Vertegaal and Eaglestone 1996, Paradiso 1997, Levitin and Adams 1998). This is especially true in the domain of computer music, in which timbres can be created that go far beyond the physical constraints of traditional acoustic instruments.
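The idea of mapping gestural features onto sound features, rather than onto discrete key/velocity pairs, can be illustrated with a minimal sketch. The function name, the choice of gesture parameter (a normalized "pressure" value), and the mapping curves below are all illustrative assumptions, not taken from the paper:

```python
# Hypothetical sketch of a one-to-many gesture-to-sound mapping: a single
# continuous gesture parameter drives several timbral features at once,
# unlike a keyboard's discrete pitch/velocity pair. All names and curves
# here are illustrative assumptions, not the authors' mapping.

def map_gesture_to_sound(pressure: float) -> dict:
    """Map a normalized gesture value (0.0-1.0) onto three sound features."""
    p = max(0.0, min(1.0, pressure))  # clamp to the valid control range
    return {
        "amplitude": p ** 2,           # power curve: quiet gestures stay quiet
        "brightness": 0.2 + 0.8 * p,   # spectral brightness rises with effort
        "noisiness": 0.5 * p,          # breath/bow-like noise grows with effort
    }

features = map_gesture_to_sound(0.5)
```

A mapping of this kind lets one expressive gesture shape several perceptual dimensions simultaneously, which is precisely what a fixed keyboard metaphor makes difficult.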

Type
Research Article
Copyright
© 2002 Cambridge University Press
