Viano: Electromyographic Data as a Gestural Tool for Music Composition within Game-Engines

Citation formats

Standard

Viano: Electromyographic Data as a Gestural Tool for Music Composition within Game-Engines. / Rhodes, Christopher.

2019. Abstract from Innovation in Music, London, United Kingdom.

Research output: Contribution to conference › Abstract › peer-review

Harvard

Rhodes, C 2019, 'Viano: Electromyographic Data as a Gestural Tool for Music Composition within Game-Engines', Innovation in Music, London, United Kingdom, 5/12/19 - 7/12/19.

APA

Rhodes, C. (2019). Viano: Electromyographic Data as a Gestural Tool for Music Composition within Game-Engines. Abstract from Innovation in Music, London, United Kingdom.

Vancouver

Rhodes C. Viano: Electromyographic Data as a Gestural Tool for Music Composition within Game-Engines. 2019. Abstract from Innovation in Music, London, United Kingdom.

BibTeX

@conference{7e46d945a4644b618e220273696d25fe,
title = "Viano: Electromyographic Data as a Gestural Tool for Music Composition within Game-Engines",
abstract = "In recent years, wearable sensors have allowed us to utilise previously inaccessible forms of gestural data within interactive music composition and live music performance; in particular, data which measures muscle tension: electromyographic (EMG) data. EMG data is interesting to use because it allows for better gestural control when generating a desired sonic output (via digital signal processing, DSP) than other datasets, such as electroencephalography (EEG). As a result of this improved gestural control, game-engines can be used as a medium to investigate novel musical interactions with digital environments and virtual instruments. Because game-engines have arbitrary physical laws (quite different from those of the real world), they are ideal for exploring the use of EMG data when interacting with digital objects, as well as the resulting sonic consequences. In turn, mechanical instrument design is also less restricted. Many creative possibilities therefore arise for interactive music composition, for the ways in which we interact with instruments, and for instrument design. Viano is a live, interactive game for pianists which aims to study the use of EMG data in music composition within game-engines. Through playing an acoustic piano, the performer affects the timbre of their instrument by interacting with a virtual piano. They also generate sonic material when they play different components of the virtual piano (e.g. performing a 'plucking' gesture to play a virtual string). They are encouraged to play, create and respond to the sounds produced via this interaction with the virtual instrument, which is made possible by progressing through various 'stages' of the game using a wearable interface worn on the right arm: the Myo armband. The Viano project therefore aims to observe: if piano components were digital, what sounds would they make? How would playing with digital piano hammers affect the DSP and timbre of an acoustic piano? What extended techniques are possible through digitally augmenting the acoustic piano? What kinds of interactions will be made possible with virtual instruments via the use of EMG data and gestural interfaces?",
author = "Christopher Rhodes",
year = "2019",
month = dec,
day = "5",
language = "English",
note = "Innovation in Music, InMusic ; Conference date: 05-12-2019 Through 07-12-2019",
url = "http://inmusicconference.com",

}

RIS

TY - CONF

T1 - Viano: Electromyographic Data as a Gestural Tool for Music Composition within Game-Engines

AU - Rhodes, Christopher

PY - 2019/12/5

Y1 - 2019/12/5

N2 - In recent years, wearable sensors have allowed us to utilise previously inaccessible forms of gestural data within interactive music composition and live music performance; in particular, data which measures muscle tension: electromyographic (EMG) data. EMG data is interesting to use because it allows for better gestural control when generating a desired sonic output (via digital signal processing, DSP) than other datasets, such as electroencephalography (EEG). As a result of this improved gestural control, game-engines can be used as a medium to investigate novel musical interactions with digital environments and virtual instruments. Because game-engines have arbitrary physical laws (quite different from those of the real world), they are ideal for exploring the use of EMG data when interacting with digital objects, as well as the resulting sonic consequences. In turn, mechanical instrument design is also less restricted. Many creative possibilities therefore arise for interactive music composition, for the ways in which we interact with instruments, and for instrument design. Viano is a live, interactive game for pianists which aims to study the use of EMG data in music composition within game-engines. Through playing an acoustic piano, the performer affects the timbre of their instrument by interacting with a virtual piano. They also generate sonic material when they play different components of the virtual piano (e.g. performing a 'plucking' gesture to play a virtual string). They are encouraged to play, create and respond to the sounds produced via this interaction with the virtual instrument, which is made possible by progressing through various 'stages' of the game using a wearable interface worn on the right arm: the Myo armband. The Viano project therefore aims to observe: if piano components were digital, what sounds would they make? How would playing with digital piano hammers affect the DSP and timbre of an acoustic piano? What extended techniques are possible through digitally augmenting the acoustic piano? What kinds of interactions will be made possible with virtual instruments via the use of EMG data and gestural interfaces?

AB - In recent years, wearable sensors have allowed us to utilise previously inaccessible forms of gestural data within interactive music composition and live music performance; in particular, data which measures muscle tension: electromyographic (EMG) data. EMG data is interesting to use because it allows for better gestural control when generating a desired sonic output (via digital signal processing, DSP) than other datasets, such as electroencephalography (EEG). As a result of this improved gestural control, game-engines can be used as a medium to investigate novel musical interactions with digital environments and virtual instruments. Because game-engines have arbitrary physical laws (quite different from those of the real world), they are ideal for exploring the use of EMG data when interacting with digital objects, as well as the resulting sonic consequences. In turn, mechanical instrument design is also less restricted. Many creative possibilities therefore arise for interactive music composition, for the ways in which we interact with instruments, and for instrument design. Viano is a live, interactive game for pianists which aims to study the use of EMG data in music composition within game-engines. Through playing an acoustic piano, the performer affects the timbre of their instrument by interacting with a virtual piano. They also generate sonic material when they play different components of the virtual piano (e.g. performing a 'plucking' gesture to play a virtual string). They are encouraged to play, create and respond to the sounds produced via this interaction with the virtual instrument, which is made possible by progressing through various 'stages' of the game using a wearable interface worn on the right arm: the Myo armband. The Viano project therefore aims to observe: if piano components were digital, what sounds would they make? How would playing with digital piano hammers affect the DSP and timbre of an acoustic piano? What extended techniques are possible through digitally augmenting the acoustic piano? What kinds of interactions will be made possible with virtual instruments via the use of EMG data and gestural interfaces?

UR - https://www.inmusicconference.com/abstracts

M3 - Abstract

T2 - Innovation in Music

Y2 - 5 December 2019 through 7 December 2019

ER -