In recent years, wearable sensors have given us access to previously unavailable forms of gestural data for interactive music composition and live performance; in particular, electromyographic (EMG) data, which measures muscle tension. EMG data is attractive because it affords finer gestural control over a desired sonic output (via digital signal processing, DSP) than other biosignals such as electroencephalography (EEG). This improved gestural control makes game engines a promising medium for investigating novel musical interactions with digital environments and virtual instruments. Because game engines can implement arbitrary physical laws, quite unlike those of the real world, they are ideal for exploring how EMG data can drive interaction with digital objects, and the sonic consequences that follow. Mechanical constraints on instrument design are likewise relaxed. Many creative possibilities therefore arise for interactive composition, for the way we interact with instruments, and for instrument design itself.
Viano is a live, interactive game for pianists that studies the use of EMG data for music composition within a game engine. By playing an acoustic piano, the performer affects the timbre of their instrument through interaction with a virtual piano. They also generate new sonic material when they play different components of the virtual piano (for example, performing a 'plucking' gesture to sound a virtual string). The performer is encouraged to play, create and respond to the sounds produced by this interaction with the virtual instrument. The interaction is enabled by progressing through various 'stages' of the game using a wearable interface worn on the right arm: the Myo armband. The Viano project therefore asks: if piano components were digital, what sounds would they make? How would playing with digital piano hammers affect the DSP and timbre of an acoustic piano? What extended techniques become possible by digitally augmenting the acoustic piano? And what kinds of interactions with virtual instruments do EMG data and gestural interfaces make possible?
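To make the EMG-to-DSP pathway concrete, the following is a minimal sketch of one plausible mapping, not the project's actual implementation: raw EMG samples are smoothed into a tension envelope (here a sliding RMS), and the estimated tension is then mapped exponentially onto a filter cutoff frequency that could shape the piano's timbre. The window size, frequency range and the simulated 'pluck' burst are all illustrative assumptions.

```python
import math

def rms_envelope(samples, window=8):
    """Sliding RMS over raw EMG samples: a simple muscle-tension estimate."""
    env = []
    for i in range(len(samples)):
        win = samples[max(0, i - window + 1): i + 1]
        env.append(math.sqrt(sum(x * x for x in win) / len(win)))
    return env

def tension_to_cutoff(tension, lo=200.0, hi=4000.0):
    """Map normalised tension [0, 1] to a filter cutoff in Hz.
    Exponential mapping so equal tension steps feel perceptually even.
    The 200-4000 Hz range is an illustrative assumption."""
    t = min(max(tension, 0.0), 1.0)
    return lo * (hi / lo) ** t

# Simulated EMG stream: rest, a 'plucking' gesture burst, then rest again.
emg = [0.02] * 16 + [0.6, -0.7, 0.8, -0.5] * 4 + [0.02] * 16
env = rms_envelope(emg)
peak = max(env)
print(f"peak tension {peak:.2f} -> cutoff {tension_to_cutoff(peak):.0f} Hz")
```

In a real system the EMG stream would arrive from the Myo armband's eight sensor channels rather than a hand-written list, and the cutoff would be sent each frame to the game engine's audio filter; the envelope-then-map structure would stay the same.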