posted on 2020-09-23, 10:42, authored by Balandino Di Donato
This paper presents work on unconstrained and constrained embodied interaction with live audiovisual processing parameters during singing. Building upon the concepts of affordance and embodiment, and adopting a User-Centred Design approach, two case studies were realised: the first in a context where a performer is free to move and interact with the MyoSpat interactive system for live sound processing (unconstrained interaction), and the second in a performative situation where the musician is limited by a played instrument or interface (constrained interaction). The interaction design solution proposed in the two case studies was welcomed by the performers; its potentials and limitations invited the exploration of new gesture-sound relationships.
History
Citation
Proceedings Sound, Image and Interaction Design Symposium (SIIDS) 2020
Author affiliation
School of Informatics
Source
Sound, Image and Interaction Design Symposium 2020
Version
VoR (Version of Record)
Published in
Proceedings Sound, Image and Interaction Design Symposium (SIIDS) 2020