Unconstrained and constrained embodied interaction with audiovisual feedback during vocal performance
Conference contribution posted on 2020-09-23, 10:42, authored by Balandino Di Donato
This paper presents work on unconstrained and constrained embodied interaction with live audiovisual processing parameters during singing. Building on the concepts of affordance and embodiment, and adopting a User-Centred Design approach, two case studies were realised. The first case study took place in a context where the performer was free to move and interact with the MyoSpat interactive system for live sound processing parameters (unconstrained interaction); the second took place in a performative situation where the musician's movement was limited by a played instrument or interface (constrained interaction). The interaction design solution proposed in the two case studies was welcomed by the performers; its potential and limitations invited the exploration of new gesture-sound relationships.
Citation: Proceedings of the Sound, Image and Interaction Design Symposium (SIIDS) 2020
Author affiliation: School of Informatics
Source: Sound, Image and Interaction Design Symposium 2020
Version: VoR (Version of Record)