Unconstrained and constrained embodied interaction with audiovisual feedback during vocal performance
conference contribution
Posted on 2020-09-23, 10:42, authored by Balandino Di Donato

This paper presents work on unconstrained and constrained embodied interaction with live audiovisual processing parameters during singing. Building upon the concepts of affordance and embodiment, and adopting a User-Centred Design approach, two case studies were realised: the first in a context where the performer is free to move and interact with the MyoSpat interactive system for live sound processing (unconstrained interaction), and the second in a performative situation where the musician's movement is limited by a played instrument or interface (constrained interaction). The interaction design solution proposed in the two case studies was welcomed by the performers; its potential and limitations invited the exploration of new gesture-sound relationships.
History
Citation
Proceedings Sound, Image and Interaction Design Symposium (SIIDS) 2020
Author affiliation
School of Informatics
Source
Sound, Image and Interaction Design Symposium 2020
Version
VoR (Version of Record)
Published in
Proceedings Sound, Image and Interaction Design Symposium (SIIDS) 2020
Publisher
SIIDS
Acceptance date
2020-07-31
Copyright date
2020
Available date
2020-09-04
Spatial coverage
Funchal, Madeira (Portugal)
Temporal coverage: start date
2020-09-04
Temporal coverage: end date
2020-09-04
Language
en