University of Leicester

Unconstrained and constrained embodied interaction with audiovisual feedback during vocal performance

Conference contribution
Posted on 2020-09-23, 10:42, authored by Balandino Di Donato
This paper presents work on unconstrained and constrained embodied interaction with live audiovisual processing parameters during singing. Building upon the concepts of affordance and embodiment, and adopting a User-Centred Design approach, two case studies were realised: the first in a context where the performer is free to move while interacting with the MyoSpat interactive system's live sound processing parameters (unconstrained interaction); the second in a performative situation where the musician is limited by a played instrument or interface (constrained interaction). The interaction design solution proposed in the two case studies was welcomed by the performers; its potential and limitations invited the exploration of new gesture-sound relationships.

History

Citation

Proceedings Sound, Image and Interaction Design Symposium (SIIDS) 2020

Author affiliation

School of Informatics

Source

Sound, Image and Interaction Design Symposium 2020

Version

  • VoR (Version of Record)

Published in

Proceedings Sound, Image and Interaction Design Symposium (SIIDS) 2020

Publisher

SIIDS

Acceptance date

2020-07-31

Copyright date

2020

Available date

2020-09-04

Spatial coverage

Funchal, Madeira (Portugal)

Temporal coverage: start date

2020-09-04

Temporal coverage: end date

2020-09-04

Language

en
