University of Leicester
WIMP2017_DiDonatoDooley.pdf (302.82 kB)

MyoSpat: A system for manipulating sound and light through hand gestures

conference contribution
posted on 2020-05-26, 14:21 authored by Balandino Di Donato, James Dooley

MyoSpat is an interactive audio-visual system that aims to augment musical performances by empowering musicians to directly manipulate sound and light through hand gestures. We present the second iteration of the system, which draws on research findings that emerged from an evaluation of the first system [1]. MyoSpat 2 is designed and developed using the Myo gesture control armband as the input device and Pure Data as the gesture-recognition and audio-visual engine. The system is informed by human-computer interaction (HCI) principles: tangible computing and embodied, sonic and music interaction design (MiXD). This paper describes the system and its audio-visual feedback design. We present an evaluation of the system and discuss its potential use in different multimedia contexts and in exploring embodied, sonic and music interaction design principles.
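To illustrate the kind of gesture-to-audio mapping the abstract describes, here is a minimal sketch in Python. This is an illustrative assumption, not MyoSpat's actual implementation (which is built in Pure Data); the parameter names, ranges, and thresholds below are hypothetical.

```python
import math

def map_gesture_to_params(roll, pitch):
    """Hypothetical mapping from armband orientation (wrist roll and
    pitch, in radians, roughly -pi..pi) to two audio parameters,
    each normalised to 0..1: stereo pan and reverb mix.

    These ranges and mappings are assumptions for illustration only.
    """
    # Roll sweeps the sound across the stereo field.
    pan = min(max((roll + math.pi) / (2 * math.pi), 0.0), 1.0)
    # Raising the wrist (pitch up) increases the reverb mix.
    reverb = min(max((pitch + math.pi / 2) / math.pi, 0.0), 1.0)
    return {"pan": pan, "reverb": reverb}

# A level wrist maps to centre pan and a moderate reverb mix.
print(map_gesture_to_params(0.0, 0.0))  # → {'pan': 0.5, 'reverb': 0.5}
```

In a real system such as the one described, this mapping layer would sit between the gesture-recognition stage and the audio engine, translating recognised hand poses into continuous control signals.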

History

Citation

Proceedings of the 3rd Workshop on Intelligent Music Production, Salford, UK, 15 September 2017

Source

3rd Workshop on Intelligent Music Production, Salford, UK, 15 September 2017

Version

  • VoR (Version of Record)

Published in

Proceedings of Workshop on Intelligent Music Production

Copyright date

2017

Notes

Available from: https://www.researchgate.net/publication/319711132_MyoSpat_A_system_for_manipulating_sound_and_light_through_hand_gestures. Accessed: 20 September 2017

Spatial coverage

Salford, UK

Temporal coverage: start date

2017-09-15

Temporal coverage: end date

2017-09-15

Language

en
