Conference Proceedings

EMMA: Enhancing Real-Time Musical Expression through Electromyographic Control

João Coimbra (INET-md), Luís Aly (ID+), Henrique Portovedo (INET-md), Sara Carvalho (INET-md), and Tiago Almeida Bolaños (IST) are the authors of the open-access article "EMMA: Enhancing Real-Time Musical Expression through Electromyographic Control", recently published in the Proceedings of the International Conference on New Interfaces for Musical Expression.

Abstract

This paper presents the Electromyographic MusicAvatar (EMMA), a digital musical instrument (DMI) designed to enhance real-time sound-based composition through gestural control. Developed as part of a doctoral research project, EMMA combines electromyography (EMG) and motion sensors to capture nuanced finger, hand, and arm movements, treating each finger as an independent instrument. This approach bridges embodied performance with computational sound generation, enabling expressive and intuitive interaction. The system features a glove-based design with EMG sensors for each finger and motion detection for the wrist and arm, allowing seamless control of musical parameters. By addressing key challenges in DMI design, such as action-sound immediacy and performer-instrument dynamics, EMMA contributes to developing expressive and adaptable tools for contemporary music-making.
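To illustrate the kind of per-finger control the abstract describes, the sketch below maps a window of raw EMG samples to a bounded control value, one independent stream per finger. This is not the paper's implementation: the function names, window size, noise floor, and the 0-127 range (a common MIDI-style convention) are all illustrative assumptions.

```python
import math

def emg_envelope(samples, window=64):
    """Root-mean-square envelope of the most recent EMG samples (illustrative)."""
    tail = samples[-window:]
    return math.sqrt(sum(s * s for s in tail) / len(tail))

def to_control_value(envelope, noise_floor=0.02, max_level=1.0):
    """Map an EMG envelope to a 0-127 control value, gating out baseline noise.

    The noise floor and normalization range are placeholder values; a real
    system would calibrate them per performer and per sensor.
    """
    level = max(0.0, envelope - noise_floor) / (max_level - noise_floor)
    return min(127, round(level * 127))

# One independent control stream per finger, echoing EMMA's idea of
# treating each finger as a separate instrument (values here are made up).
finger_buffers = {"thumb": [0.0] * 64, "index": [0.4] * 64}
controls = {name: to_control_value(emg_envelope(buf))
            for name, buf in finger_buffers.items()}
```

In a real-time setting, each such control value would then be routed to a musical parameter (pitch, amplitude, filter cutoff), while the wrist and arm motion sensors supply additional, slower-moving dimensions of control.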

Keywords

Gestural interface; Physiological signals; Digital Musical Instrument; Human-Computer Interaction; Composition; Performance.