EMMA: Enhancing Real-Time Musical Expression through Electromyographic Control

João Coimbra; Luís Aly; Henrique Portovedo; Sara Carvalho; Tiago Bolaños

  • Format: oral
  • Session: papers-3
  • Presence: remote
  • Duration: 5 minutes
  • Type: short

Abstract:

This paper presents the Electromyographic Music Avatar (EMMA), a digital musical instrument (DMI) designed to enhance real-time sound-based composition through gestural control. Developed as part of a doctoral research project, EMMA combines electromyography (EMG) and motion sensors to capture nuanced finger, hand, and arm movements, treating each finger as an independent instrument. This approach bridges embodied performance and computational sound generation, enabling expressive and intuitive interaction. The system features a glove-based design with an EMG sensor for each finger and motion detection for the wrist and arm, allowing fluid control of musical parameters. By addressing key challenges in DMI design, such as action-sound immediacy and performer-instrument dynamics, EMMA contributes to the development of expressive and adaptable tools for contemporary music-making.
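The abstract describes each finger's EMG signal driving its own synthesis voice. The paper does not include code, but a common first step in EMG-based control is to rectify the raw muscle signal, smooth it into an envelope, and map that envelope onto a normalized musical parameter. The sketch below illustrates that general technique in Python; the function names, filter coefficient, and mapping range are illustrative assumptions, not EMMA's actual implementation.

```python
import math

def emg_envelope(samples, alpha=0.05):
    """Rectify a raw EMG sample stream and smooth it with a one-pole
    low-pass filter (exponential moving average) -- a standard way to
    derive a slowly varying control signal from muscle activity.
    Hypothetical sketch, not EMMA's published code."""
    env = 0.0
    out = []
    for s in samples:
        rectified = abs(s)               # full-wave rectification
        env += alpha * (rectified - env) # one-pole smoothing
        out.append(env)
    return out

def to_amplitude(env_value, floor=0.02, ceil=0.8):
    """Map a smoothed envelope value onto a normalized amplitude in
    [0, 1]; values below `floor` are treated as rest, above `ceil`
    as maximum effort. The thresholds here are placeholders that a
    real system would calibrate per performer."""
    x = (env_value - floor) / (ceil - floor)
    return min(max(x, 0.0), 1.0)

# One "finger" acting as an independent instrument: a synthetic EMG
# trace whose contraction strength jumps halfway through, with the
# resulting envelope driving that finger's voice amplitude.
signal = [math.sin(0.3 * n) * (0.1 + 0.5 * (n > 50)) for n in range(100)]
amps = [to_amplitude(e) for e in emg_envelope(signal)]
```

In a per-finger design like the one the abstract outlines, one such envelope-to-parameter chain would run for each of the five EMG channels, while wrist and arm motion data modulate higher-level parameters.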