Sonicolour: Exploring Colour Control of Sound Synthesis with Interactive Machine Learning

Tug F. O'Flaherty; Luigi Marino; Charalampos Saitis; Anna Xambó Sedó

Abstract:

This paper explores crossmodal mappings from colour to sound. The instrument presented analyses the colour of physical objects via a colour light-to-frequency sensor and maps the resulting red, green, and blue values to the parameters of a synthesiser. Interactive machine learning is used to facilitate the discovery of unexpected relationships between the visual features of the objects and the sound synthesis. The instrument is evaluated by its ability to provide the user with a playful interaction between the visual and tactile exploration of coloured objects and the generation of synthetic sounds. We conclude by outlining the potential of this approach for musical interaction design and music performance.
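The mapping described above can be sketched in code. The snippet below is a minimal, hypothetical illustration (not the authors' implementation): in the spirit of interactive machine learning, a performer records a few example pairs of RGB sensor readings and desired synthesiser parameters, and a least-squares regression then interpolates between them for new colours. The specific example colours, parameter values, and the two-parameter synthesiser are assumptions for the sake of the sketch.

```python
import numpy as np

# Hypothetical training examples a performer might record:
# RGB readings (0-255) paired with two synth parameters in [0, 1]
# (e.g. filter cutoff and pitch). All values are illustrative.
rgb_examples = np.array([
    [255, 0, 0],    # red object
    [0, 255, 0],    # green object
    [0, 0, 255],    # blue object
], dtype=float) / 255.0

param_examples = np.array([
    [0.9, 0.2],     # e.g. bright timbre, low pitch
    [0.5, 0.5],
    [0.1, 0.8],     # e.g. dark timbre, high pitch
])

# Fit an affine map (RGB plus a bias term) with least squares,
# a simple stand-in for an interactive-ML regression model.
X = np.hstack([rgb_examples, np.ones((len(rgb_examples), 1))])
W, *_ = np.linalg.lstsq(X, param_examples, rcond=None)

def rgb_to_params(r, g, b):
    """Predict synth parameters for a new colour sensor reading."""
    x = np.array([r / 255.0, g / 255.0, b / 255.0, 1.0])
    return np.clip(x @ W, 0.0, 1.0)

print(rgb_to_params(255, 0, 0))   # reproduces the "red" example
print(rgb_to_params(128, 128, 0)) # interpolates for a new colour
```

In an interactive-ML workflow the performer would iteratively add or remove such examples and retrain until the colour-to-sound behaviour feels right, which is what makes the discovery of mappings exploratory rather than hand-programmed.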