Expression Synthesis Project

The Expression Synthesis Project (ESP) presents a driving interface for expression synthesis, making high-level expressive musical decisions accessible to nonexperts. The user drives a car along a virtual road whose twists and turns represent the music, deciding how to traverse each stretch of the road. The driver’s decisions affect the rendering of the piece in real time.

The pedals and wheel provide a tactile interface for controlling the dynamics and musical expression, while the display portrays a first-person view of the road and dashboard from the driver’s seat. This game-like interface allows nonexperts to create expressive renderings of existing music without having to master an instrument, and allows expert musicians to experiment with expressive choice without having to first master the notes of the piece.
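As a rough illustration of the kind of mapping such an interface implies, the sketch below translates accelerator-pedal depression into playback tempo and MIDI velocity. The specific ranges, function names, and the choice of a linear mapping are assumptions for illustration only; the actual mappings used in ESP are described in the NIME-05 paper, not in this post.

```python
# Illustrative sketch only: maps a hypothetical accelerator value (0.0-1.0)
# to tempo and dynamics. ESP's real control mappings are defined in the
# NIME-05 paper and may differ substantially.

def pedal_to_tempo(pedal: float, base_bpm: float = 100.0, max_bpm: float = 160.0) -> float:
    """Map accelerator depression (0.0-1.0) to a playback tempo in beats per minute."""
    pedal = max(0.0, min(1.0, pedal))  # clamp to the assumed pedal range
    return base_bpm + pedal * (max_bpm - base_bpm)

def pedal_to_velocity(pedal: float) -> int:
    """Map accelerator depression (0.0-1.0) to a MIDI velocity (1-127) for dynamics."""
    pedal = max(0.0, min(1.0, pedal))
    return max(1, round(pedal * 127))

if __name__ == "__main__":
    # Print the assumed mapping at a few pedal positions.
    for p in (0.0, 0.5, 1.0):
        print(f"pedal={p:.1f} -> tempo={pedal_to_tempo(p):.0f} BPM, velocity={pedal_to_velocity(p)}")
```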

This according to “ESP: A driving interface for expression synthesis” by Elaine Chew, Alexandre François, Jie Liu, and Aaron Yang, an essay included in the conference report NIME-05: New interfaces for musical expression (Vancouver: University of British Columbia Media and Graphics Interdisciplinary Centre, 2005, pp. 224–227). Film and MIDI demonstrations of ESP are available online.

Related article: Singing and safety
