An Intuitive Synthesizer of Sustained Interaction Sounds
Authors: Conan S., Thoret E., Gondre C., Aramaki M., Kronland-Martinet R., Ystad S.
Publication Date: October 2013
Conference: 10th International Symposium on Computer Music Multidisciplinary Research, Marseille, France, October 15-18, 2013, pp. 1045-1050
This research is based on the action/object paradigm, which proposes that sounds result from an action on an object, and focuses on the synthesis of sustained interaction sounds: rubbing, scratching, rolling, and nonlinear friction sounds. Because the underlying signal models are highly controllable, the proposed synthesizer allows object and interaction properties to be defined through an intuitive graphical interface. The synthesized sounds are controlled in real time by the user's gestures via external controllers and physically informed mappings.