Eluding the Physical Constraints in a Nonlinear Interaction Sound Synthesis Model for Gesture Guidance
Authors: Thoret E., Aramaki M., Gondre C., Ystad S., Kronland-Martinet R.
Publication Date: June 2016
Journal: Applied Sciences (vol. 6(7), 192, 2016)
In this paper, a flexible control strategy is proposed for a synthesis model dedicated to nonlinear friction phenomena. This model makes it possible to synthesize different types of sound sources, such as creaky doors, singing glasses, squeaking wet plates, or bowed strings. Based on the perceptual stance that a sound is perceived as the result of an action on an object, we propose a genuine source/filter synthesis approach that makes it possible to elude the physical constraints induced by the coupling between the interacting objects. This approach allows the action and the object to be controlled independently and combined freely. Different implementations and applications related to computer animation, gesture learning for rehabilitation, and expert gestures are presented at the end of this paper.
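The action/object decoupling described above can be illustrated with a minimal source/filter sketch, not taken from the paper: a hypothetical "action" signal (an irregular impulse train loosely evoking stick-slip friction events) is passed through a hypothetical "object" filter (a single second-order resonator standing in for a modal description of the vibrating object). All function names, parameters, and signal choices here are illustrative assumptions, not the authors' model.

```python
import numpy as np
from scipy.signal import lfilter

SR = 44100  # sample rate in Hz (assumed)

def action_source(duration, rate=120.0, sr=SR, seed=0):
    """Hypothetical 'action' signal: an irregular impulse train whose
    event rate could be driven by gesture data (illustrative only)."""
    rng = np.random.default_rng(seed)
    n = int(duration * sr)
    sig = np.zeros(n)
    t = 0.0
    while t < duration:
        sig[int(t * sr)] = rng.uniform(0.5, 1.0)  # one friction-like event
        t += rng.exponential(1.0 / rate)          # irregular inter-event time
    return sig

def object_filter(sig, freq, q=50.0, sr=SR):
    """Hypothetical 'object': one second-order resonator (a stand-in for
    a full modal/filter description of the sounding object)."""
    w = 2.0 * np.pi * freq / sr
    r = np.exp(-w / (2.0 * q))                    # pole radius from Q
    b = [1.0 - r]                                 # rough gain normalization
    a = [1.0, -2.0 * r * np.cos(w), r * r]        # resonant pole pair
    return lfilter(b, a, sig)

# Because source and filter are decoupled, any action can be freely
# combined with any object:
excitation = action_source(0.5)
sound = object_filter(excitation, freq=440.0)
```

Swapping `action_source` (e.g. for a rubbing- or bowing-like excitation) or `object_filter` (e.g. for a different resonance set) changes the perceived action or object independently, which is the point of the decoupling.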