Perception-Based Interactive Sound Synthesis of Morphing Solids’ Interactions


Authors: Pruvost L., Scherrer B., Aramaki M., Ystad S., Kronland-Martinet R.
Publication Date: December 2015
Journal: ACM Transactions on Graphics (TOG) - Proceedings of ACM SIGGRAPH Asia 2015, Volume 34, Issue 6, November 2015



Abstract

This brief introduces a novel framework for the interactive, real-time synthesis of solids' interaction sounds driven by a game engine. The sound synthesizer used in this work relies on an action-object paradigm, itself based on the notion of perceptual invariants. An intuitive control strategy, based on those invariants and inspired by physics, was developed. The action and the object can be controlled independently, simultaneously, and continuously. This allows the synthesis of sounds for solids' interactions whose nature evolves continuously over time (e.g., from rolling to slipping) and/or whose objects' properties (shape, size, and material) vary continuously in time.


Introduction

This brief introduces a novel framework for the interactive, real-time synthesis of solids' interaction sounds driven by a game engine. This work is situated within the growing body of research on procedural audio generation, which aims at replacing prerecorded audio samples. The sound synthesizer at the core of this work is based on an action-object paradigm whereby sounds are described as the result of an action on an object [Gaver 1993]. This paradigm assumes the existence of auditory invariants, i.e. perceptually relevant signal morphologies that carry information about the action and/or the object involved in sound production. In this context, an intuitive control of sound synthesis was developed [Aramaki et al. 2011; Conan et al. 2014]. This control allows for continuous navigation in two separate spaces defining the action's and the object's properties. It is inspired by physics but does not aim at physical accuracy; the actual goal is perceptual relevance.
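To make the two-space control concrete, here is a minimal sketch in Python. The structures and method names (ActionControl, ObjectControl, set_action, set_object) are illustrative assumptions, not the authors' actual API:

    from dataclasses import dataclass

    @dataclass
    class ActionControl:
        # Hypothetical coordinates in the action space: weights of the
        # perceptual invariants evoking each continuous interaction.
        rubbing: float = 0.0     # in [0, 1]
        scratching: float = 0.0  # in [0, 1]
        rolling: float = 1.0     # in [0, 1]
        velocity: float = 0.0    # interaction speed driving the excitation

    @dataclass
    class ObjectControl:
        # Hypothetical coordinates in the object space.
        material: float = 0.0    # e.g. 0 = wood, 0.5 = glass, 1 = metal
        size: float = 0.5        # normalized size, scales resonant frequencies
        shape: float = 0.5       # shape descriptor, shifts the modal distribution

    def update_controls(synth, action: ActionControl, obj: ObjectControl) -> None:
        # Both spaces can be updated independently, simultaneously, and
        # continuously, e.g. once per physics frame of the game engine.
        synth.set_action(action)
        synth.set_object(obj)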

It is both the interactive nature of the framework – the sound synthesizer runs in real time – and its ability to smoothly morph between different actions and objects that differentiate this work from previous work on the control of modal synthesis by game-engine collision events [Van den Doel et al. 2001; Ren et al. 2010] and on the very accurate modeling of non-linear sound production by virtual objects [Chadwick et al. 2009]. Compared to [Verron et al. 2013], the new framework includes recent developments in the modeling of linear interactions [Conan et al. 2014] and non-linear coupling [Thoret et al. 2013]. In addition, the synthesizer's control has been overhauled; in particular, a new control space defining the object's properties has been introduced.
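As an illustration of how such a smooth morph might be driven, the sketch below linearly interpolates between two control presets and derives the rolling/sliding balance from the slip at the contact point. This mapping is a plausible assumption for illustration, not the paper's actual model:

    def morph(preset_a: dict, preset_b: dict, t: float) -> dict:
        # Linear interpolation between two synthesis-control presets, t in [0, 1].
        return {k: (1.0 - t) * preset_a[k] + t * preset_b[k] for k in preset_a}

    def slip_ratio(contact_speed: float, surface_speed: float) -> float:
        # 0 when the contact point is at rest relative to the surface
        # (pure rolling), 1 when it slides at full speed (pure sliding).
        slip = abs(contact_speed - surface_speed)
        return slip / max(abs(contact_speed), abs(surface_speed), 1e-9)

    rolling_preset = {"rolling": 1.0, "sliding": 0.0}
    sliding_preset = {"rolling": 0.0, "sliding": 1.0}
    # Mostly sliding contact: the weights crossfade accordingly.
    weights = morph(rolling_preset, sliding_preset, slip_ratio(2.0, 0.5))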


Demonstration

The following video demonstrates interactive sound synthesis driven by a game engine. It features five sections that illustrate our framework's capabilities regarding interactive action morphing (rolling/sliding in section 1, squeaking/singing in section 2) and object morphing (material morphing in section 3, size morphing in section 4). Section 5 features a more elaborate environment (a labyrinth game) that offers a global overview of the framework's capabilities.
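For readers wondering how a game engine can drive the synthesizer frame by frame, the following sketch sends OSC messages over UDP using the python-osc package. The transport, port, address names, and contact attributes are assumptions for illustration, not the framework's documented interface:

    from pythonosc.udp_client import SimpleUDPClient

    client = SimpleUDPClient("127.0.0.1", 9000)  # synthesizer assumed to listen here

    def on_physics_step(contact):
        # Called once per physics frame with the current contact state.
        # Map contact kinematics onto the action space (hypothetical mapping).
        client.send_message("/action/velocity", float(contact.relative_speed))
        client.send_message("/action/force", float(contact.normal_force))
        # Object-space controls may also vary continuously, enabling the
        # material and size morphings shown in sections 3 and 4.
        client.send_message("/object/material", float(contact.material_coord))
        client.send_message("/object/size", float(contact.size_coord))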


Conclusion

We present a novel framework for the interactive and real-time synthesis of solids' interaction sounds driven by a game engine. The models and the sound synthesis engines associated with the different interactions were developed in previous studies. The control of the synthesizer was adapted to enable its connection to a game engine.


Contact

For any questions regarding this work, please email: pruvost@lma.cnrs-mrs.fr


References

ARAMAKI, M., BESSON, M., KRONLAND-MARTINET, R., AND YSTAD, S. 2011. Controlling the Perceived Material in an Impact Sound Synthesizer. IEEE Transactions on Audio, Speech and Language Processing 19, 2 (February), 301–314.

CHADWICK, J. N., AN, S. S., AND JAMES, D. L. 2009. Harmonic shells: A practical nonlinear sound model for near-rigid thin shells. In ACM Transactions on Graphics (SIGGRAPH Asia Conference Proceedings).

CONAN, S., THORET, E., ARAMAKI, M., GONDRE, C., YSTAD, S., KRONLAND-MARTINET, R., AND DERRIEN, O. 2014. An intuitive synthesizer of continuous interaction sounds: Rubbing, scratching and rolling. Computer Music Journal 38, 4, 24–37.

GAVER, W. W. 1993. How do we hear in the world? Explorations in ecological acoustics. Ecological psychology 5, 4, 285–313.

THORET, E., ARAMAKI, M., GONDRE, C., KRONLAND-MARTINET, R., AND YSTAD, S. 2013. Controlling a non linear friction model for evocative sound synthesis applications. In Proceedings of the International Conference on Digital Audio Effects (DAFx), Maynooth, Ireland.

VAN DEN DOEL, K., KRY, P. G., AND PAI, D. K. 2001. FoleyAutomatic: physically-based sound effects for interactive simulation and animation. In Proceedings of the 28th Annual Conference on Computer Graphics and Interactive Techniques, ACM, 537–544.

VERRON, C., ARAMAKI, M., GONOT, A., SCOTTI, T., RAKOVEC, C.-E., MINGASSON, A., AND KRONLAND-MARTINET, R. 2013. Event-driven interactive solid sound synthesis. In Proceedings of the 10th International Symposium on Computer Music Multidisciplinary Research.

REN, Z., YEH, H., AND LIN, M. C. 2010. Synthesizing contact sounds between textured models. In Proceedings of the 2010 IEEE Virtual Reality Conference, IEEE Computer Society, 139–146.