
Bridging the Game Engine Integration Gap

The ability to bridge functionality between game engine and audio authoring applications is one of the most underestimated workflow improvements available to a development team. When tools are linked by a shared set of information, that connection can drive iteration as part of a sound designer's daily tasks. This is where SoundFrame comes in: it provides a suite of solutions for streamlining communication between Wwise and other authoring applications in an intelligent, modular way.

Designer Note

SoundFrame gives you access to most of the Sound Engine API. This allows you to trigger event playback as well as modify states, switches, RTPCs, triggers, and environments from within your application. With this API you can simulate real game scenarios directly in Wwise without a working game engine, and without even having to generate SoundBanks.
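
As an illustration, the following C++ fragment is a minimal sketch of the kinds of calls a connected SoundFrame client can make. It assumes in_pSoundFrame is a valid AK::SoundFrame::ISoundFrame* obtained through AK::SoundFrame::Create(), that the names refer to objects in the project currently open in Wwise, and that the name-based overloads mirror the Sound Engine API; check AK/SoundFrame/SF.h in your Wwise SDK for the exact signatures in your version.

```cpp
#include <AK/SoundFrame/SF.h>

// Simulate a simple game scenario against the project currently open in Wwise.
// in_pSoundFrame: a connected SoundFrame instance (see the plug-in sketch below).
// The game object ID, event, RTPC, and switch names are hypothetical.
void SimulateScenario( AK::SoundFrame::ISoundFrame * in_pSoundFrame )
{
    const AkGameObjectID kPlayer = 100;           // hypothetical game object

    // Register a game object to play on (the name is for display in Wwise).
    in_pSoundFrame->RegisterGameObject( kPlayer, L"SimulatedPlayer" );

    // Trigger event playback by name on the game object.
    LPCWSTR szEvent = L"Play_Footsteps";
    in_pSoundFrame->PlayEvents( &szEvent, 1, kPlayer );

    // Drive a game parameter (RTPC).
    in_pSoundFrame->SetRTPCValue( L"Player_Speed", 4.5f, kPlayer );

    // Change a switch (switch group / switch) on the game object.
    in_pSoundFrame->SetSwitch( L"Surface", L"Gravel", kPlayer );
}
```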

Using the SoundFrame SDK, you can build plug-ins that integrate directly into your world building application, whether it is Unity, Unreal Editor (UnrealEd), Maya®, 3ds Max®, or an internal proprietary tool. This type of plug-in, built on top of the communication framework, lets you perform many Wwise functions directly in your world building application, such as playing events, triggering game sync changes, and modifying positioning properties. You can also integrate events at particular points in an animation, map switches to game textures, visualize attenuation radii, and assign environmental reverb to zones, among many other things.
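
To give a sense of what such a plug-in looks like structurally, here is a heavily abbreviated sketch of the client scaffolding it is built on, assuming the AK::SoundFrame::IClient / ISoundFrame pattern from the SoundFrame SDK. Note that IClient declares additional pure virtual notification callbacks (for events, sound objects, states, switches, RTPCs, triggers, and so on) that are omitted here, so this fragment is illustrative rather than complete; refer to AK/SoundFrame/SF.h and the SoundFrame samples in the Wwise SDK for the full interface.

```cpp
#include <AK/SoundFrame/SF.h>

// Skeleton of a world building tool plug-in that talks to Wwise through SoundFrame.
// NOTE: IClient declares more notification callbacks than shown; a real
// implementation must override all of its pure virtual methods.
class MyToolSoundFramePlugin : public AK::SoundFrame::IClient
{
public:
    MyToolSoundFramePlugin()
        : m_pSoundFrame( NULL )
        , m_bConnected( false )
    {
        // Create the SoundFrame object; it looks for a running instance of
        // Wwise and reports the result through OnConnect().
        AK::SoundFrame::Create( this, &m_pSoundFrame );
    }

    ~MyToolSoundFramePlugin()
    {
        if ( m_pSoundFrame )
            m_pSoundFrame->Release();
    }

    // Called when the connection to Wwise is established or lost.
    virtual void OnConnect( bool in_bConnect )
    {
        m_bConnected = in_bConnect;
        // Typically: refresh the tool's cached event and game sync lists here.
    }

    // Example tool action forwarded to Wwise, e.g. from an animation timeline
    // marker (the event name is hypothetical).
    void AuditionEvent( LPCWSTR in_szEventName )
    {
        if ( m_bConnected && m_pSoundFrame )
            m_pSoundFrame->PlayEvents( &in_szEventName, 1 );
    }

    // ... remaining IClient notification callbacks (events, states, switches,
    //     RTPCs, triggers, game objects, etc.) omitted for brevity ...

private:
    AK::SoundFrame::ISoundFrame * m_pSoundFrame;
    bool m_bConnected;
};
```

A tool would typically create one such client when its audio panel opens, keep it alive for the session, and route user actions (auditioning an event on an animation frame, previewing a switch bound to a texture) through it while Wwise is connected.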

