Integrating Audio in your Game

Before jumping into the code and using the Wwise SDK, you should understand the unique approach Wwise uses for building and integrating audio into your game. There are also a few concepts that you should be familiar with in order to work efficiently and get the most out of Wwise.

The Wwise approach to building and integrating audio in a game includes five main components:

  • Audio Structures

  • Events

  • Game Syncs

  • Game Objects

  • Listeners

Each of these components is discussed in further detail in the following sections, but before moving on, you should understand where each one fits in and how they all relate to one another.

One of the goals of Wwise was to create a clear distinction between the tasks of the programmer and those of the sound designer. For example, the audio structures, which represent the individual sounds in your game, are created and managed exclusively within the Wwise application by the sound designer. Game objects and listeners, on the other hand, which represent specific game elements that emit or receive audio, are created and managed within the game by the programmer. The final two components, Events and Game Syncs, drive the audio in your game. They form the bridge between the audio assets and the game components and are therefore integral to both Wwise and the game.
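
To give a sense of the programmer's side of this division, here is a minimal C++ sketch using the Wwise sound engine API. It assumes a recent SDK version in which listeners are themselves game objects, that the sound engine has already been initialized, and that a SoundBank containing the Event is loaded. The object IDs and the names "PlayerCamera", "NPC_01", "Surface", "Gravel", and "Play_Footstep" are hypothetical examples standing in for whatever your sound designer defines in the Wwise project.

// Minimal sketch of the programmer's side of the integration.
// Assumes the sound engine is initialized and the relevant SoundBank is loaded.
#include <AK/SoundEngine/Common/AkSoundEngine.h>

static const AkGameObjectID LISTENER_ID = 1; // e.g., the player camera
static const AkGameObjectID EMITTER_ID  = 2; // e.g., an NPC that emits sounds

void SetUpAudioObjects()
{
    // Game objects and listeners are created and managed in code by the programmer.
    AK::SoundEngine::RegisterGameObj(LISTENER_ID, "PlayerCamera");
    AK::SoundEngine::RegisterGameObj(EMITTER_ID, "NPC_01");

    // Nominate the camera as the default listener for all emitters.
    AK::SoundEngine::SetDefaultListeners(&LISTENER_ID, 1);
}

void OnFootstep()
{
    // Game Syncs (here, a Switch) and Events bridge the game and the
    // audio structures authored by the sound designer in Wwise.
    AK::SoundEngine::SetSwitch("Surface", "Gravel", EMITTER_ID);
    AK::SoundEngine::PostEvent("Play_Footstep", EMITTER_ID);
}

Note that this code never references individual audio structures directly: it registers game objects, nominates a listener, and then communicates through Events and Game Syncs, leaving the mapping to actual sounds entirely in the sound designer's hands.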

The following illustration demonstrates where each of these components is created and managed.

