Wwise Unity Integration Documentation
Wwise Demo Game

Demonstration scenes containing the Wwise Unity Integration are available for download from the Wwise Launcher's Unity page, under the contextual menu of the "Recent Unity Projects" section. These simple scenes, created exclusively with Unity's standard assets, demonstrate how to use some of the Integration's basic features.

Installation

The Wwise Demo Game is a standalone project, available for download in the Wwise Launcher. It should not be used as a foundation for your game; to start a new game, use the Wwise Launcher to integrate Wwise into a new Unity project instead.

Note:
  • The generated SoundBanks are included in the package.
  • You can find the Wwise Project associated with the scenes under <DEMO_SCENE_ROOT>/WwiseProject. Leaving the Wwise project in the game's Assets folder is not recommended, but it was necessary in this Demo for packaging purposes.

Deployment

The Wwise Demo Game is intended as a way to preview and show how the Wwise Unity Integration can be used in the Unity Editor.

To deploy one of the Wwise Demo Game scenes to a game console or mobile device, follow these steps:

  1. Within the Launcher's Unity tab, select the Modify Wwise in Project... option for your Wwise Demo Game Unity project. The Launcher's Unity integration page opens.
  2. Add the desired Deployment Platforms and click Modify.
  3. Open the updated version of the project in Unity.
  4. Generate the SoundBanks for the platform.
  5. Copy the Generated SoundBanks folder to the StreamingAssets/Audio folder.
  6. Within Unity, build the scene for your desired platform then deploy to the device.

Wwise Demo Scene

This first-person 3D map contains "stations" along a path. For each station, there is a small description on a sign next to the station. For easy reference, each station's assets have been nested in the scene hierarchy.

Footsteps demo

The footpath, along with the first station, demonstrates how to use scripting and trigger volumes to create a footstep system.

In the Wwise Project, footsteps have been implemented in the recommended way: a Random Container for each surface type, nested under a Switch Container. A footstep is posted using the Footstep event, and the surface material is controlled with the Footstep_Material switch group.

The scene's terrain has been painted with four different textures: grass, gravel, wood, and dirt. Walking around the different surface types changes the footstep sound accordingly.

To accomplish this, Box Colliders have been placed over each zone. Setting the Footstep_Material switch is then simply a matter of dragging a Switch Value from the Wwise Picker window onto a Box Collider. To set the switch on the First Person Controller when it enters the collider, the Ak Switch script is triggered on "AkTriggerEnter", with the "Use Other Object" check box enabled (as seen in the Inspector window).
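
The same result can also be achieved from code instead of the inspector. Below is a minimal sketch, assuming a trigger-enabled Box Collider and an illustrative surface value ("Gravel"); the class name is hypothetical.

    using UnityEngine;

    // Scripted alternative to the inspector workflow: a trigger volume that
    // sets the Footstep_Material switch on whatever object enters it.
    public class SurfaceSwitchVolume : MonoBehaviour
    {
        public string switchGroup = "Footstep_Material";
        public string switchValue = "Gravel"; // illustrative surface value

        void OnTriggerEnter(Collider other)
        {
            // Equivalent of "Use Other Object": set the switch on the entering
            // object (the First Person Controller) rather than on this collider.
            AkSoundEngine.SetSwitch(switchGroup, switchValue, other.gameObject);
        }
    }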

A very rudimentary footstep script, attached to the First Person Controller, simply posts the Footstep event every 0.3 seconds while the player is moving. It illustrates one way to post an event to the Wwise sound engine via scripting.
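
Below is a minimal sketch of such a footstep script. The movement check (a CharacterController velocity test) is an assumption for illustration; the demo's own script may detect movement differently.

    using UnityEngine;

    public class FootstepExample : MonoBehaviour
    {
        public float interval = 0.3f;       // seconds between footsteps
        private float nextStepTime;
        private CharacterController controller;

        void Start()
        {
            controller = GetComponent<CharacterController>();
        }

        void Update()
        {
            bool isMoving = controller != null && controller.velocity.magnitude > 0.1f;
            if (isMoving && Time.time >= nextStepTime)
            {
                // Post the Footstep event on this game object; the active
                // Footstep_Material switch selects the surface sound.
                AkSoundEngine.PostEvent("Footstep", gameObject);
                nextStepTime = Time.time + interval;
            }
        }
    }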

Subtitle demo

This station illustrates how to create custom event triggers in Unity, as well as using event callbacks.

A simple button script contains a delegate to be run when the player is close enough to the button, and presses a key on the keyboard or controller. Another script, AkTriggerButtonPress, registers itself on that delegate, and then calls triggerDelegate from its parent class AkTriggerBase. Inheriting from AkTriggerBase allows the custom trigger to show up in the "trigger" list in Wwise Component Inspector windows (for example, the Ak Ambient attached to the button). For more information on custom triggers, see Adding New Triggers for Wwise Events.
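
As an illustration, here is a minimal custom trigger in the spirit of AkTriggerButtonPress. The class name and the key used are hypothetical; the essential parts are deriving from AkTriggerBase and invoking triggerDelegate.

    using UnityEngine;

    public class AkTriggerKeyPress : AkTriggerBase
    {
        public KeyCode key = KeyCode.E;

        void Update()
        {
            // Fire every Wwise component that registered on this trigger.
            if (Input.GetKeyDown(key) && triggerDelegate != null)
                triggerDelegate(gameObject);
        }
    }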

The event starts playing a sound file containing WAV markers. Registering the marker callbacks associated with the event makes it possible to update the subtitle on the panel. To achieve this, the "Use Callback" option was checked in the Ak Ambient's inspector, the GameObject containing the SubtitleDemo.cs script (SubtitleSign) was dragged and dropped onto the "Game Object" box, the name of the callback function to execute (MarkerCallback) was typed in, and "Marker" was chosen as the Callback Flags. In SubtitleDemo's code, the callback function simply reads the subtitles from a pre-defined string array, using the marker callback's uIdentifier field as an index. See How to use AkAmbient with the inspector for more information on callbacks.
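
Marker callbacks can also be registered entirely from script, without the Ak Ambient inspector. Below is a minimal sketch along those lines; the event name ("Play_Subtitles"), class name, and subtitle array are hypothetical, and only the marker callback flag is requested.

    using UnityEngine;

    public class SubtitleCallbackExample : MonoBehaviour
    {
        public string[] subtitles;        // pre-defined subtitle lines
        public string currentSubtitle;

        void Start()
        {
            // Post the event and ask to be notified on WAV markers only.
            AkSoundEngine.PostEvent("Play_Subtitles", gameObject,
                (uint)AkCallbackType.AK_Marker, OnMarker, null);
        }

        void OnMarker(object cookie, AkCallbackType type, AkCallbackInfo info)
        {
            // For AK_Marker callbacks, info is an AkMarkerCallbackInfo whose
            // uIdentifier field can index into the subtitle array.
            var markerInfo = info as AkMarkerCallbackInfo;
            if (markerInfo != null && markerInfo.uIdentifier < subtitles.Length)
                currentSubtitle = subtitles[markerInfo.uIdentifier];
        }
    }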

Environment demo

This station demonstrates how to apply effects to sounds within a zone in the scene. Two caves contain an EnvironmentZone (once again, a Box Collider acts as a trigger) to which an AuxBus has been added (dragged and dropped from the Wwise Picker window).

In Wwise, two Auxiliary Busses have been created, each containing a different effect. Furthermore, the footsteps and the Little Sequence have "Use game-defined Auxiliary sends" checked.

When a Wwise event is posted from within the EnvironmentZone, its sound is routed to the AuxBus associated with the zone.

This station also demonstrates the Environment Portal component. This allows a spatial transition between two environments. The three buttons in this demonstration are used to illustrate how an Environment Portal affects the mixing of the Auxiliary Busses.

To create an environment portal, first create a Box Collider that touches two Ak Environments on one of its axes (in this demonstration, the portal touches the Red zone and the Blue zone on the 'z' axis). Then add the Ak Environment Portal component and select the correct axis. The two environments that will be mixed are filled in automatically.

For more information on Environments and Environment Portals, see How to use AkEnvironment and AkEnvironmentPortal with the inspector (Reverb Zones).

Timeline Demo

This station demonstrates the use of an AkEventTrack and an AkRTPCTrack within a Unity Timeline. When the button is pressed, two cubes are animated towards each other and stop when they collide.

To inspect the Timeline, select Window->Timeline. In the Hierarchy, expand the TimelineDemo object, then the Timeline Demo Button object, and select the Button object. The Timeline editor now shows the Timeline that controls the animation and Wwise Events for the cubes. The Timeline is driven by the Playable Director component of the Button object.

The Timeline contains two animation tracks, one for each cube. These tracks are used to animate the z position of each cube over time. The Timeline also contains an AkEventTrack and an AkRTPCTrack, marked by a white and red tab, respectively. The AkEventTrack contains two AkEventPlayable clips: the first triggers the PlayCubeMovement Wwise Event, and the next triggers the PlayImpact Wwise Event. The names of these Wwise Events are displayed in the clips. You may need to increase the size of the Timeline editor view and zoom in in order for the name to be displayed correctly. The PlayCubeMovement Wwise Event plays a sine wave source, which has an RTPC affecting its pitch. The name of this RTPC is CubeAcceleration. The AkRTPCTrack increases this RTPC over time as the cubes move towards each other. This causes the pitch of the sine wave to increase, producing a simple sound effect to indicate the acceleration of each cube towards the other.
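
As a point of comparison, the following is a minimal scripted analogue of what the AkRTPCTrack does: ramping the CubeAcceleration Game Parameter over time on a game object. The ramp duration, range, and class name are assumptions for illustration.

    using UnityEngine;

    public class CubeAccelerationRamp : MonoBehaviour
    {
        public float rampDuration = 2.0f;   // seconds to reach the full value
        public float maxValue = 100.0f;     // illustrative RTPC range
        private float elapsed;

        void Update()
        {
            elapsed += Time.deltaTime;
            float value = Mathf.Lerp(0.0f, maxValue, Mathf.Clamp01(elapsed / rampDuration));

            // Drive the Game Parameter that the sine wave's pitch is mapped to.
            AkSoundEngine.SetRTPCValue("CubeAcceleration", value, gameObject);
        }
    }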

This station also demonstrates the Motion feature. Note that there is a game object called "Motion listener" in the player hierarchy. This game object sets up the output for the Motion device on supported platforms. Any output needs a set of listeners to receive data, which is why the Motion listener also has an Ak Game Object and an Ak Audio Listener. To enable the Motion effect on the cube impact, two things need to be done. First, in the Wwise project, the impact sound needs to be routed to an output bus that uses the Wwise Motion ShareSet. Second, the listener used for the Motion output needs to be added to the listeners of the emitter posting the impact event. Inspect the AkMotionListener script for an example of how to add an output.

Note: To support the Motion feature on Android, the application's manifest must include the vibration permission. Unity automatically generates the manifest based on the content of the application; adding a call to Handheld.Vibrate() adds the desired permission to the manifest.
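
A minimal sketch of that technique is shown below: a script that merely references Handheld.Vibrate() so that Unity includes android.permission.VIBRATE in the generated manifest. The class and method names are hypothetical, and the method never needs to run.

    using UnityEngine;

    public class MotionVibratePermission : MonoBehaviour
    {
        // Never called at runtime; referencing Handheld.Vibrate() in compiled
        // code is enough for Unity to add the vibration permission required
        // by the Motion feature on Android.
        private void EnsureVibratePermission()
        {
            Handheld.Vibrate();
        }
    }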

For more information on the Timeline integration, see Wwise Timeline Integration.

Spatial Audio Scene

This scene is the final product of the Spatial Audio Tutorial.

