Table of Contents

Using the Wwise Unity Integration

This integration provides a few components that can be used directly in a scene, without code, for the most frequent usage scenarios:

  • AkAmbient
    Use this component to attach a Wwise Event to any object in a scene. The sound can be started at various moments, depending on the selected Unity trigger. This component is most useful for ambient sounds (sounds tied to scene-bound objects) but can also be used for other purposes. Since AkAmbient has AkEvent as its base class, it features the play/stop, play multiple, stop multiple, and stop all buttons for previewing the associated Wwise Event. See How to use AkAmbient with the inspector.
  • AkAudioListener
    Add this script to the game object that represents a listener. This is normally the Camera object or the Player object, but it can be any game object when implementing 3D busses. isDefaultListener determines whether the game object is considered a default listener - a listener that automatically listens to all game objects that do not have listeners attached to their AkGameObjListenerList.
  • AkBank
    Loads and unloads a SoundBank at a specified moment. Vorbis sounds can be decompressed at a specified moment using the decode compressed data option. In that case, the SoundBank will be prepared.
  • AkEmitterObstructionOcclusion
    Obstructs/Occludes the emitter of the current game object from its listeners if at least one object is between them.
  • AkEnvironment
    Use this component to define a reverb zone. It needs to be added to a collider object to work properly. See How to use AkEnvironment and AkEnvironmentPortal with the inspector (Reverb Zones).
  • AkEnvironmentPortal
    Use this component to define an area that straddles two different AkEnvironment zones and allows mixing between them. See How to use AkEnvironment and AkEnvironmentPortal with the inspector (Reverb Zones).
  • AkEvent
    Helper class that associates a Wwise Event with the Unity event that triggers it. See How to use AkAmbient with the inspector.
  • AkGameObj
    This component represents a sound object in your scene, tracking its position and other game syncs such as Switches, RTPCs, and environment values. You can add it to any object that will emit sound, and it is added automatically to any object that has an AkAudioListener attached. Note that if it is not present, Wwise adds it automatically, with default values, to any Unity GameObject that is passed to Wwise.
  • AkRoom
    An AkRoom is an enclosed environment that can only communicate to the outside/other rooms with AkRoomPortals.
  • AkRoomPortal
    An AkRoomPortal can connect two AkRoom components together.
  • AkRoomPortalObstruction
    Obstructs/Occludes the spatial audio portal of the current game object from the spatial audio listener if at least one object is between them.
  • AkSpatialAudioEmitter
    Add this script to the GameObject that represents an emitter using the Spatial Audio API.
  • AkSpatialAudioListener
    Add this script to the game object that represents a listener. This is normally the Camera object or the Player object, but it can be any game object when implementing 3D busses. isDefaultListener determines whether the game object is considered a default listener - a listener that automatically listens to all game objects that do not have listeners attached to their AkGameObjListenerList.
  • AkState
    This will call AkSoundEngine.SetState() whenever the selected Unity event is triggered. For example, this component could be set on a Unity collider to trigger when an object enters it.
  • AkSurfaceReflector
    This component will convert the triangles of the GameObject's geometry into sound reflective surfaces.
  • AkSwitch
    This will call AkSoundEngine.SetSwitch() whenever the selected Unity event is triggered. For example, this component could be set on a Unity collider to trigger when an object enters it.
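The components above cover the no-code path; the same game syncs can also be driven directly from a script. Below is a minimal sketch of what AkState and AkSwitch do when their triggers fire. The group and value names ("Music_Intensity", "Combat", "Surface", "Grass") are hypothetical and would come from your own Wwise project:

```
public class CombatAudioTrigger : UnityEngine.MonoBehaviour
{
    private void OnTriggerEnter(UnityEngine.Collider other)
    {
        // Global game sync: the same call AkState makes when its trigger fires.
        // "Music_Intensity" / "Combat" are hypothetical State group and value names.
        AkSoundEngine.SetState("Music_Intensity", "Combat");

        // Per-object game sync: the same call AkSwitch makes.
        // "Surface" / "Grass" are hypothetical Switch group and value names.
        AkSoundEngine.SetSwitch("Surface", "Grass", gameObject);
    }
}
```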

The WwiseGlobal object

The WwiseGlobal object is a GameObject that contains the Initializing and Terminating scripts for the Wwise Sound Engine. In the Editor workflow, it is added to every scene, so that it can be properly previewed in the Editor. In the game, only one instance is created, in the first scene, and it is persisted throughout the game. There are a few customizable options in the initializer script.

If you want to disable this behavior, use Edit > Wwise Settings and uncheck "Create WwiseGlobal GameObject".

Listener in Main Camera

In order for positioning to work, the AkAudioListener script needs to be attached to the main camera in every scene. By default, the listener is added automatically to the main camera. If you want to disable this behavior, use Edit > Wwise Settings and uncheck "Automatically add Listener to Main Camera".

Wwise Types

This integration also provides a few classes that can be used, with minimal code, for most remaining usage scenarios:

  • AK.Wwise.AuxBus

  • AK.Wwise.Bank

  • AK.Wwise.CallbackFlags

  • AK.Wwise.Event

  • AK.Wwise.RTPC

  • AK.Wwise.State

  • AK.Wwise.Switch

  • AK.Wwise.Trigger

See also:

Wwise Authoring API (WAAPI) client

A native WAAPI client with a C# API allows you to connect to WAAPI from within Unity. It is currently available for Windows and macOS. The Wwise Authoring API sends messages as JSON objects. In Unity, the client is implemented using strings: you may use your preferred method to construct valid JSON strings to pass to the WAAPI client.

See also:

How to add a Wwise sound to a game object

There are four ways to add sounds to your game:

  • Using the Wwise Picker. This is the simplest way to add a sound to an object. Drag an Event from the Wwise Picker window to an object in the Unity Viewer or the Inspector. This automatically creates an AkAmbient component on the target Game Object.
  • Using the Add Component menu. Add an AkAmbient or an AkEvent component to any Unity Game Object.
  • Using Wwise Types. Call AK.Wwise.Event.Post() at any time from a C# script.
  • Using scripts. Call AkSoundEngine.PostEvent() at any time from a C# script.
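For reference, the last two approaches look like this in a script. The Event name "Play_Footstep" is hypothetical; in a real project you would use an Event defined in your Wwise project:

```
public class FootstepAudio : UnityEngine.MonoBehaviour
{
    // Wwise Types approach: assign the Event in the Inspector.
    public AK.Wwise.Event FootstepEvent;

    private void PlayWithWwiseTypes()
    {
        FootstepEvent.Post(gameObject);
    }

    private void PlayWithPlainScripting()
    {
        // "Play_Footstep" is a hypothetical Event name from the Wwise project.
        AkSoundEngine.PostEvent("Play_Footstep", gameObject);
    }
}
```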

How to use AkAmbient with the inspector

  • AkAmbient:
    • Trigger On:
      Provides a list of Unity events that can trigger your event. You are not limited to these events: you can trigger an event at any time by calling AkSoundEngine.PostEvent anywhere in your code. You can also code your own triggers so they appear in the list for your co-workers. See Adding New Triggers for Wwise Events.
    • Event Name:
      Specifies the name of the current event. To select an event, click the current event's name to open the event picker window. Then either select an event and click Ok, or double-click an event. You can also drag an event from the Wwise Picker and drop it on the current event's name to select a new one.
    • Action On Event:
      Enables users to override some event parameters defined in Wwise directly from Unity. This allows the reuse of existing events instead of creating new ones.
      • Action On Event Type:
        Overrides the event type.
      • Curve Interpolation:
        Overrides the interpolation curve.
      • Fade Time:
        Overrides the sound's fade time.
    • Use Callback:
      Provides an easy way to make a game object react to an event callback.
      • Game Object:
        The game object that will receive the callback. To select a game object, drag it from the hierarchy and drop it in the game object field.
      • Callback Function:
        This is the function that will be called on Game Object when the callback occurs. To select a function, type its name in the Callback Function text field.
        For this to work, Game Object must define Callback Function in one of its components.
        The function's definition must be void FunctionName(AkEventCallbackMsg in_info).
      • Callback Flags:
        Select a flag which specifies when Callback Function will be called. More than one flag can be selected at the same time. See the AkCallbackType enumeration in the Wwise SDK documentation for more information about each flag.
    • Play / Stop:
      Can be used to preview the Wwise Event when in Edit mode.
    • Stop All:
      Stops all currently playing Wwise events.
    • Position Type:
      Defines the way the event's position will be sent to the audio engine.
      • Simple_Mode:
        The event's position will be the same as the game object to which it's attached.
      • Large_Mode:
        The event can have multiple positions that are defined by a set of points. You can add a point by clicking the Add Positioning button in the inspector. This adds a child game object to the AkAmbient game object, which you can move around with the normal transform tools. This mode is useful when a sound comes from multiple positions at the same time; the sound of water in the middle of the ocean is a good example.
      • MultiPosition_Mode:
        This mode saves memory by using only one sound instance for all AkAmbient instances that share the same event.
        All AkAmbient instances that use this mode with the same event are detected automatically, and the same sound instance is used for all of them instead of loading the same sound multiple times.
        Note that all AkAmbient instances in this mode and with the same event will have the same trigger (see Trigger On in AkEvent). So, changing the trigger of one AkAmbient will automatically change the trigger of all the others with the same event.
      • Show Attenuation Sphere:
        Shows a sphere that defines the space where the sound played by an event can be heard.
        For this to work, you need to enable Max Attenuation in the SoundBank settings of your Wwise project (Project > Project Settings > SoundBanks > Max attenuation).
        • Dont_Show:
          No attenuation sphere is shown.
        • Current_Event_Only:
          Shows the attenuation spheres for all the sounds that would be played after a call to AkSoundEngine.PostEvent while in the current mode.
          If in Simple_Mode, then only the attenuation sphere of the sound coming from the game object is shown.
          If in Large_Mode, then an attenuation sphere is shown for each point.
          If in MultiPosition_Mode, then an attenuation sphere is shown for every other AkAmbient in MultiPosition_Mode with the same event.
        • All_Events:
          Shows the attenuation sphere of all AkAmbient instances in the scene.
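As a complement to the Use Callback settings above, a component on the selected Game Object could define the callback like this. The handler name is hypothetical, and the assumption that AkEventCallbackMsg exposes the callback type through a "type" field should be verified against the integration's API reference:

```
public class EndOfEventHandler : UnityEngine.MonoBehaviour
{
    // Matches the required signature: void FunctionName(AkEventCallbackMsg in_info).
    // Type "OnWwiseCallback" in the Callback Function field and attach this
    // component to the Game Object chosen in the inspector.
    public void OnWwiseCallback(AkEventCallbackMsg in_info)
    {
        // Assumption: the callback type is available as in_info.type.
        if (in_info.type == AkCallbackType.AK_EndOfEvent)
            UnityEngine.Debug.Log("Wwise event finished on " + name);
    }
}
```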

Using Wwise with Unity Timeline

For Unity's Timeline feature, there are custom Wwise tracks for triggering Wwise events and setting Wwise RTPC values.


Using the Unity WAAPI client

The Unity integration includes a simple WAAPI client that can be used to interface with the Wwise Authoring tool.


How to use AkEnvironment and AkEnvironmentPortal with the inspector (Reverb Zones)

In Wwise, Reverb Zones are called Environments or Auxiliary Sends. Reverb Zones are not limited to reverb effects; they are defined in the Wwise project.

An AkEnvironment component defines a simple environment zone. You can attach an AkEnvironment to any type of collider. To add an AkEnvironment to your scene:

  • Using the Wwise Picker. This is the simplest way to add an AkEnvironment. Drag an AuxBus from the Wwise Picker window to an object in the Unity Viewer or the Inspector. This will automatically create an AkEnvironment component on the target Game Object.
  • Using the "Add Component" menu. Add an AkEnvironment component to any Unity Game Object. Select the desired environment from the selector in the inspector.
  • Using scripts. You can call AkSoundEngine.SetGameObjectAuxSendValues() at any time from a C# script.

We also have portals which can be used to combine the effects of two environments. The contribution of each of the two environments is relative to their distance from the game object.
This is useful if a game object is standing between two rooms or in a tunnel connecting two environments.

  • To add an environment portal to your project, go to GameObject > Wwise > Environment Portal in Unity's menu bar.

To use environments and environment portals, you need a game object with an AkGameObj component that is environment-aware.
AkEnvironmentPortal objects automatically detect AkEnvironment objects that overlap them. The overlapping environments appear in the two select-lists in the portal's inspector. If too many environments overlap the portal, you can select which ones the portal will mix together.

In Wwise, only 4 environments can be active at the same time. Those 4 environments are selected as follows:

  • The environments that are connected to a portal and that have the highest priority are selected until we reach 4 environments or until there are no more environments connected to a portal.
  • If we still don't have 4 selected environments, we select the environments that are not connected to a portal as follows:
    • Environments with the highest priority will be selected until we reach 4 environments (if the Default and Exclude Others flags are not set).
    • A Default environment will be selected only if no other environment is selected.
    • If your game object is inside an environment with the Exclude Others flag, then it will be selected and all other environments will get discarded.
  • AkEnvironment component:
    Only 4 environments can be active at the same time.
    • Priority:
      Defines the priority of an environment.
      A smaller number has a higher priority.
      If a game object is inside more than 4 environments, only the 4 environments with the highest priority will be active (if the Default and Exclude Others flags are not set).
    • Default:
      A default environment will be active only if it's the only environment containing your game object.
      If your game object is inside more than one default environment, then only the one with the highest priority will be active.
    • Exclude Others:
      An environment with this flag can't be overlapped by other environments.
      If your game object is inside an environment with the Exclude Others flag, then all other environments will get discarded.
      If your game object is inside more than one environment with the Exclude Others flag, only the one with the highest priority will be active.
    • AuxBus Name:
      Specifies the name of the current AuxBus. To select an AuxBus, click on the current AuxBus's name to open the AuxBus picker window. Then, you can either click on an AuxBus and click on the Ok button or double-click an AuxBus to select it. You can also drag an AuxBus from the Wwise picker and drop it on the current AuxBus's name to select a new one.
  • AkEnvironmentPortal component:
    You can create an environment portal in Unity by going to GameObject > Wwise > Environment Portal.
    You can place an environment portal between two environments to combine their effects while your game object is inside the portal. The portal must intersect with both environments for this to work.
    The contribution of each of the two environments is relative to their distance from the game object. The closer the game object is from an environment, the more that environment will contribute towards the final effect.
    • Environment #1:
      The portal will automatically detect all environments that intersect the portal. Of those environments, the ones placed on the negative side of the portal (opposite to the direction of the chosen axis) will be available in the drop-down menu. The environments are sorted this way to reduce the computation needed to determine the contribution of each environment at runtime.
    • Environment #2:
      The portal will automatically detect all environments that intersect the portal. Of those environments, the ones that are placed on the positive side of the portal (same direction as the chosen axis) will be available in the menu. The reason the environments are sorted this way is to reduce the computation needed to determine the contribution of each environment at runtime.
    • Axis:
      The axis is used to find the contribution of each environment.
      For example, if the z axis is chosen, then moving along the x axis won't have any effect on the contribution of each environment. Only movement on the z axis will have an effect on their contribution.
      Note that the axis is in object space. So, rotating the portal will also rotate the axis.

Using C# code to control the sound engine

Most Wwise SDK functions are available in Unity through the AkSoundEngine class. Think of it as the replacement for the C++ namespaces AK::SoundEngine, AK::MusicEngine, and so on. For more complex situations, you'll need to call Wwise functions from code. See API Changes and Limitations for changes made in the API binding compared to the original SDK. In the API, the GameObjectID found in all functions is replaced by the Unity flavor of the GameObject. At runtime, an AkGameObj component is automatically added to this GameObject, unless you have already manually added one.
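As a sketch of this scripting workflow, a component might post an Event and drive an RTPC like this. The Event name "Play_Engine" and RTPC name "Engine_RPM" are hypothetical:

```
public class EngineAudio : UnityEngine.MonoBehaviour
{
    private float currentRpm = 1000.0f;

    private void Start()
    {
        // gameObject stands in for the native GameObjectID; an AkGameObj
        // component is added automatically if one is not already present.
        AkSoundEngine.PostEvent("Play_Engine", gameObject);
    }

    private void Update()
    {
        // Drive a hypothetical RTPC from gameplay every frame.
        AkSoundEngine.SetRTPCValue("Engine_RPM", currentRpm, gameObject);
    }
}
```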

Using numeric IDs instead of strings for Events and Banks

The native Wwise API allows you to use strings or IDs to trigger events and other named objects in the Wwise project. You can still do this in the C# world by converting the file Wwise_IDs.h to Wwise_IDs.cs. Click Assets > Wwise > Convert Wwise SoundBank IDs. You need to have Python installed to make this work.
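Once Wwise_IDs.cs has been generated, the constants it defines can replace strings; for example (AK.EVENTS.PLAY_FOOTSTEP is a hypothetical generated constant):

```
public class FootstepById : UnityEngine.MonoBehaviour
{
    private void Start()
    {
        // Posting by numeric ID avoids hashing the event name at runtime.
        // AK.EVENTS.PLAY_FOOTSTEP is a hypothetical constant produced by the
        // Wwise_IDs.h to Wwise_IDs.cs conversion described above.
        AkSoundEngine.PostEvent(AK.EVENTS.PLAY_FOOTSTEP, gameObject);
    }
}
```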

Sending MIDI to Wwise

MIDI can be sent to Wwise by filling the AkMIDIPost members of an AkMIDIPostArray and calling any of the following methods:

  • AkMIDIPostArray.PostOnEvent()
  • AkSoundEngine.PostMIDIOnEvent()
  • AK.Wwise.Event.PostMIDI()

The following is a basic script that sends MIDI messages to the sound engine:

public class MyMIDIBehaviour : UnityEngine.MonoBehaviour
{
    public AK.Wwise.Event SynthEvent;

    private void Start()
    {
        AkMIDIPostArray MIDIPostArrayBuffer = new AkMIDIPostArray(6);
        AkMIDIPost midiEvent = new AkMIDIPost();

        midiEvent.byType = AkMIDIEventTypes.NOTE_ON;
        midiEvent.byChan = 0;
        midiEvent.byOnOffNote = 56;
        midiEvent.byVelocity = 127;
        midiEvent.uOffset = 0;
        MIDIPostArrayBuffer[0] = midiEvent;

        midiEvent.byOnOffNote = 60;
        MIDIPostArrayBuffer[1] = midiEvent;

        midiEvent.byOnOffNote = 64;
        MIDIPostArrayBuffer[2] = midiEvent;

        midiEvent.byType = AkMIDIEventTypes.NOTE_OFF;
        midiEvent.byOnOffNote = 56;
        midiEvent.byVelocity = 0;
        midiEvent.uOffset = 48000 * 8; // delay in samples: 8 seconds at a 48 kHz sample rate
        MIDIPostArrayBuffer[3] = midiEvent;

        midiEvent.byOnOffNote = 60;
        MIDIPostArrayBuffer[4] = midiEvent;

        midiEvent.byOnOffNote = 64;
        MIDIPostArrayBuffer[5] = midiEvent;

        SynthEvent.PostMIDI(gameObject, MIDIPostArrayBuffer);
    }
}

Using the Audio Input Source Plug-in in Unity

The audio input source plug-in can be used via C# scripting. See Audio Input Source Plug-in from the Wwise SDK documentation.

The following is a basic script that sends a test tone to the audio input source plug-in:

public class MyAudioInputBehaviour : UnityEngine.MonoBehaviour
{
    public AK.Wwise.Event AudioInputEvent;
    public uint SampleRate = 48000;
    public uint NumberOfChannels = 1;
    public uint SampleIndex = 0;
    public uint Frequency = 880;
    private bool IsPlaying = true;

    // Callback that fills audio samples - This function is called each frame for every channel.
    bool AudioSamplesDelegate(uint playingID, uint channelIndex, float[] samples)
    {
        for (uint i = 0; i < samples.Length; ++i)
            samples[i] = UnityEngine.Mathf.Sin(Frequency * 2 * UnityEngine.Mathf.PI * (i + SampleIndex) / SampleRate);

        if (channelIndex == NumberOfChannels - 1)
            SampleIndex = (uint)(SampleIndex + samples.Length) % SampleRate;

        // Return false to indicate that there is no more data to provide. This will also stop the associated event.
        return IsPlaying;
    }

    // Callback that sets the audio format - This function is called once before samples are requested.
    void AudioFormatDelegate(uint playingID, AkAudioFormat audioFormat)
    {
        // Channel configuration and sample rate are the main parameters that need to be set.
        audioFormat.channelConfig.uNumChannels = NumberOfChannels;
        audioFormat.uSampleRate = SampleRate;
    }

    private void Start()
    {
        // The AudioInputEvent event, that is setup within Wwise to use the Audio Input plug-in, is posted on gameObject.
        // AudioFormatDelegate is called once, and AudioSamplesDelegate is called once per frame until it returns false.
        AkAudioInputManager.PostAudioInputEvent(AudioInputEvent, gameObject, AudioSamplesDelegate, AudioFormatDelegate);
    }

    // This method can be called by other scripts to stop the callback
    public void StopSound()
    {
        IsPlaying = false;
    }

    private void OnDestroy()
    {
        AudioInputEvent.Stop(gameObject);
    }
}

Apply Custom Positioning in Unity

By default, the AkGameObj component is attached to a specific Unity gameObject and uses its transform (with an optional offset) for full positioning. This is usually adequate for many games, such as first-person shooters. However, games with custom camera angles, such as many third-person games, may find it difficult to accommodate the two aspects of positioning (distance attenuation and spatialization) by simply attaching the audio listener to one game object, such as the main camera in Unity. Other games may want players to experience other custom positioning.

To this end, the AkGameObj component class provides overridable positioning to Unity users. Through the three virtual methods GetPosition(), GetForward(), and GetUpward(), users can derive a subclass from AkGameObj and use that subclass component to customize any number of Unity gameObjects' positioning.

Here is a simple example of how to use a custom component to override the default AkAudioListener behavior. With a third-person project integrated with Wwise, remove the existing AkAudioListener and its associated AkGameObj. Then attach the following script to the MainCamera object, attach AkAudioListener, and finally specify the target Unity gameObject (such as the player avatar) that the audio listener's position will follow. After this, the distance attenuation of all the emitters will rely on the selected target Unity gameObject's position as the listener position (an on-screen distance listener), while the orientation of all the emitters is still based on the main camera orientation as the listener orientation (an off-screen orientation listener).

#if ! (UNITY_DASHBOARD_WIDGET || UNITY_WEBPLAYER || UNITY_WII || UNITY_WIIU || UNITY_NACL || UNITY_FLASH || UNITY_BLACKBERRY) // Disable under unsupported platforms.

//
// Copyright (c) 2017 Audiokinetic Inc. / All Rights Reserved
//

using UnityEngine;
using System;
using System.Collections.Generic;


[AddComponentMenu ("Wwise/AkGameObj3rdPersonCam")]
[ExecuteInEditMode] //ExecuteInEditMode necessary to maintain proper state of isStaticObject.
public class AkGameObj3rdPersonCam : AkGameObj
{
    public Transform target;            // The position that this camera will be following. User can specify this to the player character's Unity gameObject in the Inspector.

    
    // Sets the camera position to the player's position to handle distance attenuation.
    public override Vector3 GetPosition ()
    {
        return target.GetComponent<AkGameObj> ().GetPosition ();
    }

}
#endif // #if ! (UNITY_DASHBOARD_WIDGET || UNITY_WEBPLAYER || UNITY_WII || UNITY_WIIU || UNITY_NACL || UNITY_FLASH || UNITY_BLACKBERRY) // Disable under unsupported platforms.
Generated on Fri Mar 29 16:14:27 2019 for Wwise Unity Integration by  doxygen 1.6.3