Spatialization

This section explains how to apply spatialization to your mono sound events.

By the end of this section, you’ll be able to:

  • Set up your Wwise project to use the Meta XR Audio sink plug-in.

  • Route sounds into the sink plug-in to spatialize them.

  • Adjust the various spatialization parameters for each sound.

  • Understand the details of every parameter of the spatializer.

Note

Before you can spatialize sounds using the Meta XR Audio Plug-in for Wwise, make sure to set up the sink plug-in. See Getting Started.

Adding sound (SFX) files

After your busses are configured, you can add some audio files and route them to their respective busses.

To start, import a sound file as an SFX object as described in Importing media files for SFX, then use the following procedure to configure it appropriately.

To route sounds to the correct busses:

  1. Open the imported SFX object in an object tab.

  2. In the Property Editor, open the Routing tab and in the Output Bus section, locate Master Audio Bus and click the button to its right.

  3. Select the audio object output bus you created in Adding child audio busses and click OK.

  4. In the Positioning tab, make sure that:

    • Listener Relative Routing is enabled.

    • 3D Spatialization is set to either Position or Position + Orientation.

    • Speaker Panning/3D Spatialization Mix is set to 100, which ensures that the sound is sent through the object pipeline rather than panned into the main mix.

  5. Repeat the procedure for the passthrough and ambisonic busses, but select their respective busses instead.

Multi-channel audio assets can be audio objects, but be aware that the sink plug-in tells Wwise to split those multi-channel streams into mono audio objects and to drop any LFE channels. A stereo asset therefore uses two of the sink's available voices, a 5.1 asset uses five, and so on.

Play the sounds in the Transport Control. To verify that audio objects are routed to the endpoint's audio stream, passthrough objects to the passthrough output, and ambisonic objects to the main bus, select the Master Audio Device while the SFX objects are playing; the respective meters provide a visual indication of correct routing.

Note

The default position for newly added SFX objects (as seen on the Positioning tab when the SFX is selected in the Project Explorer) is Emitter, which means the SFX is positioned in space according to its associated game object when it is triggered by the sound engine. Wwise sets its position to the origin in this situation, which means that you won’t hear any spatialization immediately.

To hear spatialization of the audio object, you can update the 3D Position to either Emitter With Automation or Listener With Automation. Click Automation, create a new path, position the audio source anywhere away from the origin, and trigger the sound again. Only save your project with these settings if your sound design requires this positioning scheme.

(Optional) Attaching the endpoint sink metadata

To control object properties such as direction, distance attenuation mode, and more, you can add a metadata plug-in to the SFX object or to the audio object bus (in the latter case, the metadata is appended to every audio object that flows through the bus). This metadata is preserved through the entire bus structure and remains attached to the audio object until the endpoint sink renders it; the sink uses the metadata to alter how that object is rendered.

The sink plug-in observes the last metadata plug-in appended to the object in the bussing structure. If an audio object flows into a bus and both have metadata attached, the bus’s metadata is used because that was the last metadata appended. You can thus override the metadata of audio objects as they flow through Wwise’s processing graph for a flexible object rendering approach.

To add the metadata plug-in:

  1. On the Audio tab in the Project Explorer, select the SFX object.

  2. In the object tab, open the Metadata tab and click +.

  3. Select Meta XR Audio Metadata, then select the metadata plug-in you just created to view its parameters. For parameter descriptions and uses, see Parameter reference.

Creating an Event to play a sound

Game developers cannot change SoundBank details directly; that must be done in Wwise Authoring. To let the game trigger any of the sounds you add, you must create Events.

To create an Event, add a Play action to it, and assign the SFX object you want the Event to play, follow the standard Wwise Event creation procedure.

We recommend that you change the Event name from "Play" to something more meaningful. The name is what the game developer uses to trigger the Event, so something descriptive is good, especially if there are many Events in your project. For example, if the Event plays a bird sound, you could call it "Play Bird".

Generating SoundBanks

To package your project into something the game developer’s sound engine can load, you must generate a SoundBank. For instructions, see Generating SoundBanks for a project.

Object-specific parameters

These object-specific parameters are attached to audio SFX objects that are routed to the audio bus. They are preserved through the processing chain and read by the endpoint sink to adjust the rendering on a per-object basis. See Object-specific parameters.

Experimental object-specific parameters

There is an extra metadata plug-in that you can attach to objects or busses to control how audio objects are rendered at the endpoint. These parameters might change or be removed in future versions of the SDK. See Experimental object-specific parameters.

