
Wwise Unreal Integration Documentation
Using Wwise Spatial Audio in Unreal

This tutorial presents the Spatial Audio features available in the Wwise Unreal integration. It explains the workflow for integrating spatial audio in a game and provides technical information about setting up spatial audio features (such as 3D busses, 3D-spatialized Sound SFX, or the Reflect Effect plug-in) in the Wwise authoring tool.

This tutorial is divided into multiple sections. Section A helps you prepare a project, and each subsequent section covers a different Spatial Audio feature. The feature sections can be followed independently.

Note: Generating SoundBanks in the sections that use the Reflect plug-in requires the appropriate license.

A - Preparation for the Spatial Audio Tutorials

Note: A map preconfigured for all of the tutorials, called SpatialAudioTutorialMap, is available as part of the Unreal Demo Game, which can be downloaded from the Wwise Launcher. You can skip this section if you want to follow along with that map.

A.1. Create a new project

Follow the steps below (based on Wwise 2019.2 and Unreal 4.24) to build your working environment.

  1. Launch Unreal from the Epic launcher.
  2. Create a new blank C++ Unreal project (without starter content).
  3. Close Unreal.
  4. Launch the Wwise Launcher.
  5. Install Wwise.
  6. Select the Unreal Engine tab.
  7. Click the Integrate Wwise into Project... button.
  8. Launch Wwise using the Open in Wwise button.
  9. Launch Unreal using the Open in Unreal button.

A.2. Wwise Project Preparation

For the tutorial, you will need a Sound SFX and an Event to play it.

  1. In the Wwise project, create a new Sound SFX in the Default Work Unit of the Actor-Mixer Hierarchy and import a sound.
    1. Make sure to enable Use game-defined auxiliary sends in the General Settings tab.
      SATutorialSoundPropertyEditorGeneralSettings.png
      Create Sound SFX
    2. In the Positioning tab, enable Listener Relative Routing, set 3D Spatialization to Position + Orientation and add an Attenuation with a Max distance of 5000.
      SATutorialSoundPropertyEditorPosition.png
      Set 3D Spatialization to Position + Orientation
  2. Right-click on the Sound SFX within the Actor-Mixer Hierarchy, then select New Event > Play.
    SATutorialEventEditor.png
    Create sound Event
  3. Save the project.

A.3. Unreal Project Preparation

  1. Create a floor, a building with two rooms and an obstacle outside using your preferred method. In the SpatialAudioTutorialMap, we used a custom mesh for the building and a basic cube static mesh component for the obstacle outside.
  2. Place emitters in the scene:
    1. Drag the Event created in the previous subsection from the Waapi Picker to the Content Browser.
      SATutorialWwiseWaapiPicker.png
      Drag from Wwise Picker to the Content Browser
      1. Double-click the Event in the Content Browser and create a new SoundBank directly from the SoundBank list.
        SATutorialContentBrowser.png
        Create SoundBank
        SATutorialSpatialAudioBankAsset.png
        Select SoundBank
    2. Drag the Event into the scene to create new AkAmbientSound actors.
      1. Place one of them outside and one in each room.
        SATutorialSpatialAudioTutorialMap.png
        SpatialAudioTutorialMap
  3. Open the Level Blueprint from the Blueprints menu and remove "Event BeginPlay" and "Event Tick".
    1. Trigger events with user input.
      1. Drag a newly created AkAmbientSound from the World Outliner into the blueprint.
      2. Find the "Post Associated Ak Event" function from the AkAmbientSound node.
      3. Right-click the Blueprint background and search for "Left Mouse Button".
      4. Connect the Pressed outlet to the "Post Associated Ak Event" Exec.
    2. Repeat the same steps for all the AkAmbientSound items.
      SATutorialLevelBlueprint.png
      Add user input to trigger ambient sound
    3. Save and close the Level Blueprint.
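The Level Blueprint wiring above can also be done in C++. Below is a minimal sketch, assuming a custom pawn class (ATutorialPawn and its PostEmitterEvent helper are invented names for this example), that the AkAudio module is listed in your module's Build.cs dependencies, and that one of the AkAmbientSound actors is assigned to the pawn in the editor. The actual post-event call is left as a comment because its exact C++ signature varies between integration versions.

    // TutorialPawn.h -- minimal sketch, not part of the integration.
    #pragma once

    #include "CoreMinimal.h"
    #include "GameFramework/Pawn.h"
    #include "Components/InputComponent.h"
    #include "AkAmbientSound.h"
    #include "TutorialPawn.generated.h"

    UCLASS()
    class ATutorialPawn : public APawn
    {
        GENERATED_BODY()

    public:
        // Assign one of the AkAmbientSound actors placed in the level.
        UPROPERTY(EditAnywhere, Category = "Spatial Audio Tutorial")
        AAkAmbientSound* OutsideEmitter = nullptr;

        virtual void SetupPlayerInputComponent(UInputComponent* PlayerInputComponent) override
        {
            Super::SetupPlayerInputComponent(PlayerInputComponent);

            // Same trigger as the Level Blueprint: Left Mouse Button, Pressed.
            PlayerInputComponent->BindKey(EKeys::LeftMouseButton, IE_Pressed,
                                          this, &ATutorialPawn::PostEmitterEvent);
        }

    private:
        void PostEmitterEvent()
        {
            if (OutsideEmitter)
            {
                // The equivalent of the "Post Associated Ak Event" Blueprint node goes
                // here, for example on the actor's AkComponent. The exact signature
                // depends on your integration version, so it is not spelled out in
                // this sketch.
            }
        }
    };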

A.4. Verify your Setup

  1. From the top menu, select Build > Generate SoundBanks... to generate the SoundBanks for Windows.
    1. Make sure the banks are successfully generated in the Output Log.
      SATutorialGenerateSoundBanks.png
      Generate SoundBanks from build menu
  2. Start the scene. When pressing the respective buttons, you should now hear the sounds play, spatialized in 3D.
  3. Connect to Wwise Authoring and open the Profiler layout (shortcut F6).
    1. When playing a sound in the scene, you should see a graph similar to the following one.
      SATutorialAdvancedProfilerVoicesGraph.png
      Outside button Voices Graph

B - Reflect

In this section, we will use Spatial Audio Geometry to send reflective surfaces to the Reflect plug-in, which simulates the early reflections produced as sound propagates in an acoustic environment.

B.1. Wwise Project

  1. To have access to factory Acoustic Textures as well as a preset for the early reflection Auxiliary Bus, you need to import the Reflect Factory Assets.
    1. Navigate to Project > Import Factory Assets...
    2. Choose Reflect and press OK.
  2. Create an early reflection Auxiliary Bus using the factory preset:
    1. In the Master-Mixer Hierarchy, right-click on the Master Audio Bus
    2. Navigate to New Child > Presets and select Early Reflection Auxiliary Bus
      1. In the Effects tab, double-click on the effect and
        1. set the Max Distance to at least 5,000, which matches the attenuation Max distance of the sound set in A.2. Wwise Project Preparation
        2. set the Speed of Sound to 34,500 (Unreal units are centimeters, so this corresponds to roughly 345 m/s)
          SATutorialEffectEditorReflect.png
          Set Reflect Speed of Sound and Max Distance
  3. Navigate to the Sound SFX Sound Property Editor of the sound created in A.2. Wwise Project Preparation.
    1. In the General Settings tab, add the new Auxiliary Bus with the Reflect Effect under Early Reflections
      SATutorialSoundPropertyEditorGeneralSettingsEarlyReflections.png
      Enable Reflect on a Sound in Wwise
  4. Save the project.

B.2. Unreal Project

In the project, we want the building, the floor, and the obstacle outside to reflect sounds. There are two ways to do that: with an AkSpatialAudioVolume or with an AkGeometryComponent.

The AkGeometryComponent can be added to static mesh actors. It automatically sends the static mesh geometry to spatial audio, and it can also be configured to send the simple collision mesh instead. It is best to use this component for simple shapes: sending too many triangles to spatial audio can quickly become computationally expensive.

The AkSpatialAudioVolume is a brush volume. It has to be added and transformed manually around objects. If your static mesh is complex, you can use this volume to create a simple shape around it.
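Both components ultimately feed the Spatial Audio geometry API of the Wwise SDK. As a rough illustration of the data they send, here is a hedged sketch that registers a single quad (two triangles) as reflective geometry with AK::SpatialAudio::SetGeometry. The structure and field names follow the 2019.2-era AkSpatialAudio.h as recalled here, so verify them against your SDK headers; the acoustic texture name and geometry set ID are placeholders.

    // Sketch only: registers one quad as Spatial Audio geometry, roughly what
    // AkGeometryComponent / AkSpatialAudioVolume do for you automatically.
    #include <AK/SoundEngine/Common/AkSoundEngine.h>
    #include <AK/SpatialAudio/Common/AkSpatialAudio.h>

    static const AkGeometrySetID kTutorialWallGeometry = 100; // placeholder ID

    static void SendQuadGeometry()
    {
        // Four vertices of a 5 m x 5 m wall. Positions use the same units as the
        // rest of the game; in Unreal, that is centimeters.
        static AkVertex verts[4];
        verts[0].X = 0.f;   verts[0].Y = 0.f; verts[0].Z = 0.f;
        verts[1].X = 500.f; verts[1].Y = 0.f; verts[1].Z = 0.f;
        verts[2].X = 500.f; verts[2].Y = 0.f; verts[2].Z = 500.f;
        verts[3].X = 0.f;   verts[3].Y = 0.f; verts[3].Z = 500.f;

        // One acoustic surface shared by both triangles, using a factory Acoustic
        // Texture name as a placeholder.
        static AkAcousticSurface surface;
        surface.textureID = AK::SoundEngine::GetIDFromString("Drywall");
        surface.strName = "TutorialWall";

        // Two triangles forming the quad, both pointing at surface index 0.
        static AkTriangle tris[2];
        tris[0].point0 = 0; tris[0].point1 = 1; tris[0].point2 = 2; tris[0].surface = 0;
        tris[1].point0 = 0; tris[1].point1 = 2; tris[1].point2 = 3; tris[1].surface = 0;

        AkGeometryParams params;
        params.Vertices = verts;    params.NumVertices = 4;
        params.Triangles = tris;    params.NumTriangles = 2;
        params.Surfaces = &surface; params.NumSurfaces = 1;
        params.EnableDiffraction = false; // enabled later, in section D

        AK::SpatialAudio::SetGeometry(kTutorialWallGeometry, params);
    }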

  1. In the SpatialAudioTutorialMap, the obstacle is a basic static mesh. We can easily add an AkGeometryComponent to it.
    1. Click on the actor and click on Add Component. Choose Ak Geometry.
      1. In the Geometry section, choose Simple Collision
        SATutorialAkGeometryBarrier.png
        AkGeometryComponent
  2. Repeat the same steps for the floor.
  3. In the SpatialAudioTutorialMap, the building is made out of a custom mesh. The shape is still basic enough to use an AkGeometryComponent, but for the sake of this tutorial, we will use AkSpatialAudioVolumes. The only drawback of using AkSpatialAudioVolumes is that it is tedious to create openings at the positions of the building's doors.
    1. Drag and drop three AkSpatialAudioVolumes into the scene
      1. Set one around the building for the exterior walls.
      2. Set the two others around each room for the interior walls.
      3. Make sure that "Enable Surface Reflectors" is enabled for all three AkSpatialAudioVolumes.
        1. Leave "Enable Room" and "Enable Late Reverb" unchecked; they are covered in C - Rooms and Portals.
          SATutorialAkSpatialAudioVolumeExterior.png
          Spatial Audio Volume with Enable Surface Reflectors enabled

B.3. Verify your Setup

  1. Generate SoundBanks.
  2. Start the scene and connect to Wwise Authoring.
    1. In the Advanced Profiler's Voices Graph view, you should see a new auxiliary send with the Reflect Effect.
      1. At the initial position of the player, play the sound placed outside.
        SATutorialAdvancedProfilerVoicesGraphReflect.png
        Outside button Voices Graph with Reflect

        Note:

        If you play either of the sounds from the rooms while the player is outside, you will neither hear the sound nor see any reflect sends. That's because we created closed AkSpatialAudioVolumes around the building and each room. You can create openings in them by modifying the brush object, or use Spatial Audio Portals (seen in C - Rooms and Portals).

      2. Before going to the next step, open the Profiler Settings view and make sure Spatial Audio is enabled.
    2. Navigate to the Game Object Profiler layout (shortcut F12).
      1. Make sure you watch the player camera and the three emitters.
      2. In the Game Object 3D Viewer, you should see the different reflective surfaces.
      3. When playing a sound, early reflection rays will be drawn to show where the sound will be coming from.
        SATutorialGameObject3DViewerReflect.png
        Game Object 3D Viewer with early reflections
      4. If you can't see rays, make sure Reflection Paths is enabled in the Game Object 3D Viewer Settings.
        SATutorialGameObject3DViewerSettings.png
        Game Object 3D Viewer Settings

        Note:

        If you can't see any geometry in the Game Object 3D Viewer, you may need to increase the Monitor Queue Pool Size. The setting is located in the Initialization Settings.

C - Rooms and Portals

In a realistic acoustic environment, sounds coming from an enclosed space come out through openings such as doors and windows. Spatial Audio Rooms and Portals simulate this effect by emitting sounds played inside a room only through its portals.

C.1. Wwise Project

  1. In the Wwise project, create new Auxiliary Busses for each of the rooms.
    1. Right-click where you want to add a child Auxiliary Bus
    2. Navigate to New Child > Presets and select Room Auxiliary Bus
      1. In the Effects tab, you can tweak the RoomVerb Effect.
        SATutorialAuxiliaryBusPropertyEditorEffectsSmallRoom.png
        Add a reverb effect to the Auxiliary Bus
  2. Navigate to the Sound SFX from A.2. Wwise Project Preparation.
    1. In the Positioning tab, make sure the "Enable Diffraction" checkbox is ticked.
      SATutorialSoundPropertyEditorPositioningDiffraction.png
      Enable diffraction on the sound in Wwise
  3. Save the project.

C.2. Unreal Project

  1. In Unreal, drag and drop the new Auxiliary Busses from the Waapi Picker to the Content Browser.
    1. Double-click them and assign them to the bank.
  2. Select the AkSpatialAudioVolumes placed around each of the rooms created in B.2. Unreal Project.
    1. Make sure "Enable Late Reverb" and "Enable Room" are both enabled.
    2. In the Late Reverb Section, drag and drop the new imported Auxiliary Bus from the Content Browser to the Aux Bus parameters.
  3. Add two AkAcousticPortals.
    1. Place them around the openings of the building.
    2. Select the portals and set their initial state to Open in the "Ak Acoustic Portal" section.
      SATutorialAkAcousticPortal.png
      Drag acoustic portals into the scene
      Note: The AkAcousticPortal must be oriented so that the rooms it connects lie along its local Y axis. When selecting a portal, a yellow ribbon appears around the portal to help visualize this. The yellow line indicates the boundary between the front and back areas. If rooms overlap, the room with the highest priority is selected.
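For reference, the Enable Room setting and the AkAcousticPortal actor map onto the Rooms and Portals API of the Wwise SDK. The hedged sketch below shows the equivalent native calls; the AkRoomParams/AkPortalParams field names and the kOutdoorRoomID constant are recalled from the 2019.2-era AkSpatialAudio.h and may differ in your version, and the IDs and bus name are placeholders.

    // Sketch only: what "Enable Room" plus an open AkAcousticPortal roughly amount
    // to at the SDK level. Check AkSpatialAudio.h for your SDK version.
    #include <AK/SoundEngine/Common/AkSoundEngine.h>
    #include <AK/SpatialAudio/Common/AkSpatialAudio.h>

    static const AkRoomID   kSmallRoom     = 1;  // placeholder IDs
    static const AkPortalID kSmallRoomDoor = 10;

    static void RegisterRoomAndPortal()
    {
        AkRoomParams room;
        // Orientation of the room, used for reverb panning and the room game object.
        room.Up.X = 0.f;    room.Up.Y = 1.f;    room.Up.Z = 0.f;
        room.Front.X = 0.f; room.Front.Y = 0.f; room.Front.Z = 1.f;
        // Auxiliary Bus created from the Room Auxiliary Bus preset in C.1.
        room.ReverbAuxBus = AK::SoundEngine::GetIDFromString("SmallRoomAuxBus");
        AK::SpatialAudio::SetRoom(kSmallRoom, room);

        AkPortalParams portal;
        // As noted above, the portal's local Y axis must point from the back room
        // toward the front room; Transform carries its position and orientation.
        portal.Transform.SetPosition(0.f, 0.f, 0.f);
        portal.Transform.SetOrientation(0.f, 0.f, 1.f,   // front
                                        0.f, 1.f, 0.f);  // up
        portal.FrontRoom = kSmallRoom;
        portal.BackRoom  = AK::SpatialAudio::kOutdoorRoomID; // the implicit "outside" room
        portal.bEnabled  = true; // same as setting the Initial State to Open
        // The portal's extent (its half-size) is omitted here for brevity.
        AK::SpatialAudio::SetPortal(kSmallRoomDoor, portal);
    }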

C.3. Verify your Setup

  1. Generate the SoundBanks for Windows.
  2. Start the scene and stay at the starting position. You should now hear the sounds of both rooms when triggering them.
  3. Connect to Wwise Authoring and navigate to the Game Object Profiler layout (shortcut F12). In the Game Object 3D Viewer:
    1. You should see the new portals.
    2. Under each emitter, you should see the room in which they are placed.
    3. When moving the listener from room to room, the room it is assigned to changes accordingly.
      SATutorialGameObject3DViewerRoomsAndPortal.png
      Game Object 3D Viewer using Rooms and Portals
    4. When playing a sound in a different room than the listener (with no direct line of sight):
      1. You will hear sound because portals cut through reflective surfaces.
      2. You will see early reflection paths going through the portal.
      3. You will see a sound propagation path diffracting on the edge of the portal.
        SATutorialGameObject3DViewerRoomsAndPortalAndReflect.png
        Spatial Audio paths in the Game Object 3D Viewer

        Note:

        If the world contains one or more spatial audio rooms, then the behavior of the occlusion/obstruction algorithm changes to use the additional information that spatial audio rooms provide. If the line-of-sight test between the emitter and listener fails, one of the following happens.

  • If the listener and emitter are in the same room, the Wwise obstruction filter (dry path only) is set.
  • If the emitter and listener are in different rooms, the Wwise obstruction filter (both wet and dry path) is set.
  • In the absence of spatial audio rooms, the algorithm assumes that all sounds that do not have a line of sight to the listener are occluded, and the Wwise occlusion filter (both wet and dry path) is set.

In Wwise, you can fine-tune the filter response of the portal shadow region under the Obstruction/Occlusion tab in Project Settings.

SATutorialProjectSettingsObstructionVolume.png
Obstruction volume curve
SATutorialProjectSettingsObstructionLPF.png
Obstruction LPF curve

C.4. Portals and Reverb

Sound emitted through Portals can reverberate into the room the listener is currently in. These steps show how this was configured in the Spatial Audio Tutorial map.

  1. In the Wwise project,
    1. Find the Auxiliary Bus used for the reverb of the room the sound is emitted from, whose output we want to feed into the reverbs of other rooms.
      1. In the General Settings of the Auxiliary Bus Property Editor, make sure Use game-defined auxiliary sends is enabled.
        SATutorialAuxiliaryBusPropertyEditorGeneralSettingsRoom.png
        Enable 'Use game-defined auxiliary sends' of a room Auxiliary Bus.
  2. Generate SoundBanks, start the scene, and connect to Wwise Authoring.
  3. Play the sound in the room with auxiliary sends enabled and walk to a connected room.
    1. You should see that the wet part of the emitted sound also feeds the reverb of the room the listener is in.
      SATutorialAdvancedProfilerVoicesGraphPortalReverb.png
      SmallRoom reverb feeds into LargeRoom reverb.

C.5. Room Tones

Sometimes, rooms have a specific ambient sound, like the buzzing of an air conditioner. To recreate this, you can post an Event on the Spatial Audio Room game object. When the listener is in the room, the sound is positioned at the listener's location. When the listener is in a different room, the room tone is heard through the portals connecting that room to the listener's position.

  1. In the Wwise project,
    1. Create a new Sound SFX for the room tone.
      1. Enable 'Use game-defined aux sends' if you want the sound to send to reverb
      2. Add an attenuation, if desired, for distance attenuation curves.
    2. Create a play event with the room tone by right-clicking on the Sound SFX, then selecting New Event > Play.
  2. In Unreal,
    1. Drag the Event created in the previous subsection from the Waapi Picker to the Content Browser.
    2. Add this event to the Ak Audio Event parameter, under the Ak Event section, of one of your rooms.
      1. Adjust the Aux Send Level to feed some of the sound to the reverb of the room.
      2. You can choose to check the Auto Post box to post the room tone event on BeginPlay, or you can call the same blueprint functions usually used to post events on game objects.
        SATutorialLargeRoomAkAudioEvent.png
        The AkEvent section of a Spatial Audio Volume with Room enabled.
    3. In the Spatial Audio Tutorial map, we used blueprint functions in the Level Blueprint to activate and deactivate the room tone.
      1. With the AkSpatialAudioVolume that has a room tone selected in the World Outliner, right-click in the Level Blueprint to create a reference to it.
      2. Drag a connection from the reference and search for "Post Associated Ak Event".
      3. In the same way, search for the Stop function.
      4. Add a key press as an input node.
        SATutorialLevelBlueprintRoomTone.png
        Play and stop a room tone from the Level Blueprint
  3. Generate SoundBanks.
  4. Start the scene and connect to Wwise Authoring.
  5. Navigate to the room with a room tone and press the key to start the room tone. Verify that you can hear it.
    1. In the Advanced Profiler view, you should see the event being played.
      SATutorialRoomToneAdvancedProfiler.png
      Advanced Profiler view when playing a room tone
    2. In the Game Object 3D Viewer, you can watch the room game object as you move the listener around.
      1. If the listener is in the room, the room will emit at the position of the listener. You will see the room game object follow the listener game object.
      2. If the listener is in a different room, the room game object will be placed at the portal and a path will be drawn between it and the listener game object.
        SATutorialRoomToneGameObject3DViewer.png
        Watch the room game object in the Game Object 3D Viewer

D - Diffraction

When the line of sight between the emitter and the listener is obstructed by an object, Spatial Audio can create diffraction paths that go around the object and simulate realistic behavior. Depending on the angle of the path around an edge, the sound is attenuated using obstruction.

Note:

When using Spatial Audio diffraction, disable the Unreal Engine-side obstruction/occlusion: set the Occlusion Refresh Interval of your emitters (AkComponent) to 0.

In the SpatialAudioTutorialMap, we can add Spatial Audio diffraction around the obstacle outside and the exterior walls of the building.
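At the geometry level, diffraction is something you opt into per geometry set. Continuing the hedged SetGeometry sketch from section B.2, enabling it only means flipping the diffraction flags before the geometry is (re)sent; the flag names are assumptions based on the 2019.2-era headers.

    // Sketch only: re-send a geometry set with diffraction enabled so Spatial Audio
    // can compute paths around its edges. Flag names: verify in AkSpatialAudio.h.
    #include <AK/SpatialAudio/Common/AkSpatialAudio.h>

    static void EnableDiffractionOnGeometry(AkGeometryParams& params, AkGeometrySetID geometryID)
    {
        params.EnableDiffraction = true;                 // compute diffraction on this geometry's edges
        params.EnableDiffractionOnBoundaryEdges = false; // leave open boundary edges out of it
        AK::SpatialAudio::SetGeometry(geometryID, params);
    }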

  1. In the Wwise project, make sure that diffraction is enabled on the Sound SFX objects played by the emitters we want to use diffraction for.
    1. Enable Diffraction in the Positioning tab of the Sound Property Editor.
  2. In Unreal,
    1. Click on each AkAmbientSound actor that will be emitting diffraction-enabled sounds.
      1. Set Occlusion Refresh Interval to 0.
    2. Click on the AkSpatialAudioVolume for the exterior walls of the building.
      1. In the Acoustic Surface Properties section, enable Diffraction by checking the checkbox.
    3. Click the obstacle static mesh outside.
      1. In the Geometry section, under Diffraction, enable Diffraction by checking the checkbox.
  3. Generate SoundBanks.
  4. Play and connect to Wwise.
    1. You should see diffraction edges on the diffraction-enabled geometry, as well as diffraction paths when a diffraction-enabled sound plays while obstructed from the listener.
      SATutorialGameObject3DViewerDiffraction.png
      Diffraction paths in the Game Object 3D Viewer

E - Transmission

When an object appears between the emitter and the listener, the sound can also pass through the object. Spatial Audio models this phenomenon by applying occlusion filtering on the direct path of the sound. Occlusion values ranging from 0 to 1 can be applied to AkSpatialAudioVolume or AkGeometry components. The occlusion value is then sent to Wwise, where the associated filter is applied according to the occlusion curves of the project. Typically, a value of 1.0 represents full occlusion, and a value of 0.0 indicates that sound can be transmitted through the geometry.
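In SDK terms, that per-surface value travels with the acoustic surface description sent alongside the geometry, as sketched below. The occlusion field name matches the 2019.2-era AkAcousticSurface as recalled here (later SDK versions express the same idea as transmission loss), so treat it as an assumption to verify in AkSpatialAudio.h.

    // Sketch only: give one acoustic surface of a geometry set an occlusion value
    // before it is sent with AK::SpatialAudio::SetGeometry.
    #include <AK/SpatialAudio/Common/AkSpatialAudio.h>

    static void MakeSurfaceMostlyOpaque(AkAcousticSurface& surface)
    {
        // 1.0 = fully occluded, 0.0 = sound passes through the surface freely.
        surface.occlusion = 0.8f;
    }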

  1. Enable transmission in the Spatial Audio Init Settings.
    SATutorialInitializationSpatialAudioSettings.png
    Spatial Audio Init Settings
  2. The transmission value of a sound is the maximum occlusion value of spatial audio geometries or rooms that the sound encounters in a direct path from the emitter to the listener.
    1. On AkSpatialAudioVolume components,
      1. an occlusion value can be associated with each acoustic surface, if the component enables surface reflectors
        SATutorialAkSpatialAudioVolumeAcousticSurfaceOcclusion.png
        Occlusion Values for each acoustic surface
      2. an occlusion value can be associated with the walls of the room in Wall Occlusion, if the component enables rooms.
        SATutorialAkSpatialAudioVolumeWallOcclusion.png
        Wall Occlusion of the Room
    2. On AkGeometry components,
      1. the occlusion value can be overridden in the Acoustic Properties Override parameter
        1. for the collision mesh
          SATutorialAkGeometryAcousticPropertiesOverrideCollision.png
          Override the occlusion value of the simple collision mesh in the AkGeometry component
        2. or for each material of the static mesh
          SATutorialAkGeometryAcousticPropertiesOverrideStatic.png
          Override the occlusion value of the static mesh in the AkGeometry component
      2. the occlusion value can also be set per Physical Material in the AkGeometry Surface Properties Map of the Integration Settings
        SATutorialAkGeometrySurfacePropertiesMap.png
        Associate occlusion values to Physical Materials
  3. Adjust the occlusion curves in Wwise Authoring in the Obstruction/Occlusion tab of the Project Settings.
    SATutorialProjectSettingsOcclusionVolume.png
    Occlusion volume curve
    SATutorialProjectSettingsOcclusionLPF.png
    Occlusion LPF curve