
How to use AudioLink in Unreal Engine

Wwise Tips & Tools

Introduction

In this article, we focus on Unreal Engine's AudioLink.

The operation has been confirmed using the following versions:

  • Wwise 2023.1.3.8471
  • Unreal Engine 5.3.2

Please note that different versions may operate differently.

Table of Contents

Introduction
Table of Contents
What is AudioLink?
Diagram: How Does It Work?
Preparing for a new project
  Integration and project creation
AudioLink - Setup
  Wwise Project Settings
  Configuring Unreal Integration
  Unreal Settings (Optional)
    Sound Attenuation Settings
    Sound Submix Settings
AudioLink - Playing the sound
  When specifying Sound Attenuation (Blueprint node)
  When specifying Sound Attenuation (Audio Component)
  When specifying Wwise AudioLink Settings
  Sound Submix
Conclusion

What is AudioLink?

AudioLink is an Unreal Engine feature, available from Unreal Engine 5.1 onwards, that allows the Unreal Audio Engine to be used alongside audio middleware.

This allows you to simultaneously use UE's own sound solutions, such as MetaSounds, in combination with audio middleware like Wwise.

Diagram: How Does It Work?

Cover_EN

Starting from the right, on the Wwise side, the Audio Input plug-in receives the output from AudioLink, and the Wwise AudioLink Settings asset manages the AudioLink output to Wwise. The Sound Attenuation, in turn, holds a reference to the Wwise AudioLink Settings. (Note: Here, Sound Attenuation is an Unreal asset from Unreal's own audio system. AudioLink only works if your Unreal sound is using a Sound Attenuation.)

In Unreal Engine, some Blueprint nodes allow you to specify a Sound Attenuation as the output method, while others do not allow you to specify anything.

Audio Components, such as those playing a Sound Cue or a MetaSound, allow you to specify a Sound Attenuation, or a Wwise AudioLink Settings asset directly.

You can also route output to AudioLink through a Sound Submix by assigning the Wwise AudioLink Settings to it. The Sound Submix can capture most audio output within Unreal, although some plug-ins, such as Text-to-Speech, cannot be used out of the box.

The following table briefly summarizes the information.

02_Method_Table

Preparing for a new project

Integration and project creation

Please refer to the previous blog and complete the steps up to "Creating a Project".

AudioLink - Setup

(Note: If you don't hear any sound after making these settings, try restarting Unreal Engine.)

Wwise Project Settings

Starting in Wwise Authoring:

  1. Create any Sound SFX. Let's name it AudioInput.

03_Create_AudioInput

  2. Set Audio Input as Source.

04_AddSource_AudioInput

  3. Create an Event to play that SFX.

05_Create_Event

This completes the sound input settings on the Wwise side.

 

Configuring Unreal Integration

Now switch to Unreal Editor:

  1. Go to Edit - Project Settings… - Wwise - Integration Settings, and under Initialization, set Unreal Audio Routing to “Route through AudioLink [UE5.1+]”. After this, you will be asked to restart.

06_Set_Unreal_Audio_Routing

  2. Click the Generate SoundBanks… button and go through the Generate SoundBanks dialog to generate a SoundBank containing the Play_AudioInput Event added above.

07_Generate_SoundBanks

  3. Press the Reconcile button.

08_Push_Reconcile

  4. In the Reconcile Unreal Assets dialog, click the Reconcile Unreal Assets button to generate a UAsset. (You can generate a UAsset in several other ways.)

09_Reconcile

  5. From the Add button in the Content Browser, create Audio - AudioLink - Wwise AudioLink Settings.

10_Create_WwiseAudioLinkSettings

  6. Edit the generated NewWwiseAudioLinkSettings and set the Play_AudioInput (AkAudioEvent asset) created above as the AudioLink - Start Event.

11_Set_AudioInput

Now that we're ready to feed the AudioLink output into Wwise, we'll need to configure the Unreal Engine audio output to feed into AudioLink. 

Unreal Settings (Optional)

Sound Attenuation Settings

This can be used with playback methods that allow you to specify a Sound Attenuation, such as the Play Sound at Location node and Audio Components.

1. Create Audio - Sound Attenuation from the Add button in the Content Browser.

12_Add_SoundAttenuation

2. Edit the NewSoundAttenuation you created and set the NewWwiseAudioLinkSettings you created above in Attenuation (AudioLink) - AudioLink Settings Override. Also, if you do not need attenuation, uncheck Enabled in the attenuation sections other than AudioLink.

13_Set_NewWwiseAudioLinkSettings

Sound Submix Settings

Create this configuration if you want to output sound to AudioLink using Sound Submix.

It can also be used with playback methods that do not allow you to specify a Sound Attenuation, such as Play Sound 2D, but please note that it is always enabled.

1. From the Add button in the Content Browser, create an Audio - Mix - Sound Submix.

14_Create_SoundSubmix

2. Edit the generated NewSoundSubmix, enable Send to Audio Link in the Audio Link section, and set the NewWwiseAudioLinkSettings created above in Audio Link Settings.

15_Set_NewWwiseAudioLinkSettings

That's all for the settings. If you don't want to use it, you can disable it by unchecking Send to Audio Link.

AudioLink - Playing the sound

When playing a sound with Sound Attenuation or Wwise AudioLink Settings specified, the AudioInput will only appear in the Voice Graph during playback.

If Sound Submix is enabled, it becomes difficult to understand the behavior when specifying Sound Attenuation / Wwise AudioLink Settings, so we recommend not using Sound Submix when checking these operations.

When specifying Sound Attenuation (Blueprint node)

To set Sound Attenuation for sound playback from a Blueprint node:

1. Open Level Blueprint from the menu below.

16_Open_LevelBlueprint

2. Right-click on the Blueprint screen and create a BeginPlay Event.

17_Create_BeginPlay

3. Draw a connector off the BeginPlay Event and create a Play Sound at Location node.

18_Create_PlaySoundatLocation

4. Set Sound to Starter_Music_Cue.

19_Set_StarterMusicCue

5. Set NewSoundAttenuation in Attenuation Settings.

20_Set_NewSoundAttenuation

If you compile it in this state and run it from the editor, you should be able to see in the Wwise Profiler that Starter_Music_Cue is playing via Play_AudioInput (the meters will move, but you won't be able to see what is actually playing from within Wwise).

21_Advanced_Profiler
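For reference, the same playback can also be triggered from C++. The snippet below is a minimal sketch, not part of the integration itself: the actor class and asset paths are placeholder assumptions, and it simply passes the NewSoundAttenuation created above to UGameplayStatics::PlaySoundAtLocation so the playback is routed through AudioLink.

    // Minimal sketch: playing a sound through AudioLink from C++.
    // AMyAudioTestActor and the asset paths are placeholders for this article's assets.
    #include "Kismet/GameplayStatics.h"
    #include "Sound/SoundAttenuation.h"
    #include "Sound/SoundCue.h"

    void AMyAudioTestActor::BeginPlay()
    {
        Super::BeginPlay();

        // Load the Sound Cue and the Sound Attenuation that references the Wwise AudioLink Settings.
        USoundCue* MusicCue = LoadObject<USoundCue>(nullptr,
            TEXT("/Game/StarterContent/Audio/Starter_Music_Cue"));
        USoundAttenuation* Attenuation = LoadObject<USoundAttenuation>(nullptr,
            TEXT("/Game/NewSoundAttenuation"));

        if (MusicCue && Attenuation)
        {
            // Because the attenuation asset carries an AudioLink Settings Override,
            // this playback reaches Wwise via the Play_AudioInput Event.
            UGameplayStatics::PlaySoundAtLocation(
                this, MusicCue, GetActorLocation(),
                /*VolumeMultiplier*/ 1.0f, /*PitchMultiplier*/ 1.0f,
                /*StartTime*/ 0.0f, Attenuation);
        }
    }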

When specifying Sound Attenuation (Audio Component)

Now let's try it with MetaSounds.

1. Create Audio - MetaSound Source from the Add button in the Content Browser.

22_Add_MetaSoundSource

2. Double-click on the NewMetaSound Source you created, and then right-click on the screen that appears to create a Noise node.

23_Create_Noise

3. Connect the Audio output to Out Mono.

24_Set_OutMono

4. Drag and drop the NewMetaSound Source from the Content Browser into the viewport.

25_Drag_NewMataSoundSource

5. Click on the object placed in the viewport and select AudioComponent from Details.

26_Select_AudioComponent

6. Set NewSoundAttenuation in Attenuation Settings.

27_Set_NewSoundAttenuation

If you play the level in this state, you will be able to hear the MetaSound's pink noise via AudioLink.
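If you prefer C++ over the Details panel, the same Audio Component setup can be sketched as follows. This is a minimal, hedged example: the actor and function names are placeholders, and it only assigns the NewSoundAttenuation (which carries the AudioLink Settings Override) before playing.

    // Minimal sketch: assigning a Sound Attenuation to an Audio Component from C++.
    // AMyMetaSoundActor and ConfigureAudioLink are placeholder names.
    #include "Components/AudioComponent.h"
    #include "Sound/SoundAttenuation.h"

    void AMyMetaSoundActor::ConfigureAudioLink(UAudioComponent* AudioComponent,
                                               USoundAttenuation* Attenuation)
    {
        if (AudioComponent && Attenuation)
        {
            // Equivalent to setting "Attenuation Settings" in the Details panel;
            // playback from this component is then routed to Wwise via AudioLink.
            AudioComponent->AttenuationSettings = Attenuation;
            AudioComponent->Play();
        }
    }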

When specifying Wwise AudioLink Settings

Let's try setting it up using the NewMetaSound Source installed above.

1. In the Audio Component, open Attenuation and enable Override Attenuation.

28_Enable_OverrideAttenuation

2. In Attenuation (AudioLink) - AudioLink Settings Override, set the Wwise AudioLink Settings.

29_Set_NewWwiseAudioLinkSettings

In this case, instead of referencing the Sound Attenuation, you can use the settings in the Wwise AudioLink Settings directly.
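In C++ this corresponds roughly to enabling the attenuation override on the component and filling in the AudioLink settings override field. The sketch below assumes the UE 5.3 property names (bOverrideAttenuation, AttenuationOverrides, AudioLinkSettingsOverride) and the UAudioLinkSettingsAbstract base class; verify them against your engine version.

    // Minimal sketch: using the Wwise AudioLink Settings asset directly,
    // without going through a Sound Attenuation asset.
    #include "Components/AudioComponent.h"
    #include "AudioLinkSettingsAbstract.h" // header location may vary by engine version

    void ApplyAudioLinkOverride(UAudioComponent* AudioComponent,
                                UAudioLinkSettingsAbstract* WwiseAudioLinkSettings)
    {
        if (AudioComponent && WwiseAudioLinkSettings)
        {
            // Equivalent to checking "Override Attenuation" in the Details panel...
            AudioComponent->bOverrideAttenuation = true;
            // ...and setting "Attenuation (AudioLink) - AudioLink Settings Override".
            AudioComponent->AttenuationOverrides.AudioLinkSettingsOverride = WwiseAudioLinkSettings;
        }
    }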

Sound Submix

Let’s use PlaySound2D and confirm that it can be played via AudioLink without setting Sound Attenuation, etc.

1. Draw a connector off the BeginPlay Event of the Level Blueprint and create a Play Sound 2D node.

30_Create_PlaySound2D

2. Set Sound to Collapse_Cue.

31_Set_CollapseCue

Please note that Play Sound 2D does not have a place to set Sound Attenuation, etc.

If you compile and run it in this state, you should hear the sound of the Collapse_Cue that has been set.

Even if Send to Audio Link on the Sound Submix is disabled, you will still hear the sound, but since it does not pass through Wwise, the meters in Wwise will not move. Conversely, if it is enabled, the meters in Wwise will move.
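For completeness, Play Sound 2D also has a direct C++ equivalent. This minimal sketch uses a placeholder asset path; note that there is no attenuation parameter, so the sound only reaches Wwise when the Sound Submix's Send to Audio Link is enabled.

    // Minimal sketch: 2D playback with no Sound Attenuation. It is only routed to
    // Wwise via AudioLink when the Sound Submix has Send to Audio Link enabled.
    #include "Kismet/GameplayStatics.h"
    #include "Sound/SoundCue.h"

    void PlayCollapseCue(const UObject* WorldContextObject)
    {
        // Placeholder path; point it at the Collapse_Cue asset used above.
        USoundCue* CollapseCue = LoadObject<USoundCue>(nullptr,
            TEXT("/Game/StarterContent/Audio/Collapse_Cue"));

        if (CollapseCue)
        {
            UGameplayStatics::PlaySound2D(WorldContextObject, CollapseCue);
        }
    }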

Conclusion

Using AudioLink to deliver sound from Unreal Audio opens the door to creative tools and techniques that can work in concert with Wwise towards the best representation for your interactive audio experience. It is a way to aid in the prototyping of systems, speed up the development of deeply synchronized audio-visual feedback, and unlock the potential available using both Unreal Audio and Wwise. We would like to continue to provide easy-to-understand explanations of features that work reliably, so if you have any requests for further explanations of these features, please let us know!

Hiroshi Goda

Senior Field Application Engineer

Audiokinetic


Hiroshi is a Senior Field Application Engineer and Technical Evangelist at Audiokinetic K.K. (Japan), and provides technical support to customers who have implemented Wwise, or are considering doing so. His career as a game programmer began in 1997 with the PlayStation and continued through the PlayStation 3, after which he started working in mobile games. Hiroshi moved on to the cyber security industry in 2016, but returned to gaming in 2021 when he began his current role. With experience in color reduction tools and the development of PS1 video standards, he holds a patent in layered drawing using 3-value α movies, and he is actually more of a visuals guy than an audio one.
