Hi, I want to play audio through a secondary output (HTC Vive headphones) while simultaneously playing audio through the main output (TV speakers) for a single-player VR experience. I am using Unity3D 2019.3 and Wwise 2019.2.1. I am able to play audio out of the two audio devices in Wwise by following this guide by Ed Kashinsky, creating a separate Audio Device ShareSet "System_VR" and a new ...
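On the engine side, a second output like this is typically added with AK::SoundEngine::AddOutput. Below is a minimal native sketch assuming the "System_VR" ShareSet from the post; how you obtain the VR headset's device ID is platform-specific (on Windows, helpers such as AK::GetDeviceID exist), and the Unity integration exposes the same call through its generated AkSoundEngine bindings.

```cpp
#include <AK/SoundEngine/Common/AkSoundEngine.h>

// Adds a second output bound to the "System_VR" Audio Device ShareSet created
// in the Wwise project. in_vrDeviceId is the platform-specific hardware ID of
// the Vive's audio endpoint; it is a placeholder here.
AkOutputDeviceID AddVrOutput(AkUniqueID in_vrDeviceId)
{
    // The ShareSet name must match the Audio Device ShareSet in the project.
    AkOutputSettings settings("System_VR", in_vrDeviceId);

    AkOutputDeviceID outputId = 0;
    if (AK::SoundEngine::AddOutput(settings, &outputId) != AK_Success)
        return 0;

    // Busses in the project whose Audio Device is "System_VR" now render to
    // the Vive, while the Main Mix keeps playing on the TV speakers.
    return outputId;
}
```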
I tested this myself and found a solution: you can create a Music Segment containing a MIDI clip with a single MIDI note as a Stinger (to play your stinger in the Actor-Mixer Hierarchy), so that its playback time can be scheduled by the interactive music system. Then move the music content you want to play from the Interactive Music Hierarchy to the Actor-Mixer Hierarchy. There are two advantages to this: ...
We've been experiencing this issue, and today I found the culprit and a fix for our case. I'll explain what I did, and hopefully it will help anyone who has encountered it. I believe this to be a bug. We recently enabled Spatial Audio on our project and had been seeing this message excessively: "3D audio object limit exceeded; object xxxxx instigated by voice/bus will be mixed." Upon thorough ...
A music event cue feature was also added in version 2019.1, giving you the ability to influence parameters related to game mixes by posting Events from the music timeline. Hope this helps!
If so, is there any AK API that could help us convert the raw data to stereo channels according to positions? It seems channels could be mixed on the application side using utilities such as CAkSpeakerPan::ComputePositioning and AkMixer::MixNinNChannels, but we don't have access to AkDevice (at least). Is there any configuration flag we could toggle so the capture callback will provide us with stereo ...
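For reference, the capture path itself is exposed through AK::SoundEngine::RegisterCaptureCallback, which hands you a deinterleaved AkAudioBuffer. Below is a minimal sketch that folds that buffer down to interleaved stereo with a plain per-side average; this fold-down is an assumption-level stand-in, not the constant-power panning CAkSpeakerPan would compute.

```cpp
#include <AK/SoundEngine/Common/AkSoundEngine.h>
#include <AK/SoundEngine/Common/AkCommonDefs.h>
#include <vector>

// Naive fold-down: even channels feed the left, odd channels feed the right.
static void CaptureToStereo(AkAudioBuffer& in_CaptureBuffer, AkOutputDeviceID, void* /*in_pCookie*/)
{
    const AkUInt32 uChannels = in_CaptureBuffer.NumChannels();
    const AkUInt16 uFrames = in_CaptureBuffer.uValidFrames;

    std::vector<float> stereo(2u * uFrames, 0.0f);
    for (AkUInt32 ch = 0; ch < uChannels; ++ch)
    {
        const AkSampleType* pIn = in_CaptureBuffer.GetChannel(ch); // deinterleaved
        const AkUInt32 side = ch & 1;                              // 0 = left, 1 = right
        const float norm = 1.0f / (float)((uChannels + 1u - side) / 2u); // channels on this side
        for (AkUInt16 i = 0; i < uFrames; ++i)
            stereo[2u * i + side] += pIn[i] * norm;
    }
    // ... forward `stereo` (interleaved L/R) to the voice-chat pipeline here ...
}

// Registered once after AK::SoundEngine::Init():
// AK::SoundEngine::RegisterCaptureCallback(CaptureToStereo);
```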
IMHO, this should return a list of paths, each corresponding to a valid language of the project. Similarly, if a SoundBank has mixed localized and non-localized content, one would expect the query to return the paths to all existing BNK files. Hope this helps. Thank you Beinan for reporting the issue; WG-43852 was created.
Did you ever find a solution? Currently I'm experimenting with a mix of Stingers, event cues, and MIDI triggers, but haven't found the perfect solution yet.
You are not allowed to use existing sounds. You can, however, re-use the existing Actor-Mixer structures, but this is not something I would personally recommend. It is probably easier to start from scratch, using only the Events and Game Parameters from the project. Great stuff, thanks very much for the reply!
Cannot assign an Auxiliary Bus for the "actor-mixer" (main character) (in lesson 5 of the Wwise-101 course). I am having this same problem; did you ever find the answer? I cannot for the life of me figure this one out... I had the same problem. I deleted the newly created Aux Bus (env_corridor), then created a new Aux Bus. Before assigning an effect to that Aux Bus, I went and routed the audio from the main character ...
On the Wwise side of things, I have my sound file imported into the Wwise project and assigned to an output bus that has the OSP added in the Mixer Plug-in tab after creating an Audio Bus. I have the Positioning settings for the sound source set to 3D and the position source set to Game Defined. Is there anything missing here that may be causing the OSP not to spatialise the sound it's receiving?
I'm trying to build my project in Android Studio with the NDK but get an error when linking Wwise's libAuroPannerMixer.a. NDK version 19.2, compiling with clang, Wwise version 2017.1.3.6377. Does someone have the same problem?
Application.mk:
APP_STL := c++_static
APP_CPPFLAGS := -frtti -DCC_ENABLE_CHIPMUNK_INTEGRATION=1 -std=c++11 -fsigned-char -Wno-extern-c-compat
APP_LDFLAGS := -latomic -stdlib=libc++ -lstdc++ ...
I am looking for some insight into the best way to create a distance crossfade for 3rd-person guns. I'm in the process of remixing my game using the new HDR system, and in doing so I wanted to try to clean up my messy hack for distance crossfades and improve the system. My current setup for 3rd-person guns is as follows: the parent is a random looping container, ten blend ...
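When the crossfade is driven by a game-side parameter rather than Wwise's built-in Distance, the game-side feed usually looks like the hedged sketch below; the RTPC name "Gun_Distance" and the game object are placeholders, not anything from the original post.

```cpp
#include <AK/SoundEngine/Common/AkSoundEngine.h>
#include <cmath>

// Push the emitter-listener distance into an RTPC each frame so a Blend
// Container mapped to that Game Parameter can crossfade close/mid/far layers.
void UpdateGunDistance(AkGameObjectID in_gunObj,
                       float ex, float ey, float ez,   // emitter position
                       float lx, float ly, float lz)   // listener position
{
    const float d = std::sqrt((ex - lx) * (ex - lx) +
                              (ey - ly) * (ey - ly) +
                              (ez - lz) * (ez - lz));
    AK::SoundEngine::SetRTPCValue("Gun_Distance", d, in_gunObj);
}
```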
Hi, I would like to use the Wwise Meter plug-in to drive a Time Stretch effect on an object. The Time Stretch plug-in isn't working in an Audio Bus, so I wonder how I could use these two plug-ins together. Do you have any idea? The basic idea is to use a random container with some glitches in it. This random container is in an Audio Bus with the Wwise Meter plug-in. I have an SFX in an Actor-Mixer.
Hi, when I use the WAAPI call ak.wwise.core.object.getPropertyAndReferenceNames, how can I best use the results to find all the Event references for that audio object? What would the WAAPI call(s) and results look like if I want to get the Event references for a given audio object? Thanks! ak.wwise.core.object.getPropertyAndReferenceNames is not what you are looking for. ak.
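One way to get there is ak.wwise.core.object.get with a WAQL query. Below is a minimal sketch assuming the AkAutobahn sample WAAPI client shipped with the SDK (SDK/samples/WwiseAuthoringAPI/cpp) and a Wwise version with WAQL (2021.1+); the object path is hypothetical. Since Events reference sounds through their Actions, the query walks referencesTo, keeps Actions, and takes their parent Events.

```cpp
#include <AK/WwiseAuthoringAPI/AkAutobahn/Client.h>
#include <iostream>
#include <string>

int main()
{
    AK::WwiseAuthoringAPI::Client client;
    if (!client.Connect("127.0.0.1", 8080))
        return 1;

    // referencesTo -> the Actions pointing at the sound -> their parent Events.
    const char* args =
        R"({"waql": "$ \"\\Actor-Mixer Hierarchy\\Default Work Unit\\MySound\" select referencesTo where type = \"Action\" select parent distinct"})";
    const char* options = R"({"return": ["name", "path"]})";

    std::string result;
    if (client.Call("ak.wwise.core.object.get", args, options, result))
        std::cout << result << std::endl; // JSON array of the referencing Events

    return 0;
}
```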
Hi there, I'd like to be able to use an external MIDI controller to control things within the Music Hierarchy, but it seems I can only do this within the Actor-Mixer one. Is it feasible to get this working for music objects too? Or, ideally, to be able to control RTPC game syncs within Soundcaster sessions using it. Basically it's just a little easier to use the MIDI controller while playing the ...
I want to create a structure of Actor-Mixers, Sounds, and Switch Containers. I can create the objects themselves using ak.wwise.core.object.set, but I don't see a way to assign the switch container states through that call. The only option I see is to create the objects using object.set, then get the current assignments for each switch container individually using ak.wwise.core.switchContainer.getAssignments ...
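After creating the structure with ak.wwise.core.object.set, each switch assignment is one extra call to ak.wwise.core.switchContainer.addAssignment. A minimal sketch, again assuming the AkAutobahn sample client; the GUIDs are placeholders for the IDs object.set returns.

```cpp
#include <AK/WwiseAuthoringAPI/AkAutobahn/Client.h>
#include <string>

// Assigns one child of a Switch Container to a Switch/State value.
// childId and switchId are GUIDs, e.g. as returned by ak.wwise.core.object.set.
bool AssignChildToSwitch(AK::WwiseAuthoringAPI::Client& client,
                         const std::string& childId,
                         const std::string& switchId)
{
    const std::string args =
        "{ \"child\": \"" + childId + "\", \"stateOrSwitch\": \"" + switchId + "\" }";
    std::string result;
    return client.Call("ak.wwise.core.switchContainer.addAssignment",
                       args.c_str(), "{}", result);
}
```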
Hi, I wonder if the default mode is to downmix stereo-channel files to mono if they are positioned in 3D? //David. Hi David, all channels are preserved and panned around the listener according to the 'Spread' (and 'Focus', starting with 2014.1) properties to make these channels more or less "point source". A classic example would be a stereo 'waterfall' SFX that is point-source from a distance ...
Hi all, I would love some clean way to return the common directories listed on the "defining custom commands" page below. Right now it's a pretty convoluted workaround having to use ak.wwise.core.object.get with some WAQL magic to get directories. Ideally there would also be a way to address these child directories, i.e. ${WwiseProjectRoot}/Actor-Mixer Hierarchy or something along those lines, to avoid ...
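For completeness, the WAQL workaround mentioned above can look like the sketch below: fetch the Project object's filePath and trim the file name to recover ${WwiseProjectRoot}. The client is again the AkAutobahn SDK sample.

```cpp
#include <AK/WwiseAuthoringAPI/AkAutobahn/Client.h>
#include <string>

// Returns the raw JSON answer holding the .wproj path; strip the file name
// from "filePath" with your JSON library of choice to get the project root.
std::string GetProjectRootJson(AK::WwiseAuthoringAPI::Client& client)
{
    std::string result;
    client.Call("ak.wwise.core.object.get",
                R"({"waql": "$ from type Project"})",
                R"({"return": ["filePath"]})",
                result);
    return result; // e.g. {"return":[{"filePath":"C:\\...\\MyProject.wproj"}]}
}
```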
When playing music from the music player, Wwise stops every sound, as it should, but when the user pauses the music player, no sound is heard anymore, not even when new PostEvent calls are sent. Is there any function I can call when the user pauses the music player to resume Wwise activity? Thanks! What audio session category do you use? By default, Wwise uses kAudioSessionCategory_SoloAmbientSound.
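One pattern worth trying (an assumption here, not something confirmed in the thread) is to drive the sound engine's suspend state from the app's audio-session interruption notifications, using the Suspend/WakeupFromSuspend pair available in current SDKs:

```cpp
#include <AK/SoundEngine/Common/AkSoundEngine.h>

// Call these from your platform-side AVAudioSession interruption handlers
// (the notification wiring itself is Objective-C and omitted here).
void OnAudioInterruptionBegan()
{
    // Stop rendering while the external music player owns the audio session.
    AK::SoundEngine::Suspend();
}

void OnAudioInterruptionEnded()
{
    // Resume rendering; subsequent PostEvent calls are audible again.
    AK::SoundEngine::WakeupFromSuspend();
}
```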
Each spell's sounds are inside a random container (PlayerFire_Fire, PlayerFire_Ice, PlayerFire_Lightning), and all three of them are routed into an Actor-Mixer. This Actor-Mixer allows me to make a positioning automation to match the movement of the player character for all three spells in one go. Every random container shares the same settings (I copied/pasted/renamed the first one to make sure ...