C# anonymous types:

    // Create an ActorMixer named "WaapiObject" under the Default Work Unit,
    // renaming it automatically if the name is already taken.
    var testObj = await client.Call(
        ak.wwise.core.@object.create,
        new
        {
            name = "WaapiObject",
            parent = @"\Actor-Mixer Hierarchy\Default Work Unit",
            type = "ActorMixer",
            onNameConflict = "rename"
        },
        null);

Thanks a lot! The 2019 version of the manual is a great upgrade from the 2018 one; it has become much more useful.
Today, Nuendo empowers audio professionals to reach higher levels of productivity: discerning post-production editors, film mixers, and recording engineers around the world rely on the flexibility and industry openness that Nuendo provides. Game sound designers increasingly depend on Nuendo as well; its highly configurable program structure and many sound-design tools help them create and manipulate audio.
To that end specifically, to help the interactive score sit in the mix during gameplay, we use a system that Paul Lipson (the audio director on Halo Wars 2) developed when he was at Microsoft. It adapts to the submixes of the music in Wwise to help offset some of the masking and mud you’d get when intense sound design moments collide with music in the mix. For example, if there are a ton of explosions ...
Physically, a throttle controls the amount and mixture of air (intake) and gasoline flowing through our wind instrument. With complex engine simulations there can be more parameters, but this set of three is what's needed to simulate basic engine sound behaviour. Next, we'll have a look at how Rpm and Load work. If you're familiar with these, feel free to skip ahead to asset creation.
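The relationship between throttle and Rpm described above can be sketched as a first-order lag: RPM chases a target set by the throttle, rising quickly at first and then levelling off. This is a toy model for illustration only, not the article's actual engine simulation; the function name and constants are made up.

```python
def step_rpm(rpm, throttle, dt, max_rpm=7000.0, response=2.0):
    """Advance a toy RPM model by dt seconds: RPM chases the target
    set by the throttle (0.0..1.0) with a first-order lag, so flooring
    the throttle raises RPM fast at first, then levels off near max."""
    target = throttle * max_rpm
    # response * dt is the fraction of the remaining gap closed this step,
    # clamped so a large dt can never overshoot the target
    return rpm + (target - rpm) * min(1.0, response * dt)
```

In a real project, the resulting RPM (and a Load value derived from throttle demand versus engine speed) would typically be fed to the audio engine as game parameters each frame.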
Ultimately, we felt that a grounded, futuristic feel was more relatable to the current player base and fit the narrative better, and this decision is felt in how the team built much of the audio for the experience. For our embodiable and playable characters we used a combination of effects, position, and mixing to give the player a good sense of who they are in the scene. For David this meant creating a ...
Last year, I started exploring various approaches to spatialization using Unreal Engine and Wwise as part of my final project for my degree at the University of Edinburgh. This unearthed some exciting results that I'd like to share with you in this blog.

From audio files to endpoint (Audio Objects, Main Mix, and Passthrough Mix)

First, let's discuss the three bus configurations and how I have approached ...
Introduction A while back, we were joined by special guests Stefan Randelshofer and Matheus Vilano on a livestream, who lifted the hood on mixing Baldur's Gate 3. They went hands-on in Wwise, and dived deep into the ideation behind the systems they implemented. From tools and techniques, to achieving frequency-based clarity with side-chaining, to using the Mastering Suite - they did a thorough walkthrough ...
To avoid any clashes with the music playing at the time, I placed the main musical content on its own mix bus, placed the stingers on a second, isolated mix bus, and applied a channel-ducking function. Any time a musical sting was triggered, the mixer would immediately drop the volume of the main music, allow the sting to play, and then, over two or three seconds, slowly raise the main music back ...
Teaching Andre how to mix music seemed straightforward. Teaching him sound design for games and audio implementation with Wwise was a different matter entirely. To complicate matters, Andre would be learning on a Mac, and Mac's VoiceOver accessibility tool cannot speak most of the interface in Wwise, rendering it invisible to a student like Andre. We spent several weeks trying to devise a solution ...
And there are a host of other new features that speak directly to interactive mixing, for example, building on the Meter Effect to arrive at the Multiband Meter Effect, the addition of audio-rate sidechaining capabilities, and the ability to mute and solo Real Time Parameter Controls (RTPCs). These controls provide incredible new ways to shape runtime dynamics. Meanwhile, quality of life updates to ...
This is an obvious argument for a clear and well-mixed game where audio cues can always be heard along with their visuals.

Audio produces a stronger physiological response than visuals alone

This is mostly common knowledge at this point. We all know that music and sound paired with a visual is significantly more arousing or stressful than a visual by itself. This is significant to performance, however ...
atmoky Ears is a one-stop solution for rendering hyper-realistic spatial audio experiences to headphones. It provides an unparalleled combination of perceptual quality and efficiency, whilst getting the best out of every spatial audio mix. atmoky Ears puts the listener first and offers patented perceptual optimization. For those who want to squeeze out the very last drop of performance from their ...
It consists of a few small stateless helpers that accept a WaapiClient as an argument, so they can be mixed in with vanilla waapi-client code. All functions follow a convention for getting properties: if a property doesn't exist, the returned value is None, plain and simple. I won't go into details here, as the examples ahead will do a better job of demonstrating what it looks like. Examples ...
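A minimal sketch of the helper convention described above: the helper takes the client as its first argument and returns None for a missing property. The helper name, the fake client, and the canned data are all hypothetical; only the WAAPI URI and argument shape follow the real `ak.wwise.core.object.get` call.

```python
def get_property(client, object_id, prop):
    """Stateless helper: fetch one property of a Wwise object over WAAPI,
    returning None if the object has no such property."""
    result = client.call("ak.wwise.core.object.get", {
        "from": {"id": [object_id]},
        "options": {"return": [prop]},
    })
    objects = (result or {}).get("return", [])
    if not objects:
        return None
    return objects[0].get(prop)  # None when the property is absent

class FakeClient:
    """Stand-in for waapi.WaapiClient, for demonstration only: serves
    one canned object and echoes back the requested properties."""
    def __init__(self, obj):
        self._obj = obj
    def call(self, uri, args):
        wanted = args["options"]["return"]
        return {"return": [{p: self._obj[p] for p in wanted if p in self._obj}]}
```

Because the helper is stateless, swapping `FakeClient` for a real connected `WaapiClient` is the only change needed to use it against a live Wwise session.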
Hoffman (Sound Designer, Insomniac Games) What is haptic feedback? How are haptics created? Is it possible to author, manage, and mix haptics within Wwise? Haptic feedback is an important feature in video games, and new technology is changing the way players feel and connect with gaming experiences. Sound designers Rodrigo Robinet and Tyler Hoffman will provide a high-level overview of haptic feedback ...
In our previous blog, Simulating dynamic and geometry-informed early reflections with Wwise Reflect in Unreal, we saw how to mix sound with the new Wwise Reflect plug-in using the Unreal integration and the Wwise Audio Lab sample game. In this blog, we will dive deeper into the implementation of the plug-in, how to use it with the spatial audio wrapper, and how it interacts with the 3D-bus architecture.
Like the first method, it relies on pre-composed material, but allows for control of the mix. For example, a user could choose between one of several bass lines, or elect to have a horn section provide an accompaniment. Again, a navigable tree structure in the background could control groups of tracks and lead to logical musical choices. To gain more interactivity, the third level calls for asynchronous ...
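The layer-selection idea above (a user picks between bass lines or toggles a horn section, with a background structure resolving those choices into tracks) can be sketched as a small lookup. All names and the data layout here are invented for illustration; the actual system would live in the music engine's own structures.

```python
def active_tracks(selection, groups):
    """Resolve a user's layer choices into the list of tracks to play:
    for each track group (e.g. 'bass', 'horns'), take the chosen
    variant, falling back to the group's default; a default of None
    means the group is silent unless explicitly enabled."""
    tracks = []
    for group, variants in groups.items():
        choice = selection.get(group, "default")
        if variants.get(choice):
            tracks.append(variants[choice])
    return tracks
```

A tree of such groups, with choices constrained by the current node, is one way to keep user-driven combinations musically sensible.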
For each player, voice chat mainly involves two audio stream linkages: the upstream linkage, where the local mic captures the player's own voice and distributes it to remote teammates through the server, and the downstream linkage, where the voices of all teammates are received from the server, mixed, and played back on the local device.

Upstream linkage: The player's local chat voice stream will be sent ...
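The downstream mixing step described above, where all teammates' voices are combined into one stream before playback, boils down to summing the streams sample by sample and clamping to the sample format's range. A minimal sketch, assuming 16-bit PCM frames of equal length (the function name is illustrative, not from any SDK):

```python
def mix_streams(frames):
    """Mix several 16-bit PCM frames (lists of samples, equal length)
    into one: sum each sample position across streams, then clamp the
    result to the signed 16-bit range to avoid wrap-around."""
    if not frames:
        return []
    mixed = []
    for samples in zip(*frames):
        s = sum(samples)
        mixed.append(max(-32768, min(32767, s)))
    return mixed
```

Production voice-chat engines typically add resampling, jitter buffering, and smarter clipping (e.g. soft limiting) on top of this basic sum.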
Setup
  Wwise Project Settings
  Configuring Unreal Integration
  Unreal Settings (Optional)
    Sound Attenuation Settings
    Sound Submix Settings
AudioLink - Playing the sound
  When specifying Sound Attenuation (Blueprint node)
  When specifying Sound Attenuation (Audio Component)
  When specifying Wwise AudioLink Settings
  Sound Submix
Conclusion

What is AudioLink? AudioLink is an Unreal Engine feature ...
This educational video contains supportive content for lesson 5 of the Wwise-101 Certification course. To follow along and access the complete course content, please visit: https://www.audiokinetic.com/courses/wwise101/

Topics:
  00:49 Understanding Property Offsets
  02:08 Understanding the Actor-Mixer / Master-Mixer Relationship
  04:51 Using Schematic View
  05:47 Using the Voice Profiler
States then it can choose to move through them by using a custom sequencer unique to that system, or by using a generic sequencer that switches between them randomly over the course of however long the system decides to run for. Beyond this top-level mechanic, what the systems actually look like in Wwise can be broken down as follows.

ACTOR-MIXER SYSTEMS

The relatively simple systems are the ones ...
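The generic sequencer described above, which switches between states randomly for however long the system runs, can be sketched as a schedule of (time, state) picks. This is an illustrative reconstruction, not the studio's actual implementation; names and the fixed switch interval are assumptions.

```python
import random

def random_state_sequence(states, run_time, switch_interval, rng=random):
    """Generic sequencer: pick a random state at each switch interval
    until the system's run time elapses. Returns the ordered list of
    (start_time, state) pairs the system would step through."""
    sequence = []
    t = 0.0
    while t < run_time:
        sequence.append((t, rng.choice(states)))
        t += switch_interval
    return sequence
```

Passing a seeded `random.Random` as `rng` makes a run reproducible, which is handy when debugging why a particular state order sounded wrong.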