It combines analogue recordings with digital mixing technologies and is therefore a classic example of controlling frequency response and loudness. The left radar shows the loudness when the sample is played back on a cellphone. In the result processed for mobile platforms, the average loudness (LKFS) increased by about 3 dB, while the LRA did not change. Let's take a look at ...
I mixed the stems in 7.1 with IRCAM’s Spat (each act happens in a different corner of the auditorium). The sound design and music compositions were then granulized and exported into smaller segments and layers. Finally, the real-time designs were made in Wwise and played back with a Soundcaster session. I would say that one third of the time was spent on gathering field recordings, working with ...
Welcome back to our Impacter plug-in blog series. In the previous two blogs we mostly covered the physical parameter aspects of the plug-in and how they can integrate well with your game’s physics system. In this blog we discuss the other aspect of Impacter: its capacity for cross synthesis. As we saw briefly in the first Impacter blog, the possibility of mixing and matching the “impact” and “body” ...
Senua’s world to life, crafting an immersive and unforgettable soundscape. From real-time vocal effects to interactive ambience and responsive mixing, the team walks us through their process of creating dynamic audio for this cinematic, narratively driven title. They'll outline the design philosophy behind each decision and highlight their workflows, from conception, to Wwise, to Unreal Engine ...
Happy June! Inside this month's highlights, you'll find everything we published recently, including deep dives into engine sound modelling, Wwise tactics for combat audio mixing, and the explosive soundscape of Helldivers 2! Let's get started. Blog Highlights: ReadSpeaker and Audiokinetic Introduce speechEngine for Wwise: Runtime On-Device Text to Speech, by ReadSpeaker. speechEngine for Wwise is a cross-platform ...
In the same spirit, they showcase a series of Blend Containers in Wwise with many layers and describe the benefits of keeping them in Wwise for easy remixing and fine tuning, while remaining connected with the game.
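A typical way the game stays connected to such Blend Containers is through a Game Parameter (RTPC) that crossfades the container's layers while the actual remixing and fine tuning stay in Wwise. The event name, RTPC name, and game object ID below are illustrative assumptions rather than values from the article; a minimal sketch:

    #include <AK/SoundEngine/Common/AkSoundEngine.h>

    // Illustrative IDs and names only; replace with the project's actual
    // event, Game Parameter, and game object.
    static const AkGameObjectID kEngineObj = 100;

    void StartEngineLayers()
    {
        // Register the emitter and start the Blend Container's event.
        AK::SoundEngine::RegisterGameObj(kEngineObj, "EngineEmitter");
        AK::SoundEngine::PostEvent("Play_Engine", kEngineObj);
    }

    void UpdateEngineMix(float normalizedRpm)
    {
        // The Game Parameter drives the crossfade between layers;
        // the game only feeds the value, the mix lives in Wwise.
        AK::SoundEngine::SetRTPCValue("Engine_RPM", normalizedRpm * 100.f, kEngineObj);
    }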
Because we have a sound, and not a piece of music, right-click inside the Actor-Mixer Hierarchy and choose 'Import Audio Files...'. Then add your audio file and make sure it is creating a new 'Sound SFX', not a 'Sound Voice'. Let's have a listen. Notice that it takes a bit of time for the sound to start. But this sound should be played instantly on collision. So head into the Source Editor ...
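On the game side, a sound like this is typically triggered by posting its event from the collision callback. The event name 'Play_Impact' and the emitter ID below are illustrative assumptions, a minimal sketch rather than the course's own code:

    #include <AK/SoundEngine/Common/AkSoundEngine.h>

    // Illustrative only: post the impact event the moment physics reports a hit.
    void OnCollision(AkGameObjectID emitterId)
    {
        // Assumes the game object is registered and the bank is loaded;
        // with the source trimmed, the Sound SFX starts with no audible delay.
        AK::SoundEngine::PostEvent("Play_Impact", emitterId);
    }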
This educational video contains supporting content for lesson 4 of the Wwise-101 Certification course. To follow along and access the complete course content, please visit: https://www.audiokinetic.com/courses/wwise101/ Topics:
  • 01:17 Using an Actor-Mixer to Pan Multiple Objects
  • 02:33 Using Balance-Fade Positioning
Kastbauer
  • 00:22:48 Auto-Defined SoundBanks, Michael Cooper
  • 00:37:26 Reflect Simplified, Thalie Keklikian
  • 00:47:31 3D Audio Bed Mixer, Philippe Milot
  • 00:55:46 Future Livestream Schedule, Damian Kastbauer
  • 00:59:22 Introduction Head of Product, Simon Ashby
  • 01:03:33 Strata Product Video
  • 01:04:38 Strata Overview, Simon Ashby
  • 01:19:29 Strata Production Partner Introduction, Simon Ashby
  • 01:22:21 Strata - ...
Declaration of callback prototypes

    #ifndef _AK_CALLBACK_H_
    #define _AK_CALLBACK_H_

    #include <AK/SoundEngine/Common/AkCommonDefs.h>
    #include <AK/SoundEngine/Common/AkMidiTypes.h>

    namespace AK
    {
        class IAkGlobalPluginContext;
        class IAkMixerInputContext;
        class IAkMixerPluginContext;
    }

    /// Type of callback. Used ...
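To show where these declarations come into play, here is a small, hedged sketch of requesting an end-of-event callback when posting an event; the event name and game object ID are illustrative assumptions, not part of the header above:

    #include <AK/SoundEngine/Common/AkSoundEngine.h>
    #include <AK/SoundEngine/Common/AkCallback.h>

    // Called by the sound engine when the posted event finishes playing.
    static void OnEventDone(AkCallbackType in_eType, AkCallbackInfo* in_pInfo)
    {
        if (in_eType == AK_EndOfEvent)
        {
            // The info struct can be cast to AkEventCallbackInfo for the playing ID.
            AkEventCallbackInfo* pEventInfo = static_cast<AkEventCallbackInfo*>(in_pInfo);
            (void)pEventInfo; // e.g. release gameplay resources tied to this playback
        }
    }

    void PlayWithCallback(AkGameObjectID emitterId)
    {
        // Illustrative event name; AK_EndOfEvent requests the callback above.
        AK::SoundEngine::PostEvent("Play_Music", emitterId, AK_EndOfEvent, &OnEventDone, nullptr);
    }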
App audio will be interrupted if another audio app activates its audio session in the foreground. App audio will be interrupted if another audio app activates its audio session in the background under a mixable or non-mixable category. This category is also recommended for game apps which use audio that should not be mixed with other audio apps running in the background. For example ...
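On iOS this behaviour is selected through the audio session settings passed at sound engine initialization. A minimal sketch, assuming the solo-ambient category described above; the field and enum names follow the iOS-specific platform settings and should be verified against your SDK version:

    #include <AK/SoundEngine/Common/AkSoundEngine.h>

    // Sketch of iOS-specific init settings; assumes the iOS platform header
    // exposes audioSession.eCategory as in recent SDK versions.
    void ConfigureAudioSession(AkPlatformInitSettings& platformSettings)
    {
        // Solo-ambient: app audio should not be mixed with other audio apps,
        // and is interrupted when another session activates (see above).
        platformSettings.audioSession.eCategory = AkAudioSessionCategorySoloAmbient;
    }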
The user interface has also been streamlined to show Randomizer, Link/Unlink, and RTPC icons in mixed states (before vs. Wwise 2016.2.0 screenshots). Play Source in Authoring: Play Source is a playback option that bypasses all hierarchy settings like volume, RTPC, States, and Effects. This is quite practical when you need to compare the audio source imported into Wwise with the one affected by all the hierarchy's ...
After a few years of studying electroacoustic music at the University of Montreal, I couldn’t help but wonder about what I would do with my degree. How would these teachings apply outside of academia? How could Pierre Schaeffer’s Étude aux chemins de fer have anything to do with foley? How could Åke Parmerud’s Grain of Voices be related in any way with mixing and editing in the “real world”? Or how ...
Note that all emitters sent to a room reverb are downmixed to mono by said reverb, thus losing their individual directionality [27]. Now why would you rotate the sound field outputted by a reverb? Most artificial reverberators will, by design, produce an isotropic signal (that is, energy arriving equally from all directions), so if you rotate them you should not hear any difference. This is the case ...
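As a concrete illustration of what rotating the reverb's sound field means, here is a small, hedged sketch of a yaw rotation applied to one first-order ambisonic (B-format) sample; the channel ordering and normalization are assumptions, and for a truly isotropic reverb output this operation leaves the perceived result unchanged:

    #include <cmath>

    // First-order ambisonic sample (assumed W/X/Y/Z ordering).
    struct FoaSample { float w, x, y, z; };

    // Rotate the horizontal sound field by 'yaw' radians.
    // W (omni) and Z (height) are unaffected by a pure yaw rotation;
    // only the X/Y figure-of-eight components are mixed together.
    FoaSample RotateYaw(const FoaSample& in, float yaw)
    {
        const float c = std::cos(yaw);
        const float s = std::sin(yaw);
        return { in.w,
                 c * in.x - s * in.y,
                 s * in.x + c * in.y,
                 in.z };
    }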
The music then moves along to Composition and Virtual Orchestration, leading into Orchestration/Conducting and Recording/Mixing. This is where the integration process begins using the Wwise Editor and Game Editor to then fully integrate the Wwise project into the game. As we walk the path from sound direction to game integration, we see a high percentage of professional participation early on in that ...
Duplicating a track, applying a 6 dB attenuation to each copy, and then distributing the copies across different threat levels has proven to be an effective solution in some cases, particularly for Trapanese remixes with less dense orchestration, where spectral division alone was not enough. Posture attacks: In Shadows, players have the ability to hold attacks, causing characters to enter a stance that ...
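The threat levels that the copies are distributed across would typically be driven from the game through a State or similar game sync. The State Group and State names below are illustrative assumptions, not taken from the article:

    #include <AK/SoundEngine/Common/AkSoundEngine.h>

    // Illustrative names: a "ThreatLevel" State Group, with per-state volume
    // offsets deciding which duplicated, attenuated copies are audible.
    void SetCombatThreat(bool highThreat)
    {
        AK::SoundEngine::SetState("ThreatLevel", highThreat ? "High" : "Low");
    }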
We're excited to announce that the Meta XR Audio SDK Plug-in is now live in Wwise! This powerful plug-in delivers the core audio features required to create immersive experiences in mixed reality for Meta Quest headsets and other devices. With the Meta XR Audio SDK, you have access to the foundational components of spatial audio and acoustics simulation, enabling your apps to provide an immersive ...
Michael La Manna
Creators Directory
Michael has over 14 years of experience in the music and tech industry, from composing for film, television, and game production to starting Jacksonville, Florida's first independent game studio. Michael began his career in the late nineties as the founder of the electronic group Sonic State. He released remixes for artists such as Limp Bizkit, Orgy and Godhead, and was propelled into the underground ...
Murray Daigle
Creators Directory
Music and Audio producer/engineer/mixer.