Intermediate spatial representation for the purpose of binauralization. VR games need to render a binaural mix for headphones, produced by passing sounds through position-dependent filters called Head-Related Transfer Functions (HRTFs). These filters model the interaction of sounds with the head, and can succeed in giving the impression that the sounds of the game are actually coming from outside the player's ...
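To make the idea above concrete, here is a minimal, illustrative C++ sketch of binauralization: a mono source is convolved with the pair of head-related impulse responses (HRIRs) measured for its direction, yielding a left-ear and a right-ear signal. The types and the time-domain convolution are assumptions for illustration; a real renderer would select or interpolate HRIRs from an HRTF dataset and typically convolve in the frequency domain, and in Wwise this work is delegated to a 3D audio / binauralizer plug-in.

```cpp
// Illustrative sketch only: binauralize a mono source by convolving it with
// the HRIRs for the source's direction. The HRIR data is a placeholder; a
// real system would pick or interpolate it from an HRTF dataset.
#include <cstddef>
#include <vector>

struct Hrir
{
    std::vector<float> left;   // impulse response to the left ear
    std::vector<float> right;  // impulse response to the right ear
};

struct StereoBuffer
{
    std::vector<float> left;
    std::vector<float> right;
};

// Naive time-domain convolution of a dry mono signal with one ear's HRIR.
static std::vector<float> Convolve(const std::vector<float>& dry,
                                   const std::vector<float>& ir)
{
    if (dry.empty() || ir.empty())
        return {};
    std::vector<float> out(dry.size() + ir.size() - 1, 0.0f);
    for (std::size_t n = 0; n < dry.size(); ++n)
        for (std::size_t k = 0; k < ir.size(); ++k)
            out[n + k] += dry[n] * ir[k];
    return out;
}

// Produce a binaural (two-channel) version of a positioned mono source.
StereoBuffer Binauralize(const std::vector<float>& monoSource, const Hrir& hrir)
{
    return { Convolve(monoSource, hrir.left), Convolve(monoSource, hrir.right) };
}
```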
In NMS, how do music and sound effects interact with each other? What was your approach to mixing the two, and do you have any recommendations on how to mix music and SFX dynamically? PW: I always mix as I go; the mix process wasn’t as difficult as you might expect, and as a PS4 title, we’re mixed to the EBU R128 standard. Whilst there’s a lot of randomisation in the game, I always know the upper ...
The presentation concluded with an amazing set of clips comparing the exact same gameplay sequence before and after their dynamic mixing system. These clips are simply fantastic, as they show exactly how much more comfortable and enjoyable gameplay becomes once you hear only what's really important at any given moment.
Paweł presents the initial goals for ambiance in the game: enough content to fill this huge open world, sound variations to avoid repetition, mechanisms that mimic the real world, on-the-fly mix control for storytelling purposes, and straightforward implementation. Paweł then shows how their global weather system has been implemented in Wwise using Blend Containers reacting to wind intensity parameters ...
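As a rough illustration of how such a weather system is fed from the game side, the sketch below updates a wind-intensity Game Parameter (RTPC) through the Wwise SDK so that a Blend Container can react to it. The RTPC name "Wind_Intensity" and the game object ID are placeholders, not the names actually used in the project described above.

```cpp
// Minimal sketch: drive a Blend Container from game code by updating an RTPC.
// "Wind_Intensity" and the emitter ID are placeholder names for illustration.
#include <AK/SoundEngine/Common/AkSoundEngine.h>

static const AkGameObjectID kAmbienceEmitter = 100; // registered elsewhere

void UpdateWindAmbience(float windIntensity01)
{
    // The Blend Container authored in Wwise crossfades its children based on
    // the value of this Game Parameter.
    AK::SoundEngine::SetRTPCValue("Wind_Intensity", windIntensity01, kAmbienceEmitter);
}
```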
Fun with Feedback
Blog
Wwise project so that you can follow along, but it's not a problem if you do not. Feedback in the Mixing Hierarchy: If you create an aux send from a bus to another bus that is lower down in the hierarchy, it is possible to create feedback loops in Wwise. Wwise mixes audio from the leaves of the bus hierarchy and works its way up to the master audio bus.
Tutorial video in Audiokinetic Wwise showing how to create Interactive Music Stingers. To add more feedback to your interactive music, you can play stingers at key points in the game action. Stingers are brief musical phrases that are superimposed and mixed over the currently playing music. To play a stinger, the game calls a trigger that is associated with a stinger music segment.
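A minimal sketch of that game-side call, assuming a Trigger named "Player_Headshot" has been associated with a stinger segment in the authoring tool (the trigger name and game object are placeholders):

```cpp
// Minimal sketch: ask Wwise to play a stinger by posting the Trigger that the
// stinger's music segment is associated with. Names here are placeholders.
#include <AK/SoundEngine/Common/AkSoundEngine.h>

void OnHeadshot(AkGameObjectID musicGameObject)
{
    AK::SoundEngine::PostTrigger("Player_Headshot", musicGameObject);
}
```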
Music Design & “Project Hiraeth”: Why I Spent a Year Making a Score for a Game that Doesn’t Exist
Blog
Project Hiraeth is the world’s first video game score license to be fully adaptive, professionally mixed and mastered, and recorded with a live orchestra. It features over ninety minutes of original music, responds naturally to player choices, and comes with all the scripting needed to fully integrate into a game in under a week. This blog post explores why, and how, it was created. A GOOD MORNING ...
Topics included: setting expectations when coming onto a new project as a freelance composer, whether implementation is a necessary skill for composers today, thoughts on specialization and getting pigeonholed into that specialty, how to hang on to certain rights, thoughts on mixing sound design and composition work in your portfolio and rescoring existing games in your reels, and overall advice for ...
Wwise, Interactive Music Hierarchy 1:09:49 - In-game sequence with music while muting engine sounds in authoring 1:15:05 - Mixing 1:16:40 - Object-based audio 1:24:04 - Challenges with Event-based packaging 1:29:24 - Thank you Paolo! 1:29:50 - Shoutouts to Paolo's team Links for the Livestream: What's New Beta: https://blog.audiokinetic.com/wwise2022.1-whats-new/ Migration Guide: https://www.audiokinetic.
Router to target specific hardware channels 1:01:20 - Musical Examples from the "Mountains" environment 1:05:40 - Sculptural elements and mixing considerations 1:13:50 - Flexibility and adaptation at the event in the space 1:22:43 - Musical Examples from the "Games" environment 1:29:17 - Summary of the project 1:34:04 - Outro Audiokinetic: https://hubs.ly/H0C1vhJ0?. Audiokinetic Blog: https://hubs.
Profiling 01:03:17 Wwise 2023.1 - CPU Timeline 01:04:39 Wwise 2024.1 - Memory Allocator 01:05:45 Profiling - Metering, Mastering, Dynamic Mixing 01:06:08 Profiling - Audio Objects, Game Objects, and Acoustics 01:06:35 Focused and Iterative Realtime Workflow 01:06:49 Increased Visibility and Focused Audio 01:06:56 Summer of Beta - Redux 01:07:52 Wwise Free for Indies 01:09:10 Wwise Ecosystem 01:10:02 ...
From PC to Mobile: Challenges & Breakthroughs in the Audio Development of Naraka: Bladepoint Mobile
Videos
Shanghai 2024 covers three key aspects of audio porting for mobile: adapting the game’s audio for mobile devices, implementing real-time spatial audio solutions, and designing Dolby Atmos mixing strategies. Watch as they share technical approaches, solutions, and results in Naraka: Bladepoint Mobile. 00:00:00 Project introduction 00:02:12 Mobile audio porting 00:10:49 Spatial audio solution for mobile ...
All of its sounds are in the Weapon Actor-Mixer shown in the image below. The Reflect plug-in is meant to be inserted on an Auxiliary Bus. For the weapon's sounds to be routed to the aux bus with Reflect applied, we need to enable Use game-defined auxiliary sends in the General Settings of the Property Editor. Then, we need to add an aux bus with the Reflect plug-in. In our example, the aux bus is ...
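With "Use game-defined auxiliary sends" enabled, the game still has to tell the sound engine which Auxiliary Bus each emitter sends to and at what level. The sketch below shows one way to do this with the Wwise SDK; the bus name "Reflect_Aux" and the game object IDs are placeholders, and note that many Reflect setups instead drive these sends through the Spatial Audio API.

```cpp
// Minimal sketch: route an emitter's signal to an Auxiliary Bus hosting the
// Reflect plug-in via a game-defined auxiliary send. The bus name and game
// object IDs are placeholders for illustration.
#include <AK/SoundEngine/Common/AkSoundEngine.h>

void EnableWeaponReflections(AkGameObjectID weaponEmitter, AkGameObjectID listener)
{
    AkAuxSendValue send;
    send.listenerID    = listener;                                      // listener that hears the send
    send.auxBusID      = AK::SoundEngine::GetIDFromString("Reflect_Aux"); // aux bus with Reflect inserted
    send.fControlValue = 1.0f;                                           // full send level

    AK::SoundEngine::SetGameObjectAuxSendValues(weaponEmitter, &send, 1);
}
```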
Ambisonics: Since version 2016.1, Wwise has supported ambisonic channel formats for busses. Wwise takes care of encoding (mixing a non-ambisonic signal to an ambisonic bus) and decoding (mixing an ambisonic signal to a non-ambisonic bus). This allows users to mix multiple sounds in the environment to an ambisonics bus, and then convert that ambisonics bus to binaural using a 3D audio plug-in such as ...
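For intuition about what the encoding step amounts to, here is an illustrative sketch of first-order ambisonic panning (AmbiX channel order, SN3D normalization): a mono sample is distributed over the W/Y/Z/X channels with gains derived from its direction. This is not Wwise's internal code, just the standard first-order encoding equations.

```cpp
// Illustrative sketch of first-order ambisonic encoding (AmbiX order W, Y, Z, X
// with SN3D normalization). Wwise performs the equivalent mixing internally
// when a sound is routed to an ambisonic bus.
#include <array>
#include <cmath>

// azimuthRad: 0 = front, positive to the left; elevationRad: positive up.
std::array<float, 4> EncodeFirstOrder(float monoSample, float azimuthRad, float elevationRad)
{
    const float cosEl = std::cos(elevationRad);
    return {
        monoSample,                                 // W (omnidirectional)
        monoSample * std::sin(azimuthRad) * cosEl,  // Y (left-right)
        monoSample * std::sin(elevationRad),        // Z (up-down)
        monoSample * std::cos(azimuthRad) * cosEl   // X (front-back)
    };
}
```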
About the Example Project: Once you have the example project open in Wwise 2019.2, take a quick look around to see what Actor-Mixer objects and RTPC game syncs are included. You will see that notes are included with almost every object to document a specific function or a related feature. In the Project, the key engine parameters which we looked at in Part 1 are as follows: 1. RTPC_Player_Example ...
Being able to mix a game that requires the music to slap and the sound effects to always be audible. 3. Syncing the beat. Transitions and effects. Josh Sullivan: I was in charge of all the sound for BPM. My process for creating/mixing sound is quite a strange one. I have more than a decade of experience in video production, so when it came to using a program to mix and mash the sounds that would go ...
What is the expected behaviour when combining auto-defined SoundBanks with a custom IAkLowLevelIOHook? Thank you!
They will sound louder, and sometimes much louder, at the same volume level if mixed with string instruments (whose attack is usually above 50ms). Another example is a gunshot, which has an attack shorter than 10ms; it will sound even louder than the percussion. So, loudness problems can sometimes be fixed just by tweaking the asset's attack. Duration and Reverberation: The duration of a sound ...
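As a purely illustrative example of "tweaking the attack", the sketch below applies a short linear fade-in to the start of an asset's sample buffer. In practice this would usually be done in a DAW or in the authoring tool rather than in code, and the function and parameter names are made up for the example.

```cpp
// Illustrative only: lengthen an asset's attack with a linear fade-in over the
// first few milliseconds, which can lower the perceived loudness of
// transient-heavy sounds without touching their volume settings.
#include <algorithm>
#include <cstddef>
#include <vector>

void ApplyAttack(std::vector<float>& samples, float sampleRate, float attackMs)
{
    const std::size_t attackSamples =
        std::min(samples.size(), static_cast<std::size_t>(sampleRate * attackMs / 1000.0f));
    for (std::size_t i = 0; i < attackSamples; ++i)
        samples[i] *= static_cast<float>(i) / static_cast<float>(attackSamples);
}
```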
PopCap Games composer Guy Whitmore discusses the advantages of scoring your game, within the game
Blog
A common practice for games composers today is to compose and arrange fully mixed music cues in Pro Tools or Logic, then have an implementation specialist drop those files into the game. Music integration, in this case, is seen as a basic technical task. But to score a game with greater nuance, the composer would want to see the title in action while composing and arranging, working in a digital audio ...