
Beta 2021.1 - Suggestions/Questions: Object-based Audio Pipeline

+1 vote

Thanks for testing out the Wwise 2021.1 Beta! Here are some resources to get you started, along with questions we'd like your feedback on.

Object-based Audio

Suggestions: 

  • Be aware of the new authoring setting: Enable System Audio Object.
    • This determines whether Wwise Authoring will allocate System Audio Objects and has a direct impact on the associated game/engine's Wwise runtime (see the runtime sketch after this list).
  • Investigate the System Audio Device settings and meters.
  • Change the Windows Spatial Audio setting to Windows Sonic for Headphones.
  • Change your Audio > Main Mix Channel Configuration to 2.0 (Headphone Panning).
  • Audition different sounds routed to an Audio Bus, change the Bus Configurations (Main Mix, Passthrough, Audio Objects, Channel Configurations, Ambisonics), and notice the bus icon and metering changes.
    • Bonus: Can you reach the maximum number of System Audio Objects used? (as displayed in the Audio Device Editor)
  • Switch to the Audio Object Profiler Layout
    • Audition sounds and see if you can visualize them in the Audio Object 3D View.
    • Identify the Metadata associated with an Audio Object from the Audio Object List View.
    • Add an effect to an Audio Bus and toggle the Effect Stages in the Audio Object List View for Audio Objects routed to the Audio Bus.
  • Add Metadata for a sound that changes the Mix Behavior to "Mix to Passthrough" and see the changes reflected in the Audio Device Editor for the System Audio Device. 
  • Read information about the use of Object-based Audio across other media.
  • Open the Integration Demo Project /WwiseVersion/
    • Run the Integration Demo
    • Go to Positioning > 3D Audio Object and Spatialized Bed
    • Connect to the Integration Demo from Wwise
    • Open the System Audio Device
    • Open the Master Mixer Console
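
For anyone poking at this from the game/engine side as well, here is a minimal sketch of the runtime calls involved, using the standard Wwise SDK API. The game object ID, object name, and Event name below are made up for illustration, and the snippet assumes the sound engine and banks are already initialized (as the Integration Demo does). A sound posted this way, routed in the authoring tool to a bus whose configuration allows Audio Objects, is what you should then see in the Audio Object 3D View while profiling.

    #include <AK/SoundEngine/Common/AkSoundEngine.h>

    // Hypothetical ID and names, for illustration only.
    static const AkGameObjectID kWeaponObj = 100;

    void PlaySpatializedShot()
    {
        // Register a game object so its voices can carry 3D positioning.
        AK::SoundEngine::RegisterGameObj(kWeaponObj, "PlayerWeapon");

        // Position and orient the object; this is what drives the Audio
        // Object's placement in the Audio Object 3D View.
        AkSoundPosition pos;
        pos.SetPosition(2.0f, 0.0f, 5.0f);
        pos.SetOrientation(0.0f, 0.0f, 1.0f,   // front vector
                           0.0f, 1.0f, 0.0f);  // top vector
        AK::SoundEngine::SetPosition(kWeaponObj, pos);

        // Post an Event whose target bus allows Audio Objects.
        AK::SoundEngine::PostEvent("Play_Weapon_Shot", kWeaponObj);

        // In a real integration, RenderAudio() runs once per game frame.
        AK::SoundEngine::RenderAudio();
    }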

 

Questions: 

  • Were you able to route sound through the Master-Mixer Hierarchy to the Main Mix, Passthrough Mix, and Audio Objects, and see their respective metering in the Audio Device Editor for the System Audio Device?
  • After working with the Master-Mixer Hierarchy and the Audio Object List View, can you identify the different Bus Statuses at a glance?
  • What was the maximum number of System Audio Objects you were able to profile?
  • Were you able to visualize Audio Objects in the Audio Object 3D View?
  • Were you able to see the changes between Effect Stages in the Audio Object List View?
  • Were you able to add custom Metadata for a non-default Mix Behavior and see the changes reflected in the Audio Device Editor for the System Audio Device?
  • Did you read anything interesting about the use of Audio Objects in other media?

 

asked Dec 18, 2020 in Beta Feedback by Damian K. (420 points)

1 Answer

+5 votes
The new Bus Status visual feedback is a great addition! I remember that when I got started with Wwise I was confused about bus statuses, as there wasn't much feedback in the GUI to indicate what a bus was actually doing. There seems to be a small bug where, if you add or remove Metadata on a bus, its status and icon are not immediately updated - I entered a report for this.

In terms of Metadata, at first I found it a bit confusing that you can instance it on both Actor-Mixer Hierarchy containers and Master-Mixer Hierarchy busses, but now that I've played with it a bit more it makes sense, and it looks like a very flexible solution.

Initially I didn't understand what the "Use Default" Mix Behavior meant, but I've since noticed that the default behaviour is driven by the Positioning settings of the container. As the System Output Metadata and Positioning settings seem intertwined, I wonder if designers would find it easier if those settings were grouped under the same Positioning tab - ultimately, 3D sound settings are still related to the positional behaviour of a sound. I can imagine, though, that more fields will be available for customisation in the Metadata tab in the future, so there may be reasons that I haven't considered for how it's currently set up.

Thinking about the implementation of 3D audio in my work project, I believe it would be helpful if the Metadata Mix Behaviour were available to set via an RTPC. If, for example, we had an Actor-Mixer container for all FPS weapon sounds in the game, it would be great if we could allow the player's weapons to use Passthrough and all NPC/other players' weapons to use Audio Objects instead.
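
Just to illustrate what I mean: on the game side this would presumably end up being an ordinary game parameter call. The RTPC name and value mapping below are entirely hypothetical (Mix Behaviour is not actually exposed to RTPCs in the beta); only the SetRTPCValue API itself is the existing SDK call.

    #include <AK/SoundEngine/Common/AkSoundEngine.h>

    // Hypothetical RTPC: 0 = Passthrough for the local player's weapon,
    // 1 = Audio Objects for NPC/remote players' weapons.
    void SetWeaponMixBehaviour(AkGameObjectID weaponObj, bool isLocalPlayer)
    {
        AK::SoundEngine::SetRTPCValue("Weapon_Mix_Behaviour",
                                      isLocalPlayer ? 0.0f : 1.0f,
                                      weaponObj);
    }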

Ultimately, I think it'd be great if there were a certification course for Wwise and 3D audio: the topic is still very new, quite complex, and there isn't yet much documentation available. I believe that having a thorough course focused on it would be beneficial to a lot of people working in game audio.
answered Dec 20, 2020 by Francesco Del Pia (980 points)
...