
How do I get spatialization data into my custom Wwise plugin?

0 votes
Hi,

I have experience writing VST plugins for DAWs, and I am learning to write plugins for Wwise.
I have written a VST spatializer plugin that renders mono and ambisonic (B-format) audio to binaural, with a scene rotator that can be controlled via a head tracker.
I would like to adapt the code into a Wwise plugin that runs on the Oculus Quest 2.

I have noticed in the IAkPlugin header that there are virtual interfaces for spatial audio-related processing, such as `virtual AKRESULT ComputeSpeakerVolumesDirect()` and `virtual AKRESULT Compute3DPositioning()`, so there are clearly interfaces in place for writing plugins that perform spatial rendering operations.
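For context, the heart of the scene rotator I'd be porting is essentially a first-order B-format yaw rotation. Here is a minimal, self-contained sketch (plain C++, not using any Wwise or VST types; channel ordering W/X/Y/Z and the sign convention are my own assumptions):

```cpp
#include <cmath>
#include <cstddef>

// First-order B-format frame: W (omni), X (front), Y (left), Z (up).
struct BFormatSample { float w, x, y, z; };

// Rotate the sound field about the vertical (Z) axis by yawRad.
// Convention (an assumption here): positive yaw turns the listener's head
// to the left, so the scene counter-rotates to the right.
void RotateYaw(BFormatSample* buf, std::size_t numSamples, float yawRad)
{
    const float c = std::cos(yawRad);
    const float s = std::sin(yawRad);
    for (std::size_t i = 0; i < numSamples; ++i)
    {
        const float x = buf[i].x;
        const float y = buf[i].y;
        buf[i].x = c * x + s * y;  // W and Z are invariant under yaw rotation
        buf[i].y = -s * x + c * y;
    }
}
```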

What I would like to know is which interfaces I need to implement and which settings/macros I need to set in order to start performing ambisonic rendering.
Can anyone please tell me what I need, or point me to the appropriate documentation?

Thanks,
Simon
asked May 13, 2022 in General Discussion by Simon D. (100 points)

1 Answer

0 votes
Hi Simon,

Have a look at the documentation for Object Processors, which are a superset of effects that are aware of audio objects. You can implement your binauralizer as an Out-of-Place Object Processor that aggregates all of its incoming object streams into a single stereo output object stream.

Software Binauralizer example in the documentation: https://www.audiokinetic.com/library/edge/?source=SDK&id=soundengine_plugins_objectprocessor.html#soundengine_plugins_objectprocessor_outofplace_binauralizer.
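To illustrate the aggregation step in plain C++ (these types and names are illustrative only, not the actual Wwise plugin interfaces; a constant-power pan stands in for a real HRTF-based binauralizer):

```cpp
#include <algorithm>
#include <cmath>
#include <cstddef>
#include <vector>

// One incoming audio object: a mono buffer plus an azimuth in radians
// (0 = front, positive = left). Illustrative stand-in for SDK types.
struct InputObject {
    std::vector<float> samples;
    float azimuth;
};

// Downmix every input object into one interleaved stereo buffer,
// the way an Out-of-Place Object Processor would fold its incoming
// object streams into a single stereo output object.
std::vector<float> MixToStereo(const std::vector<InputObject>& inputs,
                               std::size_t numFrames)
{
    std::vector<float> out(numFrames * 2, 0.0f);
    for (const InputObject& obj : inputs)
    {
        // Map azimuth to a pan position in [0, 1] (1 = hard left),
        // then apply a constant-power pan law.
        float pan = 0.5f + obj.azimuth / 3.14159265f;
        pan = std::fmin(1.0f, std::fmax(0.0f, pan));
        const float gainL = std::sin(pan * 1.57079633f);
        const float gainR = std::cos(pan * 1.57079633f);
        const std::size_t n = std::min(numFrames, obj.samples.size());
        for (std::size_t i = 0; i < n; ++i)
        {
            out[2 * i]     += gainL * obj.samples[i];
            out[2 * i + 1] += gainR * obj.samples[i];
        }
    }
    return out;
}
```

In the real plugin, the per-object positions would come from the object metadata the sound engine passes to your processor, as described in the documentation linked above.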
answered May 17, 2022 by Samuel L. (Audiokinetic) (23,220 points)
That looks like exactly the documentation I need at this stage, thank you so much!
...