Community Q&A

Welcome to Audiokinetic’s community-driven Q&A forum. This is the place where Wwise and Strata users help each other out. For direct help from our team, please use the Support Tickets page. To report a bug, use the Bug Report option in the Audiokinetic Launcher. (Note that Bug Reports submitted to the Q&A forum will be rejected. Using our dedicated Bug Report system ensures your report is seen by the right people and has the best chance of being fixed.)

To get the best answers quickly, follow these tips when posting a question:

  • Be Specific: What are you trying to achieve, or what specific issue are you running into?
  • Include Key Details: Include details like your Wwise and game engine versions, operating system, etc.
  • Explain What You've Tried: Let others know what troubleshooting steps you've already taken.
  • Focus on the Facts: Describe the technical facts of your issue. Focusing on the problem helps others find a solution quickly.

0 votes
Hi,

I have experience writing VST plugins for DAWs, and I am learning to write plugins for Wwise.
I have written a VST spatializer plugin that renders mono and Ambisonic (B-format) audio to binaural output, with a scene rotator that can be controlled via a head tracker.
I would like to adapt this code into a Wwise plugin that runs on the Oculus Quest 2.
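For context, the core of a first-order B-format scene rotation is just a 2D rotation of the horizontal channels. This is a minimal, SDK-agnostic sketch (FuMa-style W/X/Y/Z channel naming assumed; the struct and function names are illustrative, not Wwise types):

```cpp
#include <cmath>

// One sample of a first-order Ambisonic (B-format) signal.
// W is omnidirectional; X/Y/Z carry the directional components.
struct BFormatSample { float w, x, y, z; };

// Rotate the sound scene by yawRad radians (counterclockwise).
// Under a pure yaw rotation only X and Y change.
BFormatSample RotateYaw(const BFormatSample& in, float yawRad)
{
    const float c = std::cos(yawRad);
    const float s = std::sin(yawRad);
    return {
        in.w,                 // W unchanged
        in.x * c - in.y * s,  // X' = X cos(yaw) - Y sin(yaw)
        in.x * s + in.y * c,  // Y' = X sin(yaw) + Y cos(yaw)
        in.z                  // Z unchanged under yaw
    };
}
```

In a Wwise port, a rotation like this would run per sample inside the plugin's processing callback, presumably driven by head-tracker data fed in as an RTPC or via the listener transform.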

I have noticed that the IAkPlugin header declares virtual interfaces for spatial audio-related processing, such as `virtual AKRESULT ComputeSpeakerVolumesDirect()` and `virtual AKRESULT Compute3DPositioning()`, so there are clearly interfaces in place for plugins that perform spatial rendering.

What I would like to know is which interfaces I need to implement and which settings and macros I need to set in order to start performing Ambisonic rendering.
Can anyone please tell me what I need, or point me to the appropriate documentation?

Thanks,
Simon
in General Discussion by Simon D. (100 points)

1 Answer

0 votes
Hi Simon,

Have a look at the documentation for Object Processors, a superset of Effects that are aware of Audio Objects. You could implement your binauralizer as an Out-of-Place Object Processor that aggregates all of its incoming object streams into a single stereo output object.

Software Binauralizer example in the documentation: https://www.audiokinetic.com/library/edge/?source=SDK&id=soundengine_plugins_objectprocessor.html#soundengine_plugins_objectprocessor_outofplace_binauralizer.
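To illustrate the aggregation step (independently of the actual Wwise Object Processor API, which the linked page documents), here is an SDK-agnostic sketch that sums several already-binauralized per-object stereo buffers into one stereo output. The types here are plain C++ stand-ins, not Wwise SDK types:

```cpp
#include <cstddef>
#include <vector>

// Stand-in for one audio object's stereo buffer. In a real
// Out-of-Place Object Processor, the SDK's object and buffer
// types would be used instead, inside the Execute() callback.
struct StereoBuffer {
    std::vector<float> left, right;
};

// Mix all input objects into a single stereo output object.
StereoBuffer AggregateToStereo(const std::vector<StereoBuffer>& inputs,
                               std::size_t numFrames)
{
    StereoBuffer out{std::vector<float>(numFrames, 0.0f),
                     std::vector<float>(numFrames, 0.0f)};
    for (const StereoBuffer& obj : inputs) {
        for (std::size_t i = 0; i < numFrames; ++i) {
            out.left[i]  += obj.left[i];
            out.right[i] += obj.right[i];
        }
    }
    return out;
}
```

The binauralization itself (HRTF convolution or Ambisonic decode) would happen per object before this sum, which is what makes the plugin "object-aware" rather than a plain stereo Effect.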
by Samuel L. (Audiokinetic) (23.6k points)
That looks like exactly the documentation I need at this stage, thank you so much!
...