Hi,
I have experience writing VST plugins for DAWs, and I am learning to write plugins for Wwise.
I have written a VST spatializer plugin that renders mono and ambisonic (B-format) audio to binaural, with a scene rotator that can be controlled via a head tracker.
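For context, the scene-rotation part of my plugin boils down to math like the following sketch (first-order FuMa B-format, yaw only; the struct and function names are just for illustration, not from my actual codebase):

```cpp
#include <cmath>

// One frame of first-order B-format audio (FuMa ordering: W, X, Y, Z).
struct BFormatSample {
    float w, x, y, z;
};

// Rotate the sound field by 'yaw' radians (counterclockwise seen from
// above). The omnidirectional W and the vertical Z components are
// invariant under yaw; X/Y get a standard 2x2 rotation.
BFormatSample RotateYaw(const BFormatSample& in, float yaw)
{
    const float c = std::cos(yaw);
    const float s = std::sin(yaw);
    BFormatSample out;
    out.w = in.w;                 // omnidirectional: unchanged
    out.x = in.x * c - in.y * s;  // front/back axis
    out.y = in.x * s + in.y * c;  // left/right axis
    out.z = in.z;                 // up/down: unchanged by yaw
    return out;
}
```

A source encoded dead ahead (X = 1, Y = 0) rotated by 90 degrees ends up on the left (X ≈ 0, Y ≈ 1), which is how the head-tracker input drives the rotator.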
I would like to adapt this code into a Wwise plugin that runs on the Oculus Quest 2.
I have noticed in the IAkPlugin header that there are virtual interfaces for spatial audio-related processing, such as `virtual AKRESULT ComputeSpeakerVolumesDirect()` and `virtual AKRESULT Compute3DPositioning()`, so there are clearly interfaces in place for writing plugins that perform spatial rendering.
What I would like to know is which interfaces I need to implement, and which settings/macros I need to set, in order to start performing ambisonic rendering.
Can anyone please tell me what I need, or point me to the appropriate documentation?
Thanks,
Simon