Community Q&A

Welcome to Audiokinetic’s community-driven Q&A forum. This is the place where Wwise and Strata users help each other out. For direct help from our team, please use the Support Tickets page. To report a bug, use the Bug Report option in the Audiokinetic Launcher. (Note that Bug Reports submitted to the Q&A forum will be rejected. Using our dedicated Bug Report system ensures your report is seen by the right people and has the best chance of being fixed.)

To get the best answers quickly, follow these tips when posting a question:

  • Be Specific: What are you trying to achieve, or what specific issue are you running into?
  • Include Key Details: Include details like your Wwise and game engine versions, operating system, etc.
  • Explain What You've Tried: Let others know what troubleshooting steps you've already taken.
  • Focus on the Facts: Describe the technical facts of your issue. Focusing on the problem helps others find a solution quickly.

Hey there. After spending a while with the Unity integration, I'm noticing a recurring pattern. It seems to be designed so that every object we want to emit sound from needs to be configured beforehand in the Unity Editor with an AkGameObj component. For example, if I simply post an event to an object at runtime via code, there's no way that isStaticObject will be set to true on that object, so even stationary objects will be running a whole boatload of Update() calls.
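
To illustrate, here's roughly the pattern I mean; the event name is just a placeholder, and this is based on my reading of how the integration behaves rather than on its source:

```csharp
using UnityEngine;

public class RuntimeEmitter : MonoBehaviour
{
    private void Start()
    {
        // Post straight to this GameObject with no AkGameObj configured in
        // the Editor. As far as I can tell, the integration then adds an
        // AkGameObj with default settings, so isStaticObject stays false and
        // the component keeps pushing the object's position to Wwise every
        // frame, even though this object never moves.
        // "Play_Ambience" is a placeholder event name.
        AkSoundEngine.PostEvent("Play_Ambience", gameObject);
    }
}
```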

There are several other small examples where the design seems to prefer a front-heavy process in which every possible object that could emit sound is set up beforehand. My question is: is this a false read on my part? What is the expected design pipeline? Is there a document that I haven't read? More importantly, is there an expected setup whereby designers like myself can simply post an event on an object without needing to do any setup first?
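
For comparison, the closest thing I've found to avoiding the per-object Editor setup is adding and configuring the component myself right before posting, something like the sketch below. I'm not sure this is the intended pattern, or whether setting the flag after AddComponent even takes effect before the component's own Awake/OnEnable have run, so please treat it as a sketch:

```csharp
using UnityEngine;

public class StaticRuntimeEmitter : MonoBehaviour
{
    private void Start()
    {
        // Make sure an AkGameObj exists and is flagged as static so it can
        // skip per-frame position updates for an object that never moves.
        var akObj = GetComponent<AkGameObj>();
        if (akObj == null)
        {
            akObj = gameObject.AddComponent<AkGameObj>();
        }
        akObj.isStaticObject = true; // may be too late if Awake has already run

        // Placeholder event name.
        AkSoundEngine.PostEvent("Play_Ambience", gameObject);
    }
}
```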

Many thanks!
in General Discussion by John W. (110 points)


...