The Audiokinetic Community Q&A is a forum where Wwise and Strata users can ask questions and answer those of other members of the community. If you would like an answer from Audiokinetic's Technical Support team, make sure to use the Support Tickets form.

0 votes
I need to play a sound in stereo when the listener (the player) and the sound emitter are both in the same area (for example, in one room), and in mono when the emitter and listener are in different areas separated by geometry (for example, when the listener hears the sound occluded by a wall). Is there any way to do this automatically for all occluded sounds?
I'm using Unreal Engine 5.1 and Wwise 2022.1.2.8150.
in General Discussion by Anton Z. (150 points)

1 Answer

–1 vote
 
Best answer
Hey Anton.

The quick answer is no.

When you convert your original sounds into game-ready sound files, the Conversion ShareSet defines the channel configuration, audio codec, and so on. If you've set it to stereo (or it's already a stereo file and you set the conversion to "As Input"), the sound will play with two channels in-game, and you can use your Attenuation ShareSet's Spread curve to spread the channels apart when close to the emitter. When no spread is applied, both channels play from exactly the same position, so the source is perceived as mono, but any filtering, effects, etc. are still applied twice, once per channel.
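To make the Spread behavior concrete, here is a minimal standalone sketch (not Wwise API; the function name and the mapping of spread to an angular offset are assumptions for illustration): at 0% spread, both channels of a stereo source collapse onto the emitter direction and are perceived as mono, even though two voices are still being processed.

```python
def channel_azimuths(emitter_azimuth_deg, spread_percent, max_half_angle_deg=90.0):
    """Return (left, right) playback azimuths for a 2-channel source.

    At spread 0 both channels sit exactly on the emitter direction
    (perceived as mono, but still two voices to filter and process);
    at spread 100 they sit max_half_angle_deg to either side.
    """
    offset = (spread_percent / 100.0) * max_half_angle_deg
    return (emitter_azimuth_deg - offset, emitter_azimuth_deg + offset)

# No spread: both channels coincide with the emitter.
print(channel_azimuths(0.0, 0.0))    # (0.0, 0.0)
# Full spread: channels pushed to either side of the listener.
print(channel_azimuths(0.0, 100.0))  # (-90.0, 90.0)
```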

That said, you could probably play both a 2-channel stereo sound and a 1-channel mono sound on the same game object, use the new Spatial Audio attenuation curves to control which one is audible, and let the virtual voice system make sure only the audible one actually gets processed.
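The selection logic above can be sketched as follows. This is an illustrative standalone example, not Wwise API: the names, the single 0-to-1 occlusion value, and the opposite linear volume curves are all assumptions. The idea is that the two variants get mirrored volume curves driven by one parameter, and any voice whose volume falls to the threshold is sent virtual so it costs no processing.

```python
VIRTUAL_THRESHOLD = 0.0  # volumes at or below this are culled (virtual voice)

def voice_states(occlusion):
    """Map one occlusion value (0 = same room, 1 = fully occluded) to
    {variant: (volume, is_virtual)} for the stereo and mono variants."""
    stereo_vol = max(0.0, 1.0 - occlusion)   # audible in the same room
    mono_vol = max(0.0, occlusion)           # audible through geometry
    return {
        "stereo": (stereo_vol, stereo_vol <= VIRTUAL_THRESHOLD),
        "mono": (mono_vol, mono_vol <= VIRTUAL_THRESHOLD),
    }

# Same room: stereo plays, mono goes virtual.
print(voice_states(0.0))  # {'stereo': (1.0, False), 'mono': (0.0, True)}
# Fully occluded: mono plays, stereo goes virtual.
print(voice_states(1.0))  # {'stereo': (0.0, True), 'mono': (1.0, False)}
```

In Wwise terms, the occlusion input would come from the Spatial Audio attenuation curves (or an occlusion/diffraction-driven RTPC), and the virtual voice behavior is configured per sound in the Advanced Settings.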
by Mads Maretty S. (Audiokinetic) (40.2k points)
edited by Mads Maretty (Audiokinetic)
...