Andrew Souter
Active Member
I ended up getting Breeze 2 to go with Precedence because impatient me couldn't wait. Once you load up instances of both Breeze and Precedence in each channel (and make sure to click the link for both) it completely works. These people are wizards. The combo goes super well with dry samples and physically modeled instruments. I cannot wait until B2 links with Precedence.
Thanks for the compliments! Glad you like them!
The only suggestions I would have for the developer (for each plugin) are to make a context box on the top or bottom that briefly explains what each button does when hovering over it,
Yes, a "help mode" like this would be super useful for all our products. It's definitely on the wish list!
and also to allow automation of placement in Precedence to simulate movement in the spatial environment. But I realize that last request probably wouldn't work without popping and clicking from the algorithm changing in real time. Still... this is amazing.
Automation of Position (i.e. Distance and Angle) is tricky for many reasons:
1) much of the DSP is done as minimally as possible so as to keep CPU usage down, so we don't always include things like delay-line interpolation when delay lengths don't need to change. Without adding it, it is simply not possible to change a delay length smoothly, for example, and there are other cases like this. Adding everything needed to allow smooth automation would increase CPU usage drastically, maybe 4 or 5 times as much as it uses currently.
2) automation with Multi-Instance Editing gets a bit tricky
3) automation with Precedence-Link gets even more tricky because now Breeze must update its entire algorithm in a smooth manner too, and that is VERY CPU intensive. If Precedence is smooth but Breeze is not, it still does not really solve the goal perfectly.
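To illustrate the interpolation point in 1): reading a delay line at a smoothly changing, fractional length means interpolating between stored samples on every single read, which is exactly the per-sample cost a fixed integer delay avoids. This is a minimal, illustrative Python sketch of linear delay-line interpolation, not Precedence's actual DSP:

```python
import numpy as np

def fractional_delay_read(buffer: np.ndarray, write_pos: int, delay: float) -> float:
    """Read from a circular delay buffer at a fractional delay (in samples),
    linearly interpolating between the two nearest stored samples."""
    n = len(buffer)
    d_int = int(delay)          # integer part of the delay
    frac = delay - d_int        # fractional part in [0, 1)
    i0 = (write_pos - d_int) % n
    i1 = (write_pos - d_int - 1) % n
    # The interpolation adds an extra read, multiply, and add per tap,
    # on every sample, whether or not the delay is currently moving.
    return (1.0 - frac) * buffer[i0] + frac * buffer[i1]

# Sweep the delay smoothly from 100 to 200 samples across a block:
buf = np.sin(2 * np.pi * 440 / 48000 * np.arange(4096))
out = [fractional_delay_read(buf, 4095, d) for d in np.linspace(100.0, 200.0, 64)]
```

With an integer delay, the read collapses to a single array lookup; multiply that extra interpolation cost across every delay line in a dense reverb topology and the CPU increase the developer describes follows.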
Probably we will choose to make a dedicated motion-FX/Doppler kind of plug-in derived from Precedence that is designed for smooth automation and extreme motion FX. Such a plug-in would likely be more limited in its usage than Precedence, where we might expect to have an instance on almost every track in a project. So Precedence would remain lightweight and focused on positioning, and the derivative product would handle special effects if/when needed.
EDIT: Also the option of sound direction for each instrument like MIR Pro would be nice. Not sure how feasible that is or even how beneficial it would be.
I have read academic papers espousing the importance of directivity in instrument models, but as a composer/sound-designer/mix-engineer, I am not 100% convinced it is particularly helpful or meaningful to include in applications targeted at making music sound good. Directivity of an instrument would more or less translate to variable energy/gain at the microphone pickup, in a manner similar to microphone directivity patterns. I am not sure that is musically desirable. It seems rather annoying, actually, to have to adjust the gain of a track again just because you rotated the direction of your instrument relative to the virtual microphone location. That is my intuition at the moment; perhaps it will change over time.
In theory I suppose this would also have some effect on things such as the direct-to-reflected energy balance. It would be logical to assume the reflected energy is close to omnidirectional, so if an instrument were highly directional and pointing away from the microphone, the direct signal would be attenuated drastically while the reflected signals would be mostly unaffected.
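As a rough sketch of that idea: a first-order microphone-style pattern can model the direct-path attenuation while the reflected path stays near omnidirectional. This is illustrative Python only, not anything from MIR Pro or our products; the pattern parameter `p` is a hypothetical knob (1 = omni, 0.5 = cardioid-like, 0 = figure-8):

```python
import math

def directivity_gain(angle_deg: float, p: float = 0.5) -> float:
    """First-order directivity pattern applied to the DIRECT path only.
    angle_deg is the source's rotation away from the listener axis."""
    theta = math.radians(angle_deg)
    return abs(p + (1.0 - p) * math.cos(theta))

# Direct path drops as the instrument turns away from the mic;
# reflected energy, averaged over all directions, stays roughly constant.
direct_on_axis = directivity_gain(0.0)     # full level facing the mic
direct_off_axis = directivity_gain(180.0)  # heavily attenuated facing away
reflected = 1.0                            # roughly omnidirectional
```

The practical upshot is exactly the post's point: rotating the source mostly just re-weights direct versus reflected energy at the pickup.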
In terms of more relevant musical psychoacoustic cues this would effectively mean:
1) more initial density/diffusion
2) higher wet/dry mix value
3) probable reduction in Pre-delay
4) potential "instantaneous diffusion" and increase to "audio source width" due in part to all of the above.
5) amplitude modulation if position/rotation is changing such as string players swaying to the music
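For example, cue 2 falls out directly: if rotation attenuates only the dry path while the reflected energy is unchanged, the effective wet/dry balance shifts wetter. A toy calculation with hypothetical numbers, not the plugins' internal math:

```python
def effective_wet_ratio(dry_gain: float, base_wet: float = 0.3) -> float:
    """Effective wet fraction after the direct (dry) path is attenuated
    by source rotation while the reflected (wet) energy stays constant.
    base_wet is an assumed nominal wet mix; purely illustrative."""
    dry = (1.0 - base_wet) * dry_gain
    wet = base_wet
    return wet / (wet + dry)

# Facing the mic: balance stays at the nominal mix.
# Turned mostly away (dry attenuated to 25%): the mix reads much wetter,
# even though no reverb parameter changed.
on_axis = effective_wet_ratio(1.0)
off_axis = effective_wet_ratio(0.25)
```

Which is the argument for controlling the wet/dry result directly rather than driving it indirectly through a rotation parameter.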
Precedence and Breeze already allow you to make adjustments to all of these in a more direct fashion. This is what listeners will experience. They will not have any idea whether an instrument is rotated or not. So at the moment I think working directly in the domain of the psychoacoustic RESULT is more powerful and intuitive than working one level up at (one of) its CAUSE(s).
To illustrate: ask yourself, will you ever think "I want to rotate the second violin to point 30 degrees away from the mic and 60 degrees up towards the sky/ceiling"? Probably not. But you may very well think "I want more density/diffusion, a wetter result, less pre-delay, more width, more or less modulation", etc. These things are easier to think about and experience directly, IMHO.
That's my current thinking at least. "Directivity" is still an interesting academic topic of research however, so maybe my opinion will change eventually... we'll see.