Creating Depths with BREEZE2

Discussion in 'Mixing, Post-Production, and Effects' started by Beat Kaufmann, Aug 29, 2018.

  1. Beat Kaufmann

    Beat Kaufmann Active Member

    Last edited: Aug 30, 2018
  2. pipedr

    pipedr Member

    Apr 19, 2018
    Thanks for the tutorial.

    When using Breeze2 to create the soundstage, do you need another tool for left/right placement, like Virtual Sound Stage or Panagement or some other stereo tool?
  3. OP
    Beat Kaufmann

    Beat Kaufmann Active Member

    Audio channels routed into a BREEZE2 instance (inserted on a group track)
    No, left and right you can set with the balance control of each audio track in the DAW. You can then collect some instruments in a group track and route it further into a group channel called, say, "Mid Depth". 2-4 such different depths are usually enough to simulate a symphony orchestra convincingly in depth.

    If you have a single instrument that claims a BREEZE2 for itself, I would insert an effect before BREEZE that allows you to set L <> R. Most DAWs provide such stereo helpers, and there are also many freeware tools. This ensures that the signal appears in the room naturally. Setting the L <> R position after the reverb would sound unnatural because the reverb (the room information) would be panned too.
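    As a sketch of why the order matters: a pan placed before the reverb moves only the dry signal, so the reverb then "sees" the instrument already at its position and builds the room around it. The function below is purely illustrative (a generic constant-power pan law, not anything from BREEZE2):

```python
import math

def constant_power_pan(sample: float, pan: float) -> tuple[float, float]:
    """Place a mono sample on the L/R axis BEFORE it reaches the reverb.

    pan: -1.0 = hard left, 0.0 = centre, +1.0 = hard right.
    A constant-power law keeps perceived loudness stable across positions.
    """
    angle = (pan + 1.0) * math.pi / 4.0   # maps [-1, 1] onto [0, pi/2]
    return sample * math.cos(angle), sample * math.sin(angle)

# Signal chain sketch: pan first, THEN feed both channels into the reverb,
# so the room response stays spread across the whole stereo field.
left, right = constant_power_pan(1.0, -0.5)   # instrument sits left of centre
# reverb_l, reverb_r = breeze2(left, right)   # reverb receives the panned dry signal
```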

    If you then use a BREEZE in the output channel, adding the final "tail over all" on top of the depths, it will also glue the whole mix together.

    You will find what I just described here: (see "reverb concept") Instead of a convolution reverb you can now also say "Breeze2" ;)

  4. Andrew Souter

    Andrew Souter Active Member

    Mar 18, 2009
    Cool info Beat. :2thumbs:

    We've been having a look at this topic ourselves too. :geek:
  5. hdsmile

    hdsmile Member

    Dec 12, 2015
    Thanks Beat for the great tutorial. I would like to try combining EW Spaces 2 and Breeze 2 to get a more natural orchestral sound, and for that I need your professional advice. I don't know whether it is even a good idea to use both convolution and algorithmic reverb, and whether it is worth trying, or if there is a better way?
  6. Divico

    Divico Senior Member

    I'm not Beat, but I can answer, imo. It's common practice here to combine both reverb types. Often people use convolution reverbs, or just their early-reflection part, for positioning the instruments. The algo verb is then used as glue on, e.g., the master bus. This technique is good for a big Hollywood sound. Similar things often happen with real recordings: engineers take recordings with real reverberation and slap some verb on top of it, like the Bricasti.
  7. hdsmile

    hdsmile Member

    Dec 12, 2015
    It would be nice to learn this part in a bit more detail: how to set up Spaces 2, how to position the instruments... and then what settings to use on Breeze 2.
  8. OP
    Beat Kaufmann

    Beat Kaufmann Active Member

    Hello hdsmile
    I'm sorry that I'm late with an answer. I had a lot to do in my studio; customers wanted their recordings before Christmas.

    Convolution reverbs work with recordings of natural spaces, whereas algo reverbs - as the name implies - simulate spaces using algorithms.
    Take advantage of both systems
    Because good impulse responses have so far simulated a natural-sounding acoustic distance better and more naturally than algo reverbs, many use a convolution reverb for this task.
    The disadvantage of convolution reverbs is that they always convolve the music with the same static IR, while in nature the tail partly depends on chance, so it never decays the same way twice. This is where the algo reverbs come in: many can vary the decay of the music with a random generator, which leads to very natural-sounding reverb tails.
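    The difference between the two reverb families can be sketched in a few lines of toy code (this is a conceptual illustration, not how Breeze2 or any real plug-in is implemented): a convolution reverb multiplies-and-accumulates the dry signal against one fixed IR, so the tail is identical on every pass, while an algorithmic reverb can re-randomize its tail each time.

```python
import random

def convolution_reverb(dry, ir):
    """Convolve the dry signal with a fixed impulse response.
    The same IR always yields the same, static tail."""
    out = [0.0] * (len(dry) + len(ir) - 1)
    for i, x in enumerate(dry):
        for j, h in enumerate(ir):
            out[i + j] += x * h
    return out

def algo_tail(length, decay=0.8):
    """Toy algorithmic tail: exponentially decaying noise.
    The random generator makes every rendering slightly different,
    like the 'coincidences' of a real room's decay."""
    return [random.uniform(-1.0, 1.0) * decay ** n for n in range(length)]
```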

    Although building the space with two different reverb types is more complex, it can lead to very natural overall results. See >>> here as well.

    Besides EAReverb2 and Breeze2, I do not know of any other algo verb with which you can push instruments into the depths so nicely. That's why I made this demo with Breeze2.

    By the way: with convolution reverbs you have to audition the IR libraries carefully. Not every IR does a good "pushing job". Find those IRs that push the instruments farthest back while "discolouring" the sound as little as possible. I would not pay attention to the names of the IRs but only to the sound and the effect.

    In this context, I also recommend reading this fundamental article on reverb.


    Of course, it is not necessary to create a separate room depth for each instrument. It is usually sufficient to create 2-5 different room depths, depending on the orchestra size. A good way is to make 4-5 bus channels, each simulating its own room depth with the necessary plug-ins. After that you can send, for example, all the woodwinds through the bus that provides room depth 2, all the strings through depth 1, the brass through 3 and the percussion through 4.
    See >>> these videos (1, 2, 3)
    BTW: Left and right are adjusted with the balance of each instrument.
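    The bus layout above can be sketched as a small routing table (the bus names and section-to-depth assignments here are just example choices, not fixed rules):

```python
# Hypothetical mapping: orchestral section -> depth bus (1 = front row).
DEPTH_BUS = {
    "strings":    1,
    "woodwinds":  2,
    "brass":      3,
    "percussion": 4,
}

def depth_bus_for(section: str) -> str:
    """Return the name of the group channel a section is routed to."""
    return f"Depth {DEPTH_BUS[section]}"
```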
    Keep in mind, however, that instruments further away sound more mono and less brilliant than those that play at the front of the virtual stage. If you want to do all these things well, it is not as easy as it looks at first glance ...
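    Those two distance cues (narrower stereo image, less brilliance) can be approximated with a mid/side width control plus a simple one-pole low-pass. This is a minimal sketch with made-up coefficients, just to show the idea:

```python
def apply_distance(left, right, depth):
    """Simulate distance cues on a stereo signal.

    depth: 0.0 = front of the virtual stage, 1.0 = far back.
    Far instruments get a narrower (more mono) image and a darker tone.
    """
    width = 1.0 - depth          # shrink the side signal with distance
    alpha = 1.0 - 0.8 * depth    # smaller alpha = duller low-pass
    out_l, out_r, lp_l, lp_r = [], [], 0.0, 0.0
    for l, r in zip(left, right):
        mid = 0.5 * (l + r)
        side = 0.5 * (l - r) * width
        nl, nr = mid + side, mid - side
        lp_l += alpha * (nl - lp_l)   # one-pole low-pass, left channel
        lp_r += alpha * (nr - lp_r)   # one-pole low-pass, right channel
        out_l.append(lp_l)
        out_r.append(lp_r)
    return out_l, out_r
```

With depth = 1.0 the side signal vanishes and both channels become identical (fully mono); with depth = 0.0 the signal passes through unchanged.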

    I hope that helped so far.

    All the best Beat
  9. hdsmile

    hdsmile Member

    Dec 12, 2015
    Dear Beat, I'm very grateful for your detailed response, thank you very much! I always read your comments because their content resonates with me. I need a bit of time to digest this valuable information :)
