
How much out of phase is okay?

mrd777

Active Member
Hi everyone,

So I'm checking out Cinematic Studio Strings, and Ozone shows its samples are out of phase. I'm sure this is because of the baked-in hall sound, not the actual instrument that is mainly heard.

That leads me to ask, how much out of phase (below the 0 point) is okay for you? Is it just a matter of personal taste for when you bring it to mono, it still sounds decent enough to your ears?

What is your thought process when it comes to how far toward negative phase your track, or certain instruments, can sit? Perhaps there are some non-important elements which can live in stereo, and you don't care much about them in mono?

Just looking for general thoughts on how *you* handle phase.

Thanks!
Dave
 
When you say "out of phase" do you mean that you are seeing the channels are uncorrelated, or are you seeing them inversely correlated?

I think if a meter is showing things getting uncomfortably close to inversely correlated, then you'll definitely want to check the mix in mono and see if it sounds like what you want, and at the levels you want.
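
A correlation meter is essentially reporting a normalized cross-correlation between the two channels. As an illustration (a minimal numpy sketch of the idea, not how any particular meter is implemented):

```python
import numpy as np

def phase_correlation(left, right):
    """Normalized correlation between L and R, roughly what a phase
    meter reports: +1 = identical channels (dual mono),
    0 = decorrelated, -1 = polarity-inverted."""
    den = np.sqrt(np.sum(left**2) * np.sum(right**2))
    return np.sum(left * right) / den if den > 0 else 0.0

t = np.arange(48000) / 48000
sig = np.sin(2 * np.pi * 440 * t)
rng = np.random.default_rng(0)

print(phase_correlation(sig, sig))    # dual mono: +1
print(phase_correlation(sig, -sig))   # one channel inverted: -1
print(phase_correlation(rng.standard_normal(48000),
                        rng.standard_normal(48000)))  # independent noise: near 0
```

Real meters compute this over a short sliding window, so the reading wobbles over time rather than giving one number per track.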

For me personally, I think listening for phase issues is more important than a meter on the master bus, because the meter would only show phase properties of the entire mix. One instrument that has phase problems wouldn't show up on the bus meter.

I'm not an expert but here are some thoughts about phase issues I've experienced. These days I always take a few passes over my songs with the specific purpose of listening for these phase issues, listening in both stereo and mono:

(1) Sometimes there are comb-filtering / chorus-like artifacts. If they are audible in mono but not in stereo, they may be caused by delays between L and R, or by spaced-microphone IRs. One time I had used a true-stereo convolution reverb; switching it to mono-in/stereo-out solved it. Other times it happened when I tried delay panning of any kind, which introduces small delays between the L and R channels that sound like a subtle, unintended chorus or flanger.

One caveat is that sometimes room acoustics can hide such phase issues. I found these issues show up best in mono on headphones, which takes the room out of the equation.
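
To make the comb-filtering point concrete, here's a small numpy sketch (with a hypothetical 1 ms inter-channel delay, the kind of offset delay panning might introduce):

```python
import numpy as np

fs = 48000
delay = 48                 # 1 ms delay between L and R (hypothetical)
t = np.arange(fs) / fs

def mono_level(freq):
    left = np.sin(2 * np.pi * freq * t)
    right = np.roll(left, delay)      # R is a 1 ms delayed copy of L
    mono = 0.5 * (left + right)       # fold the pair down to mono
    return np.sqrt(np.mean(mono**2))  # RMS level of the mono sum

print(mono_level(250))    # partial cancellation
print(mono_level(500))    # first comb null at 1/(2 * 0.001 s) = 500 Hz: silence
print(mono_level(1000))   # back in phase: full level again
```

In stereo you mostly perceive the delay as width; in mono the delayed copy cancels specific frequencies entirely, which is why the artifact only jumps out in the fold-down.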

(2) Sometimes the phasey problems were not related to L vs R. It's happened to me when I tried elaborate bus routing for parallel FX and, maybe, plug-in delay compensation didn't work correctly. Another time this phasey sound happened was when double-tracking by copy-pasting with virtual instruments, like layering drums or crossfading between sampled layers. The solution is to make the multitracking more real and authentic, i.e. recording separate MIDI performances when possible, avoiding copy-paste, or changing the instrumentation/samples.

(3) Weird volume differences in mono compared to the stereo mix. This usually happens with a centered solo instrument like voice or guitar after I've added some stereo processing to it, so that instrument becomes noticeably louder or quieter in mono. Often I've just tolerated this because I strongly preferred the stereo image I got from the FX, and mixed the instrument a little on the quiet side as a compromise. Other times I didn't mind narrowing the stereo image, since reducing it avoids the dramatic volume difference between stereo and mono.
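
Point (3) is easy to demonstrate: a "widener" that works by delaying one channel (a Haas-style trick, just as one example - real plug-ins vary) costs a centered source about 3 dB when summed to mono:

```python
import numpy as np

fs = 48000
rng = np.random.default_rng(1)
dry = rng.standard_normal(fs)          # noise as a stand-in for a centered source

def rms(x):
    return np.sqrt(np.mean(x**2))

# identical L/R (true center): the mono fold keeps full level
print(rms(0.5 * (dry + dry)) / rms(dry))   # 1.0

# "widened": right channel is a 10 ms delayed copy of the left
right = np.roll(dry, int(0.010 * fs))
mono = 0.5 * (dry + right)
print(rms(mono) / rms(dry))                # ~0.707, i.e. about -3 dB in mono
```

For broadband material the delayed copy is essentially decorrelated from the original, so the mono sum loses roughly 3 dB relative to the dry signal - exactly the "quieter in mono" effect described above.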

None of these phase issues really show up on a phase meter...
 
Hi Shawn,

Thanks for replying.

Actually I just throw an Ozone Imager plugin on the master to check my phase, then solo the tracks one by one to check which ones might be causing issues, and then I correct them using image-correction plugin(s).

So when you say you only see the master phase, I am not sure why you can't view the phase of each track alone?
 

Yeah good point. I was just assuming the context of putting the meter on a bus was to visualize the entire mix on the meter. You are totally right that it's easy to meter individual tracks too.

But still, my feeling is that visualizing the phase of one track won't tell the whole story. One track can have wet/dry mixes of various FX, or maybe a multi-mic mixdown from the virtual instrument itself, and that's more than enough to make the track appear nicely decorrelated while one part of it still manifests phase problems.

I'm completely unaware of "image correction" plugins, can you name a few? I'd be interested to see what options they have and how they work.

Cheers!
 
Check a multiband correlation meter (I recently got MAAT multiCORR for that, but there is also freeware from Voxengo) - it can show exactly which part of the spectrum the phase issues are in. Still a lot of manual work afterwards, though a bit less guesswork.
 
Another thought - I don't think mild negative correlation by itself is a bad thing. Strong negative correlation is of course not desired, because that implies the volume will be severely reduced in mono.

But if things are roughly centered around 0, that just means the L and R channels are decorrelated, and it's OK if it dips into negative.

Here is a YouTube video I uploaded a while ago for other reasons, but for this conversation it's interesting to look at the phase correlation. In particular, correlation regularly dips into the negative on reverb tails when the instruments are not playing.

And on the other extreme, strong positive correlation can be a bad thing too, depending on the situation. For example, if every other track is mostly decorrelated except for one track with strong positive correlation, then in mono that one track would likely sound 2-3 dB louder than all the others.
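
The 2-3 dB figure can be sanity-checked numerically (a sketch using noise as a stand-in for program material):

```python
import numpy as np

rng = np.random.default_rng(2)
n = 48000

def rms(x):
    return np.sqrt(np.mean(x**2))

# correlated track: the same signal on L and R -> no level change in mono
c = rng.standard_normal(n)
corr_ratio = rms(0.5 * (c + c)) / rms(c)            # 1.0

# decorrelated track: independent L and R at equal level
l, r = rng.standard_normal(n), rng.standard_normal(n)
decorr_ratio = rms(0.5 * (l + r)) / rms(l)          # ~0.707

print(20 * np.log10(corr_ratio / decorr_ratio))     # ~3 dB gap in mono
```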

Thoughts?
 

Yeah that's a good point about the tracks being relative, otherwise one gets too loud.

When I mention image-correction plugins, I'm referring to something like Ozone's Imager, where you can adjust width over a range of frequencies. For example, you can make the bass region at 140 Hz or lower fully mono. Or you can do something like widen all frequencies at 10-12 kHz.

The point is that you can fix phasing issues using a tool like this to target the problem areas.
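
As an illustration of that kind of frequency-dependent width move (a crude FFT brick-wall sketch, not Ozone's actual algorithm), collapsing the Side channel below a cutoff rescues an out-of-phase low end without touching the rest:

```python
import numpy as np

fs = 48000

def mono_below(left, right, cutoff_hz):
    """Zero the Side channel below cutoff_hz via an FFT brick-wall split."""
    mid = 0.5 * (left + right)
    side = 0.5 * (left - right)
    spec = np.fft.rfft(side)
    freqs = np.fft.rfftfreq(len(side), d=1 / fs)
    spec[freqs < cutoff_hz] = 0            # no stereo content in the lows
    side = np.fft.irfft(spec, n=len(side))
    return mid + side, mid - side

t = np.arange(fs) / fs
# anti-phase 60 Hz bass under an in-phase 1 kHz tone: correlation suffers
left = np.sin(2 * np.pi * 60 * t) + 0.5 * np.sin(2 * np.pi * 1000 * t)
right = -np.sin(2 * np.pi * 60 * t) + 0.5 * np.sin(2 * np.pi * 1000 * t)

fl, fr = mono_below(left, right, 140)
corr = np.sum(fl * fr) / np.sqrt(np.sum(fl**2) * np.sum(fr**2))
print(round(corr, 3))   # with the anti-phase bass gone, correlation recovers to ~1
```

A real imager would use gentler crossovers, but the principle is the same: the problem band is narrowed or mono'd while everything else keeps its width.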

Also, yes, the reverb tails always seem to have phase issues. I noticed this in the Cinematic Studio Strings library, and that's when I started to wonder what degree of being out of phase is okay. It must be OK to some degree, seeing that professional sample-library producers ship samples with some out-of-phase content.
 
It's very common when recording ensembles in ambient halls to have the phase meter dip in and out of positive and negative correlation. The decorrelation is part of what gives the sound width and spaciousness.

I'd avoid relying too much on meters for analysing every track in your mix for phase 'issues'. Helpful in some circumstances, but generally go with what your ears are telling you. Good headphones can also be helpful in this regard, especially if your speakers are set up in a less than ideal acoustic environment.
 

Good point. I guess I'll keep switching between mono/stereo to ensure mix compatibility, which would be the main thing to worry about - i.e., listening with your ears.
 
Mono compatibility....is KEY to mix translation, because everyone who isn't listening on headphones IS listening to some percentage of mono collapse. It's just the way two speakers in a room work. So, getting a recording where there's as little change as possible in levels and frequencies when collapsed is key to that. Otherwise--one stereo system will show "the issue" more than another one....thus it starts sounding different in BAD ways on different systems.

Too many people want to say "use your ears". That's the kind of thing that's fine for an experienced engineer, who isn't likely the one asking this question. But why would anyone KNOW what phase incoherency sounds like, or understand the above? No....check meters. If anything, meters for such things aren't sensitive enough--because things like high frequencies aren't as "loud" by volume, but if they're cancelling in mono, you WILL know in terms of translation.

So, an acoustic piano mic'd in stereo or with three mics always drifts as chords sustain. Samples, not as much....better samples still will, because they model or sample the dynamic resonances....anyway--other than that, the answer is "nearly none" or "as little as earthly possible" IMO/E.

If there's tangible phase issues with the recording of SAMPLES....get better samples. Preach it from the highest mountain--SHAME that company into hiring an audio engineer next time.

If I make the assumption that you're starting out here (forgive me if this isn't the case): what you're listening for is a tangible difference in volume or tone when you collapse to mono. There will always be SOME changes, as is the nature of stereo sound....but you want to ensure, for example, that all the high frequencies don't cancel from your strings, because that will make your strings disappear or lack definition on some STEREO playback systems. Largely, you want to make sure that the overall balances remain the same - that everything important is fully, clearly audible.

I say the last part because I think it's not really helpful to tell people to "check their mix in mono" without some guidance as to what phase cancellation sounds like....and what issues it will cause in translation. It's well intended, and by the book correct, to just say "check your mix"....and also "use your ears"---except I'm not there to say "hear THAT....compare to THIS"....and without anyone showing you that, I'd actually RATHER you look at the graph. Unlike many other new digital visualizations, these (goniometers) have been in use in studios my whole life. There's a reason.

/novelMode off
 
Hi Jamie,

Thank you for that insight.

Have you checked any of your sample libraries to see if they dip out of phase? I would love to get your thoughts on something like that. I see the phase dipping below 0 on orchestral libraries such as Cinematic Studio Strings and Native Instruments Symphony Series Brass.

I'm checking with Ozone Imager, and I'm not adding any plugins or tweaking the library out of the box.
 
I can say I played CSS for all of 10min and wondered why it got released. So, that doesn't surprise me.

There's a lot of "tricks" that instrument makers can use that ruin mono compatibility in the recording....tricks like changing the apparent ensemble size by time-delaying the same sample and moving it elsewhere in the stereo spectrum, which gives the immediate impression that two players are "over there"...but collapse it to mono, and it's likely going to sound like a synthesizer made it--because it won't sound like one player OR two.

On some level, phase gets complicated when you start talking about including room mics of the same source....BUT....engineered (and edited, in the making of the instrument) properly, that shouldn't cause an issue. It's always a relationship that's not perfect in terms of collapse--like any stereo-mic'd anything....but that meter going below 0 shows there's likely an issue. If they give you any control over "width" anywhere in the UI, reduce it while it plays and see if the phase correlation meter goes back up.
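
That last width-vs-meter check is easy to see in numbers (a mid/side sketch with made-up levels, not any particular library's mic setup):

```python
import numpy as np

rng = np.random.default_rng(3)
n = 48000
mid = rng.standard_normal(n)          # the "center" content
side = 1.5 * rng.standard_normal(n)   # overly wide: side louder than mid

def correlation_at(width):
    left = mid + width * side
    right = mid - width * side
    return np.sum(left * right) / np.sqrt(np.sum(left**2) * np.sum(right**2))

for w in (1.0, 0.5, 0.0):
    print(w, round(correlation_at(w), 2))
# as width shrinks, correlation climbs from negative back toward +1
```

When the Side channel carries more energy than the Mid, correlation goes negative; turning the width down restores it, which is exactly what watching the meter while riding the width control shows.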
 
I can definitely turn off "stereo" mics on these libraries to get phasing under control. I was just kind of confused when I saw how these libs were phasing out of the box with their stereo image.

I guess all of this leads me to conclude, they are making the image very wide out of the box in order to sell to less experienced people?
 

Perhaps some do, but most do not. To reiterate, an ensemble captured in a reverberant space using the kind of spaced arrays common to film scoring and some classical work will often dip into negative correlation on a phase meter. It's not necessarily a problem in the grand scheme of things - rely on your ears to tell you that.
 

That makes sense as well. However, it kind of contradicts what JamieLang is saying. He said

"that meter going below 0 shows there's likely an issue"

I'd like to know what JamieLang thinks about your statement?
 
I say the last part because I think it's not really helpful to tell people to "check their mix in mono" without some guidance as to what phase cancellation sounds like....and what issues it will cause in translation. It's well intended, and by the book correct to just say "check your mix"....

I guess this was directed at me since I'm the one that said "check in mono" on this thread. I actually agree with the sentiment of not just "saying the right thing but not helping", but I hope my posts were more helpful than that... I actually did try to list out many examples of phase problems I've personally encountered, what it sounds like, and how I fixed it.

I'm also interested to hear your thoughts, since I also felt some negative correlation is still totally ok (see my thoughts about it in previous posts)

Cheers!
 
FWIW - I wasn't directing anything at anyone specifically. While you may have said "use your ears" in this particular thread, it's a common refrain on many engineering-related topics all over the interwebs. It's akin to "there are no wrong answers in art" or "the song is more important than the sound"....all true on some level, and not helpful to the specific situation of someone wanting to understand better.

I'm more a fan of the old "you can't break the rules until you learn them by heart". :)

I don't think stereo mics will dip a correlation meter--at least not for any significant time. Not stereo mics on a tree or any typical orchestral recording technique. Some back-of-the-hall spaced pair of ambience mics might cause issues in the way a modulated digital reverb will--but those are so low in the mix that I wouldn't think, in context, those elements are enough to drop into the negative.

As to speculation on why devs do it--I wouldn't go for the cynical take that they're trying to appeal to users who don't "know better"...so much as that they're music lovers without much experience engineering audio--or making the instruments, because the editing of the samples is ALSO where things can go wrong: even if you hire the best engineer in the world and they record perfectly solid, phase-coherent 6-mic samples, little non-uniform shifts in timing during editing can actually make it worse. Many of these "Kontakt devs" are small shops. Fewer checks and balances. Smaller budgets for audio engineers.
 
If it wobbles around 0, it's usually fine and just really wide. If it's consistently in the -1 to 0 range, you might be a little too wide. Sometimes you'll always be in the minus because of one specific element. Check the mix in mono and see if something disappears. If something seems to always be in the negative, check on headphones and see if it starts sounding weirdly 2D, which is the issue when you start being out of phase.

Of course there's a matter of taste, but when it gets too wide you start losing imaging and it just sounds weird, and the space almost starts feeling smaller. If you feel that, it's gone too far.

Again use your ears but these pointers might help.
 
Is it OK to have some particular virtual instruments showing phase between 0 and -1, considering the peculiarities of the instrument? I mean instruments that don't intend to be "realistic", like some Labs pads and Olafur Arnalds. They seem to be deliberately out of phase - is that correct?
 
As I keep asking when similar questions pop up: who cares about mono? I don't have a single mono device in my life. Even my smartphone has built-in stereo speakers.
 