# Mac Pro late 2013



## fraz (Oct 29, 2018)

Hi,

With this machine there are two FirePro D500 GPUs. Will this machine be capable of driving a 34" widescreen at 3440 x 1440? I'll probably use an LG, with or without Thunderbolt, as it's possible to use DisplayPort on one of the Thunderbolt buses.

Some clarity from some experienced users would be handy.

I assume either of the following connections will work to connect the Mac Pro:

Mini DisplayPort on the Mac > HDMI on the monitor
Mini DisplayPort on the Mac > DisplayPort on the monitor

I'll be looking to use the same monitor on a PC as well. Some feedback appreciated - thanks.


----------



## babylonwaves (Oct 29, 2018)

fraz said:


> With this machine there are two FirePro D500 GPUs. Will this machine be capable of driving a 34" widescreen at 3440 x 1440? I'll probably use an LG, with or without Thunderbolt, as it's possible to use DisplayPort on one of the Thunderbolt buses.


yes


----------



## fraz (Oct 29, 2018)

OK, thanks - I get awfully confused going from one tech aspect to another.

Sometimes DisplayPort provides a better connection / resolution - please correct me if I'm wrong!

I'm forward planning - I've got 2 x 24" monitors at 2560 x 1440, which are very useful.

From the Mac Pro, Mini DisplayPort can connect to HDMI or DP. Is there any advantage to HDMI vs. DisplayPort?

What would the limit be for the FirePro D500s? I did look on AMD's website but drew a blank.

Anyhow, it's good to know it will work.

With PC hardware (GPUs etc.) I just check the specs to see the supported resolutions. Any useful info most welcome - thanks.


----------



## Wunderhorn (Oct 29, 2018)

You can run up to 3 screens at full 4K with the Mac Pro 2013.

BTW, the Mac Pro has an HDMI port, but I would always prefer DisplayPort.


----------



## Nick Batzdorf (Oct 29, 2018)

My understanding was that macOS doesn't support 4K at 60Hz over HDMI, only over DisplayPort.

Has that changed?


----------



## Nick Batzdorf (Oct 29, 2018)

I guess it depends partly on the model:

https://support.apple.com/en-us/HT206587


----------



## fraz (Oct 30, 2018)

The Mac Pro I got has the D500 GPUs. I think I'd favor one big 34" display at 3440 x 1440, or the 4K equivalent when they're available. I suppose a 34" is like 2 x 17", but at 3440 x 1440 there would be a lot of screen space at higher resolution.

I suppose Thunderbolt 2 could be used (LG), or a DisplayPort connection. Would the Mac Pro handle this? It would be one connection, presumably on only one of the two FirePro D500 GPUs.

Based on the comments above about 3 x 4K displays: a 3440 x 1440 34" is lower resolution than 4K, so it should work? Please comment - thank you.


----------



## Saxer (Nov 1, 2018)

Works fine here


----------



## Wunderhorn (Nov 1, 2018)

fraz said:


> Based on the comments above about 3 x 4K displays: a 3440 x 1440 34" is lower resolution than 4K, so it should work? Please comment - thank you.



Yes, it will work. I am running a 40" 4K screen at 3840 x 2160 over DisplayPort at 60Hz, plus a 30" at slightly less (soon to be replaced by another 40"+). You can run three of those 3840 x 2160 screens.

You cannot get 60Hz over an HDMI connection on the Mac Pro - its HDMI port is too old.


----------



## samphony (Nov 2, 2018)

I use one 40” 4K and one 32” 4K display with my Mac Pro without any problems. There is no need to use the HDMI 1.4 port other than to attach a screen for video playback.

But if you need HDMI 2.0, which supports 60Hz at 4K, you can get a DisplayPort to HDMI 2.0 adapter. Again, though, that's not necessary, as most displays come with DisplayPort built in.


----------



## Nick Batzdorf (Nov 2, 2018)

Nick Batzdorf said:


> My understanding was that macOS doesn't support 4K at 60Hz over HDMI, only over DisplayPort.
> 
> Has that changed?



The answer is yes, it's changed - if you have the right card.

On the assumption that I'm going to need to update to the latest macOS sooner rather than later, and that it's only going to get more expensive, I picked up an MSI Gaming Radeon RX 560 128-bit 4GB GDDR5, which is the least expensive video card Apple approves for running macOS 10.14 Mojave on a 5,1 Mac Pro.

I'm running 4K @ 60Hz on my second monitor over HDMI as I post this. It's sort of academic, since the monitor is a TV hanging five feet away and I only use it at 1080p. But it can do it.


----------



## Nick Batzdorf (Nov 2, 2018)

Hah. It also runs 4K on my 30" Apple Cinema Display vs. its 2560 x 1600 standard max resolution (over the DVI-D connection).

The picture is too small for my taste, but maybe it'll come in handy sometime.

Coolio.


----------



## Nick Batzdorf (Nov 2, 2018)

Update, since everyone is waiting with bated breath for this post:

You have to do some finagling to force the 30" Cinema Display into 4K resolution. It involves using SwitchResX instead of the built-in control panel, and you have to set the second display to 4K first. Then you can switch the Cinema Display to 4K and the second one to whatever you want.

But it is kinda cool. I'm going to try working with my main monitor at 4K, moved closer, and see how I like it.


----------



## Dewdman42 (Nov 4, 2018)

I would very much like to hear how you like 30" at 4K. I'm trying to decide whether to get a 32" 2K or 4K monitor, and it really depends a lot on how close it needs to be to my eyes at 4K to be usable.

I also just want to confirm about the RX 560 card - I'm going to get a card pretty soon. The RX 560 sounds like it's probably fine for me too, especially considering the lower power consumption, less fan noise and lower price.

I couldn't care less about gaming, but I will want 2-3 monitors, and I just want to make sure the RX 560 has plenty of power to handle all the redraws on three monitors without any lag. Is there any reason to think the 580 would be needed, other than higher frame rates for gaming?


----------



## Wunderhorn (Nov 4, 2018)

For 4K I would recommend a 40" (or 43") display over anything smaller. Don't be afraid of the screen size - once you have it, you will never want to go back.


----------



## Nick Batzdorf (Nov 4, 2018)

Dewdman42 said:


> I will very much like to hear how you like 30" at 4k. I'm trying to decide whether to get a 32" 2k or 4k monitor and it really depends a lot on how close it needs to be to my eyes at 4k to be usable.



The picture is very small at 4K - usable, but I prefer the larger native resolution of my monitor (even though there's less real estate).

However, you can set any monitor to a lower resolution, i.e. a coarser effective dot pitch - you'll certainly be happy if you set a 32" 4K monitor to 2560 x 1440 (or maybe x 1600; that's the 16:10 native ratio of the 30" Cinema Display, whereas most are 16:9).

Dot pitch calculator here. About 0.25mm is good.
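For reference, dot pitch is just the physical size of one pixel, which follows from the diagonal and the native resolution. A quick sketch of the math behind such a calculator (the 30" Cinema Display figures are the ones from this thread):

```python
import math

def dot_pitch_mm(diag_in, px_w, px_h):
    """Dot pitch in mm: physical pixel size from the diagonal (inches)
    and the native resolution."""
    ppi = math.hypot(px_w, px_h) / diag_in  # pixels per inch
    return 25.4 / ppi                       # mm per pixel

# 30" Cinema Display, 2560 x 1600 native:
print(round(dot_pitch_mm(30, 2560, 1600), 3))  # ~0.252 mm
```

A 32" 4K panel works out to roughly 0.18mm by the same formula, i.e. noticeably finer than the ~0.25mm figure above.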

The question is how big a monitor you want. Read what I say below about my experiment with a 40" monitor.



> but I will want 2-3 monitors...and just want to make sure the RX560 has plenty of power to handle all the redraws on three monitors without any lag or anything...is there any reason to think the 580 would be needed, other than for higher frame rates for gaming?



No lag with two monitors set to 4K. It would almost certainly drive a third no problem, but you may need adaptors - it has one each: DVI-D (dual-link DVI), HDMI, and DisplayPort.

In any case, I'd suggest making sure you can return the card if it doesn't work for you. But I think it will.



Wunderhorn said:


> For 4K I would recommend a 40" (resp 43") display over anything that is smaller. Don't be afraid of the screen size. Once you have it you will never want to go back.



Well, that's subjective. I set up my 40" Samsung last year, and the screen itself was too big for me. The left edge, where a lot happens, was too far, and I just didn't like turning my head that much to see things. You'd think there's no such thing as too much real estate, but for me there was.

I'm much happier with the 30" monitor; I now use the 40" one at 1080p resolution 5' away from me to display things like plug-in interfaces (it also functions as a TV).


----------



## Dewdman42 (Nov 4, 2018)

Nick Batzdorf said:


> The picture is very small at 4K - usable, but I prefer the larger native resolution of my monitor (even though there's less real estate).


How far away is the screen from your eyes?


----------



## Nick Batzdorf (Nov 4, 2018)

It's about 2' away, and I never strain to read text (at 2560 x 1600).

The other factor is that I'm fortunate to have good eyes - no reading glasses or anything.

Thinking about it more, there's another factor: a 32" 4K monitor may look better at 4K. But even my 12-year-old Cinema Display isn't blurry at 3840 x 2160 4K res.


----------



## Dewdman42 (Nov 5, 2018)

Thanks for that info. I'm really surprised to hear the 30" Cinema Display can do 3840 x 2160. That implies it was always a 4K monitor to begin with. Very interesting.

I got in a nerdy mood last night and put together a spreadsheet. I make no claims about its scientific accuracy, but you still might appreciate it.

The idea is: how close does the monitor need to be to avoid squinting, how far away to avoid head turning, what is the sweet-spot distance, and what is the range of that sweet-spot viewing distance?

Some people may tolerate more head turning, and some people may have better bionic vision than me.

But I came up with this:






So first, notice something: the 30-inch Cinema Display at 2K is just about as perfect as it gets in terms of balancing the two issues - the minimum PPI needed to avoid squinting and the maximum size to avoid head turning. The 32" 2K is not bad either, and it's what is currently offered; that's probably my first choice at this time. Notice that you can put a 30" monitor 20 inches from your eyes with still hardly any head turning, and it can be as far away as 30-32 inches without any squinting. That's a great practical distance with great range - a foot of tolerance for the sweet spot.

As it turns out, none of the 4K monitors really work out well in terms of hitting that sweet spot - a PPI low enough to avoid squinting and a monitor small enough to avoid head turning. It's always a compromise: you either put it a few inches closer to avoid the squinting, with slightly more head turning, or a little further away to avoid the head turning, at the expense of slight squinting. The range of ideal viewing distances is severely constrained because of that, whereas the 2K monitors work within a much larger range of viewing distances without any problems. For 4K monitors there is pretty much one exact viewing distance that gets as close to the middle of the sweet spot as possible - not even perfect - and moving the monitor closer or further away makes one of the two issues more pronounced. If you're OK with squinting a little, or with turning your head a little more, it might work out for the benefit of more desktop real estate.

So in general, I'm questioning whether 4K monitors are useful for real computer work, with fonts and whatnot. I'm sure they look absolutely gorgeous for playing games and watching movies, and possibly for video or photo editing - but definitely NOT for reading fonts on screen. I suspect most people just put their face closer to the screen for reading and turn their head more, because apps like Logic and Final Cut really do benefit a lot from having more space to lay out mixers, plugin windows, timelines and so forth.

In general it's just the fonts that are the problem. If the OS scaled font sizes up a bit on 4K monitors, we'd get the lovely sharpness without squinting to read stuff - that would probably be the ideal for the future. But I think it's not that simple, as many apps rely on things lining up the old way and assume everything scales relatively.

30" or 32" at 4K just doesn't work, in my view: your eyes have to be 18-20 inches from the screen to avoid squinting, about laptop distance. They do look gorgeous in terms of sharpness - images in particular will look very nice, and even fonts up close will be sharp like a Retina display. But I would have to squint from 2 feet away or more, and I don't view that as practical for DAW situations. That said, if you can arrange the monitor to sit right around 20-21 inches from your eyes, it actually works out OK - better, in fact, than a 43" 4K at 2.5 feet. If you need it more than 20 inches from your eyes, then the 43" at 2.5 feet is the next best option.

The 43" 4K comes out as the next possible monitor, at approximately 2.5 feet away, but it will result in SLIGHTLY more head turning than the 30" and 32" at their distances. As noted, it needs to be pretty close to exactly 2.5 feet away to hit the sweet spot, with very little variance - and even there, it may involve some slight squinting or slight head turning. Might be worth it for all the real estate; still not sure.

I submit that a 3K monitor should have been invented. 

I've attached the Excel spreadsheet in case it helps anyone.


----------



## Creston (Nov 5, 2018)

Nick Batzdorf said:


> It's about 2' away, and I never strain to read text (at 2560 x 1600).
> 
> The other factor is that I'm fortunate to have good eyes - no reading glasses or anything.
> 
> Thinking about it more, there's another factor: a 32" 4K monitor may look better at 4K. But even my 12-year-old Cinema Display isn't blurry at 3840 x 2160 4K res.



Apple's 30" screen can do 4k?!


----------



## Nick Batzdorf (Nov 5, 2018)

Dewdman42 said:


> Thanks for that info. I'm really surprised to hear the 30" Cinema Display can do 3840 x 2160. That implies it was always a 4K monitor to begin with. Very interesting.



No, it's a 2560 x 1600 monitor that's capable of displaying 3840 x 2160 - which is a distinction that probably has a difference. That's what I meant when I said that real 4K monitors probably display 4K more clearly.

For reference, I'd have to sit 12" from this monitor - ridiculously close - to be sort of comfortable with the fonts at 4K resolution. Again, that's *this* monitor.



Creston said:


> Apple's 30" screen can do 4k?!



It can if you use SwitchResX, which "unlocks" resolutions the monitor is capable of displaying but the system doesn't offer.


----------



## Nick Batzdorf (Nov 5, 2018)

Dewdman42 said:


> I got in a nerdy mood last night and put together a spreadsheet which I make no claims about scientific accuracy, but you still might appreciate it.....



The X factor is that we're not all the same person. Lots of people use 40" or larger TVs at 4K and are happy - for example Wunderhorn on this page. Just because I don't like it doesn't mean you won't!

A couple of other points:

- The edges of my monitor seem to be about 40˚ from my nose (I picked up a clear protractor and held it up to where I see the edges).

- I checked, and I normally move my nose 1/4" to see the edges of the 30" screen. That goes up to about 1" for a 40" one. So this is pretty subtle!

- Samsung and others sell 27" or maybe 28" 4K monitors that are intended to be used for computers. I've seen one at Costco for under $300. My hunch is that they display fonts very well.

- My 40" Samsung TV - not a computer monitor - has very slight color blurring (whatever the technical term is) around the edges of letters, while the Cinema Display doesn't. The Cinema Display is generally easier on the eyes at computing distances - its color temp is warmer, for one, but as with some speakers, it just generally gives you the overwhelming feeling of being right.

That's not surprising, because this monitor was $2500 when I bought it 12-1/2 years ago, plus I had to buy an expensive Gefen box to extend the DVI-D cable.

And it was a bargain. I hope it lasts forever!


----------



## Nick Batzdorf (Nov 5, 2018)

Dewdman42 said:


> I submit that a 3K monitor should have been invented.



You can run 3008 x 1692 or 3360 x 1890 resolution on a 4K monitor, and I'm sure it'll look great. Both are totally workable on my 30" monitor - but those are 16:9 resolutions (like any monitor you get today), while the 30" Cinema is 16:10, so it has black areas at the top and bottom.

Again, SwitchResX is your friend.


----------



## Dewdman42 (Nov 5, 2018)

Nick Batzdorf said:


> The X factor is that we're not all the same person. Lots of people use 40" or larger TVs at 4K and are happy - for example Wunderhorn on this page. Just because I don't like it doesn't mean you won't!


Well, like I said, some people are willing to turn their head more or squint more for the sake of real estate. That is a personal preference, of course. My post wasn't about "liking" or not - only about actual facts related to visibility and head turning due to peripheral vision. Everyone is of course free to squint more or turn their head more as they wish in order to have more pixels on the screen.

The truth is that 4K has a very narrow margin for getting both of those diametrically opposed factors in good shape. This is actually quite evident from the data. How different people compromise on those points is an individual matter.



> A couple of other points:
> 
> - The edges of my monitor seem to be about 40˚ from my nose (I picked up a clear protractor, gripped it where I see the edges).



You can calculate the angle using trigonometry, from the actual distance between your eyes and the screen and the width of the monitor - no angle measuring required. When I place my eyes about 18 inches from my 24-inch monitor, I feel that's about as close as I'd want to be in terms of having to turn my head, and the trig gives me 32.52 degrees. You can plug a larger value into the spreadsheet to see the results if you think you'd be fine with more head turning than that.

If you are two feet away from your 30", I do not think it's 40 degrees, according to the trig - but hey, maybe you don't mind turning your head as much as I do. By the way, the measurement I used is the angle from the nose to ONE side, not both. So if you measured from your nose to BOTH sides with your protractor, that would be a 20-degree number entered into the spreadsheet I shared earlier, which makes sense at 2 feet from a 30" Cinema Display.
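The trig in question is just the arctangent of half the screen width over the viewing distance. A minimal sketch (the 20.9" width is my assumption for a 16:9 24" diagonal; the exact figure depends on the panel's aspect ratio):

```python
import math

def half_view_angle_deg(screen_width_in, distance_in):
    """Angle from the nose to ONE screen edge when sitting centered."""
    return math.degrees(math.atan((screen_width_in / 2) / distance_in))

# A 16:9 24" monitor is about 20.9" wide; from 18" away:
print(round(half_view_angle_deg(20.9, 18), 1))  # ~30.1 degrees
```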



> - I checked, and I normally move my nose 1/4" to see the edges of the 30" screen. That goes up to about 1" for a 40" one. So this is pretty subtle!


I don't see 1" as being much of a problem. Moving up and down is probably more of an issue, due to neck strain. I don't think your situation is 40 degrees if you're only moving your nose an inch - but I love the experimentation!



> - Samsung and others sell 27" or maybe 28" 4K monitors that are intended to be used for computers. I've seen one at Costco for under $300. My hunch is that they display fonts very well.


Sure they do, and you can read them if you are 12-18 inches from the screen or using larger-than-standard fonts. Also keep in mind that a lot of people buy 4K monitors for gaming, and in gaming fonts don't matter - it's all about video image quality, where 4K definitely helps. Also, 4K is a marketing buzzword.



> - My 40" Samsung TV - not a computer monitor - has very slight color blurring (whatever the technical term is) around the edges of letters, while the Cinema Display doesn't. The Cinema Display is generally easier on the eyes at computing distances - its color temp is warmer, for one, but as with some speakers, it just generally gives you the overwhelming feeling of being right.


I agree - that was one fine monitor in its day, and still is. The only complaint I have seen raised about it is a slow response time, which results in a bit of ghosting when you drag windows around, compared to newer models. I don't have one to try myself.


----------



## Dewdman42 (Nov 5, 2018)

Nick Batzdorf said:


> You can run 3008 x 1692 or 3360 x 1890 resolution on a 4K monitor, and I'm sure it'll look great. Both are totally workable on my 30" monitor - but those are 16:9 resolutions (like any monitor you get today), while the 30" Cinema is 16:10, so it has black areas at the top and bottom.



Yes, I think many people who buy 4K monitors do in fact end up running at a lower resolution, which is not always as clear. That is exactly my point!

Your monitor will be sharpest at native resolution; anything else is going to be fuzzier to some degree. But 5K monitors have so much resolution that they can be run below native and still look good. Maybe 4K monitors are also pretty clear when you lower the resolution - I can't say, since I don't have one to try - but when you do that, you are technically compromising the picture.

But I agree, some of those in-between resolutions would be very useful on a 4K monitor if you're already there.


----------



## Nick Batzdorf (Nov 5, 2018)

By the way, with SwitchResX I can run the 30" monitor at 4608 x 2592. Not much use for it, but I could if I wanted to impress chicks.



Dewdman42 said:


> Your monitor will be sharpest at native resolution



I'm not 100% sure about that, at least not at every resolution. Doesn't it just scale everything (if the ratio stays the same)? The dot pitch stays the same.


----------



## Dewdman42 (Nov 5, 2018)

The reason it causes fuzziness is exactly that the dot pitch stays the same. Technically, if you dropped to 1/4 the resolution - so that every logical pixel is drawn as a 2 x 2 block of physical pixels - it would not be fuzzy; it would just be very jaggy and low resolution, of course. But anything in between doesn't divide equally, so you get fuzzy noise. That's probably an oversimplification, but you get the point.
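The "doesn't divide equally" point can be seen in a hypothetical nearest-neighbor sketch: with an integer scale factor every source pixel gets the same number of physical pixels, while a non-integer factor leaves some pixels doubled and others not:

```python
def nn_row(src_w, dst_w):
    """Which source column each destination column samples when a row
    of src_w logical pixels is scaled to dst_w physical pixels
    (nearest-neighbor, floor rounding)."""
    return [x * src_w // dst_w for x in range(dst_w)]

print(nn_row(4, 8))  # 2x scale: [0, 0, 1, 1, 2, 2, 3, 3] - perfectly even
print(nn_row(4, 6))  # 1.5x: [0, 0, 1, 2, 2, 3] - pixels 0 and 2 doubled, 1 and 3 not
```

Real scalers interpolate rather than duplicate, which trades jaggies for blur, but the uneven mapping is the same underlying problem.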

It's very interesting that your Cinema Display is able to shrink things down and create a pseudo 4608 x 2592. I agree it's probably still cramming those into the actual 2560 x 1600 physical pixels. I have SwitchResX and my monitor doesn't do that - to be honest, it might be a feature of your video card rather than the Cinema Display.


----------



## Nick Batzdorf (Nov 5, 2018)

It's the video card. I know because the stock Nvidia card won't do that. The display will show whatever it's fed.

And yeah, the resolution is limited by the dot pitch of the monitor, but if you shrink the picture it doesn't get what I'd describe as blurry (it does if you enlarge it). I suspect a 4K monitor would be sharper, however.


----------



## Dewdman42 (Nov 5, 2018)

If you shrink it down to fit more resolution on your desktop, some pixels just get left out. "Blurry" would not be the right word, I agree: rounding errors cause missing information, as some pixels round down to nothing.

Upscaling in the other direction (i.e., using a lower resolution setting in SwitchResX), the opposite happens: there are rounding errors, and an area that is supposed to be, say, 1.5 pixels wide ends up 2.0 pixels wide after upscaling. It won't cause loss of information, but some features round down and don't get enlarged while others get enlarged more than needed.

The higher the native resolution of a monitor - and in particular the higher the PPI - the better it can do nice things with a lower resolution fed to it by the video card and OS X. Simple as that. A 5K monitor can handle all these lower resolutions much better, because even though there is quantization noise, the PPI is so high that your eyes won't see it as much. To a lesser degree that may be the case with 4K; I've never had one to try. My Sony TV is remarkable at upscaling HD content to 4K, but that kind of smarts also means input lag, which is not good on a computer monitor - so I think monitors mostly don't do fancy upscaling like that unless you're actually watching a video.

My 1920 x 1200 monitor definitely looks fuzzy when I use SwitchResX to lower the resolution. It's still usable, but I'd much rather not use those modes. Even a 4K monitor, while certainly closer and nicer in its ability to upscale than mine, will still have that noise and will not be as crisp and clean as native output. That might be perfectly acceptable for many people; I can't really say. I'd prefer to figure out which monitor I can run at native resolution - hence my spreadsheet. But I guess I'll never know what it's like to run 3K resolutions upscaled on a 4K panel until I get an actual monitor and try it.


----------



## Nick Batzdorf (Nov 5, 2018)

Dewdman42 said:


> My Sony TV is remarkable at upscaling HD content to the 4k TV. But that kind of smarts also means input lag, which is not good on a computer monitor, so I think mostly they don't do fancy upscaling stuff like that unless you're actually watching a video..then its done.



I've long suspected that upscaling is the culprit that causes motion blur when you're displaying 1080p sports on a 4K TV. You're the first person to mention input delay, and I'm sure it's the same thing.

We replaced a broken TV with a 1080p Vizio about three years ago, and it's fantastic. So I bought another Vizio - this one 4K - to replace an old 480p plasma TV in my office/studio, and I had to return it to Costco. There's no way you could watch a basketball game on it.

The 4K Samsung is a lot better - actually really good - but still not as good for sports as the 1080p Vizio. I had no idea what motion blur was, and might not have noticed it, had it not been for the 4K Vizio.

By the way, there are tricks to talking SwitchResX into showing more resolutions than it wants to show. It now has a 7680 x 4320 option! But that's almost a thumbnail, and I had to force-restart to return to normal resolution.

Fun with monitors. Yippee!


----------



## Dewdman42 (Nov 5, 2018)

hehe cool...

Most TVs these days have a game mode that basically turns off a lot of the fancy video processing in order to get more responsive results. It's meant more for video games, but it could be useful for watching sports too. They usually also have fancy video-processing modes that work great with sporting events and make everything look clearer and smoother, but make movies and TV shows look like a 1970s soap opera. So you kind of have to figure out which mode suits sports.

There are two delay aspects to the processing: input lag and response time.

The response time is pretty much baked into the TV, I think. Most computer monitors now have a 5ms response time, sometimes less - that's how long it takes to refresh the pixels. The Cinema Display has 12ms or 15ms, something like that, so it can't always keep up if you drag a window quickly across the screen: there will be some ghosting as the pixels paint and unpaint themselves. Anything with a lot of on-screen motion fares poorly with slow response times. Most modern monitors have it down to less than 10ms, and some as low as 5ms, which gamers consider essential.

Input lag comes from all the fancy video processing built into the TV - upscaling, detecting frame rates, dealing with film vs. TV frame rates, color correction and all kinds of other complicated stuff - and all of it introduces latency. That doesn't really matter when you're watching TV: who cares if the picture is half a second behind when it came from the satellite, as long as the audio is delayed to match, which it usually is? But if you're using a mouse with a monitor, you don't want any lag at all. You want the screen to respond to your typing, mousing, and whatever OS X/Windows tells it to do, pretty much immediately. So all that fancy color correction and upscaling genius needs to be turned off when using a display as a computer monitor. That's why a big TV may or may not work well as a computer monitor, depending on what options it has for turning all that off, among other things.


----------



## Nick Batzdorf (Nov 5, 2018)

Believe me, I played with all the options on the Vizio - and with the Samsung, for that matter. Game mode, sports mode, cinema mode, whatever.

The frame rate spec they publish means exactly nothing, at least with TVs in the $300 price range.


----------



## Dewdman42 (Nov 5, 2018)

Nick Batzdorf said:


> You can run 3008 x 1692 or 3360 x 1890 resolution on a 4K monitor, and I'm sure it'll look great. Both are totally workable on my 30" monitor - but those are 16:9 resolutions (like any monitor you get today), while the 30" Cinema is 16:10, so it has black areas at the top and bottom.



Alright, here's some more info about this possibility. First let's talk about Apple Retina displays. The concept behind a Retina display is that Apple decided that at a 10-12 inch viewing distance, a person with 20/20 vision can't detect pixels at 300ppi or better. 20/20 is not actually perfect vision, and a lot of people use their iPads and iPhones from more than 12 inches too, so it generally works out that when you use a Retina display at 300ppi, you won't see any pixels. Any display Apple gives the "Retina" label is supposed to conform to that concept, though some of them go as low as 225ppi - viewed from further away than 10-12 inches, they still qualify.

I have read some scientific articles that explain the retina, and what Apple is saying is basically true: at anything over 300ppi, viewed from a foot away, you will not see pixels.

OK. So the point is, when you upscale your display by dropping the resolution, there are going to be some rounding errors, as we said - but if the monitor has a high enough PPI and the viewing distance is great enough, you can't see the pixels anyway.

Very few displays qualify as Retina displays at 300ppi; the 27" 5K monitors do, but most Retina displays are iPhones, iPads and other things that sit close. A 32" or 43" monitor sits far away. So the question is: at what distance and PPI does a display effectively become a Retina display, by virtue of the eye not being able to see the pixels?

Here's a web page that does the math: https://www.designcompaniesranked.com/resources/is-this-retina/

Let me boil it down for you. A 32" 4K monitor effectively becomes a Retina display when it's at least 25 inches from your eyes; a 43" 4K monitor when it's at least 34 inches away. (Unfortunately, OS X doesn't recognize them as Retina displays for adjusting font sizes and other things the cool Retina way.)
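The math behind those distances is the standard visual-acuity rule of thumb: a 20/20 eye resolves about one arcminute, so a display goes "retina" at the distance where one pixel subtends less than that. A sketch (the 1-arcminute threshold is the conventional figure, not necessarily Apple's exact criterion):

```python
import math

def retina_distance_in(diag_in, px_w, px_h):
    """Viewing distance (inches) beyond which a 20/20 eye can no longer
    resolve individual pixels (one pixel subtends < 1 arcminute)."""
    ppi = math.hypot(px_w, px_h) / diag_in
    return 1 / (ppi * math.tan(math.radians(1 / 60)))

print(round(retina_distance_in(32, 3840, 2160)))  # ~25 inches
print(round(retina_distance_in(43, 3840, 2160)))  # ~34 inches
```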

So what does this mean? It means that if you have a 32" 4K monitor, you can use a slightly lower resolution - say the 3K resolution we were talking about before - and as long as the monitor is about 2 feet away, even a person with perfect vision will not see the jaggies.

This actually does open up, for me, the idea of a 32" 4K monitor rather than an old-school 2K one. I could, for example, run it at 3200 x 1800 from 25 inches away: the fonts would be big enough to read, the monitor would not be too big for head turning, and it would be far enough away that my eye can't see the pixels anyway.

The 43" would need to be 3 feet away to hide the jaggies, but that also basically lines up with the 3200 x 1800 resolution.

And you'd always have the option to crank it up to full 4k if you're reading a lot of fonts that day. Or get a swing arm to bring it in a little closer for reading at full 4k resolution.


----------



## Dewdman42 (Nov 5, 2018)

The ultimate, which will be here eventually, but we aren't there yet, is 8K displays. A 32" 8K display ($5000, plus more powerful video cards) would basically provide Retina capability even as close as 12 inches. That is 7680x4320, which you would probably never run natively from more than 12 inches of viewing distance, but as long as the display is at least a foot away from your face you could run any resolution below that which suits you. A 43" 8K monitor would only need to be 17 inches away to hit visual acuity bliss...

I see it as being quite a while before we have that... we are stuck with our grizzled 4K monitors for the time being... but once we have 8K, that is when our 32 and 43 inch monitors will finally look as gorgeous as Retina.


----------



## Dewdman42 (Nov 5, 2018)

But wait there's more. Check out this system tweak to get OS X Retina preference settings with third party displays: 

https://www.macobserver.com/tips/how-to/4k-monitor-retina-mode/

So what does that mean? It means: put the 32" 4k monitor at 25 inches if you want to not see pixels. Then run it at full 4k, but use the system preferences to just make the fonts a little bigger.

we likes it... 

Or the same would work with the 43" at 34 inches away. I haven't had better than 20/20 vision for years, so I could probably have either monitor a little closer if I want, but see above about head-turning distances.


----------



## Nick Batzdorf (Nov 5, 2018)

I like the option to display larger text, but the Monitors panel doesn't have that option in 10.13.

Anyway, I read what you posted with interest, but there's one overriding detail: the lowly 30" display looks absolutely fine at 2' away, regardless of resolution - and I wouldn't want a monitor in my face only 10" - 12" away!

I was skeptical of the Retina displays from the beginning, because they solved a problem that I don't think exists.

Meanwhile, 3360 x 1890 is actually very usable, even though there's about 1" of black on the top and bottom (because it's 16:9 on a 16:10 monitor). I've been working at it like this for an hour, and I'm likely to use it for sequencing. 32 bars/57 tracks vs. 45 bars/70 tracks at the size I have Logic's Main window set up.


----------



## Dewdman42 (Nov 5, 2018)

Yea, I wasn't suggesting that you or anyone would want a monitor 10-12" away; not sure how you got that. That is just how Apple explains the Retina classification. It's just a number: it says that at 12" the PPI needs to be 300. At further distances it can be a lower PPI value and still qualify as "Retina". That just means you can't see any pixels, which is nice for sure, but not an absolute. And if you have the monitor at a Retina-qualifying distance, which you would be with a 32" 4K monitor at 2 feet, then you don't even have to worry about the pixelation from lowering the resolution. See what I mean?

However...

The main advantage of using OS X in Retina mode is that the overall resolution can be the full 4K, but fonts and windows and stuff like that will display as if they were at 3360x1890, or even bigger if you want. Graphical elements, images, window edges and borders and everything else would be showing at the full 4K resolution, 139 PPI, which from 2 feet away would be crystal clear and very "Retina"-like. Fonts would be bigger because of the Retina adjustment, but still possessing 139 PPI clarity: razor-sharp fonts to read, like on modern iPhones.

Did you try the tweak page I sent you? For 3rd party monitors you have to make some tweaks in order to enable that, and it probably won't work unless you have a 4K monitor too. I do not think it has been left out of 10.13, unless they removed the ability to tweak it for 3rd party monitors, which would suck, but also wouldn't surprise me. It would be very good to know if Apple has removed this possibility for 3rd party monitors.


----------



## Dewdman42 (Nov 5, 2018)

oh crap, never mind..this sucks... you need to have at least a 2013 mac pro, OS X won't let you do it with older mac pros.. 

https://support.apple.com/en-us/HT206587

I can't think of any technical reason why that would be so. That is one of the reasons I get so annoyed with Apple, but I digress..

well that does seriously suck, I was excited about that for a moment. But at least we know that a 32" 4k monitor at 2 feet away will qualify as being "retina" worthy, you won't see any pixels, even if you can't use OS X features to take better advantage of it, at least you could lower the resolution a bit and it should still look ok... I think. I don't really know, now I am back to waiting until I can try it....


----------



## Dewdman42 (Nov 5, 2018)

Here's another trick worth trying with SwitchResX 

If I am understanding this right, it would mean that a 4K monitor could basically be put into 2K Retina mode, which would essentially just be a much sharper/clearer version of a 2K monitor. I do not think this enables the text-enlarging features that are available for newer Macs, which is really unfortunate, as that is what is needed to use 4K resolution from a little further away. That feature is standard in Windows FWIW... there is no reason other than greed for Apple to block 2012 Macs from having that capability.


----------



## Nick Batzdorf (Nov 5, 2018)

It would have to do with the display cards in older Mac Pros. I think that Apple page is out of date now. You can run 4K at 60Hz over HDMI in High Sierra, and I couldn't do that when I had an RX 460 (which I'm really glad I returned).

And yeah, understood about Retina displays.

But Retina mode would be nice.


----------



## Dewdman42 (Nov 5, 2018)

Well, I still can't think of any technical reason why the font-enlarging feature could not be enabled with virtually any video card that can do 4K. They are listing off only very specific situations that involve Thunderbolt and 2013 Mac Pros rather than supporting it generally. Seriously lame by Apple. But I digress..

I think the SwitchResX trick can probably at least make a 32" 4K usable from further away, and be kind of like a pseudo-Retina display in terms of sharpness and readability. I can't confirm that because I don't have a 4K monitor yet. But honestly I wish OS X were able to stay in full 4K resolution and just scale up the fonts for me. The SwitchResX trick is something different: it basically takes up 4 pixels for every one pixel. So it drops a 4K monitor to 2K, but with much sharper, Retina-like image quality.

And I guess 3200x1800 (or something close to that), would probably look fine from 2 feet out...so there is always that. I'm probably leaning towards just doing that.


----------



## Dewdman42 (Nov 5, 2018)

I tried the hack to run my 1920x1200 monitor in Retina mode. It actually worked, and wow, crystal clear. I can't believe how sharp everything looked; it's like a totally new monitor... but... only at 960x600. In other words, 1/4 the resolution. But wow, it looked really, really good! The Retina technology is definitely something.

So basically if you use that trick with a 4K monitor you don't get 2K-resolution Retina, you would be getting a 1920x1080 Retina display out of it. And you don't get the option to change the font sizing and spacing like you do if you have a 2013 Mac Pro; you just get everything 4x the size and gorgeous, with 1/4 the desktop real estate.

I'm sure it would look amazing, but me personally, I'd rather have 3200x1800 non-Retina; 1920 is just not enough. For more money I could investigate getting a 5K monitor, but I haven't been able to find one that works without Thunderbolt. There is only the Dell 8K, for beaucoup bucks, which can be driven by dual DisplayPort connections.

It's a shame that OS X doesn't provide a way to do the font enlarging WITHOUT full 4x Retina mode. But I guess there are not many of us that need that, and they want to sell Retina Macs... so... there is that. I digress...


----------



## Dewdman42 (Nov 6, 2018)

Also, I just want to say, to cap off this monitor discussion, that using the aforementioned hack to put a system into HiDPI mode is a very good alternative to just running your monitor at a slightly reduced resolution. I was not at all aware of this possibility before, but basically doing it the "HiDPI" way will preserve the resolution of line edges and whatnot while making things a little bigger. And it turns out there are ways to do it without having to go down to 1/4 the size (but not with the vid card I have now). It would probably work with your RX560, Nick.

In practical terms what this means is that you could put a 4K monitor into 2560x1440 HiDPI mode, and it would look way sharper than if you just put it into a simple 2560x1440. In fact it would look sharper than native resolution too! It would look absolutely sharpest at 1920x1080, which is 1/4 the pixels of 4K. But it's possible with some hacks to also make 2560x1440 HiDPI available, and that will also look sharper than native 3840x2160 because of the HiDPI factor, which still effectively brings in some of the Retina display concept even if not exactly. It might even be possible to set up some slightly larger HiDPI resolutions than that and still get a Retina-like experience: maybe not quite pure Retina, but still sharper than native resolution.

This is definitely something I will play around with after I get my own RX560 and 32" 4k Dell, waiting for black friday at this point.

My understanding at this point is that OS X does not simply make fonts bigger when you use the fancy Retina display pref pane. All it does is adjust which HiDPI mode you're using, to get various lower resolutions while keeping the hardware at its native resolution. This results in the sharp Retina look, and fonts that are big enough to see, on a 4K or 5K piece of hardware. So in general I would say that is a much preferable way to lower the resolution of a 4K monitor in order to read the fonts: it makes things sharper rather than fuzzier. But with 3rd party hardware you have to hack it; Apple doesn't make it easy. Anyway, something to try..
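
To make the mechanism concrete, here's a rough sketch of what a scaled HiDPI mode does, as I understand it from the articles (this is illustrative pseudocode-style Python, not Apple's actual pipeline): the OS lays the UI out at the "looks like" size, renders it at double that size into a backing canvas, and the GPU then filters that canvas down to the panel's native pixels.

```python
def hidpi_backing(looks_like):
    """For a scaled HiDPI mode: the UI is laid out at `looks_like` size but
    rendered into a backing store at 2x that size on each axis."""
    return (looks_like[0] * 2, looks_like[1] * 2)

native = (3840, 2160)  # a 4K panel
for looks_like in [(1920, 1080), (2560, 1440), (3008, 1692)]:
    backing = hidpi_backing(looks_like)
    note = "matches native exactly" if backing == native else "downsampled to native"
    print(f"looks like {looks_like} -> backing {backing} ({note})")
```

The (1920, 1080) case is the "pure" 1/4-pixel mode discussed above: the backing store lands exactly on the native grid, so no filtering is needed. The other modes trade a little of that purity for more desktop area, which is why they still look sharper than a plain non-HiDPI scaled resolution.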


----------



## Nick Batzdorf (Nov 6, 2018)

I know of no way to do the HiDpi or whatever trick on my machine.


----------



## Dewdman42 (Nov 6, 2018)

Try this first.

https://www.tekrevue.com/tip/hidpi-mode-os-x/

You execute the command lines given and reboot, and then your list of display resolutions will include some that say "HiDPI" on the end. That's it. That's what I did with my old vid card and monitor, and it definitely worked, but because of the age of my stuff I could only get a max of 960x600 out of it. So not really usable for me, but damn, it sure looked crisp and sharp; I was kind of floored by it actually. You ought to be able to do at least that much with your RX560 and Cinema Display.

After that, you can use SwitchResX to gain some better resolutions. I corresponded with the author of SwitchResX last night and he answered my questions; he knows all about how to do it.

If you google around you can find some instructions for what to do in SwitchResX also, but basically you go in there and set up some custom scaled resolutions that are double what you actually want. So for example, let's say your Cinema Display is 2560x1600; the maximum HiDPI resolution that will show up on the list after you follow the instructions in that link above will probably be 1280x800 HiDPI. If you decide you like that, but you'd really like to run at, say, 1920x1200 HiDPI, you can probably set that up in SwitchResX and then you'll get some kind of Retina-like sharpness at that resolution. Might even be able to get a little more. I'm told that this is only possible with some video cards and monitors, so I can't promise anything until you try it, but lots of people are apparently doing that.

This would be proof of concept only; I realize you probably would rather run your Cinema Display at 2560x1600. But with the RX560 together with a new 4K monitor, it would probably be possible to run in Retina mode at 2560x1440 HiDPI. Maybe even a little higher. I think it involves basically telling SwitchResX that you want a resolution of double the HiDPI size you want. So you'd configure it for 5120x2880 or something like that, and reboot; then with the above hack in place, your display resolution list would include 2560x1440 HiDPI...


----------



## Nick Batzdorf (Nov 6, 2018)

Ah. SwitchResX has all the HiDPi resolutions already, available for both monitors.


----------



## Dewdman42 (Nov 6, 2018)

There ya go. I did understand that SwitchResX is able to set that terminal command for you, but I wasn't able to get it to work here for some reason. Does it change your Cinema Display to Retina quality?


----------



## Nick Batzdorf (Nov 6, 2018)

Absolutely. Retina Pro™ quality.


----------



## Dewdman42 (Nov 6, 2018)

What's the highest HiDPI quality you are able to get with the RX560 and Cinema Display?


----------



## Nick Batzdorf (Nov 6, 2018)

4K - 3840 x 2160.

But HiDPI doesn't work right at any of its resolutions. The cursor is off by a few inches.

The non-HiDPI settings work right and look better.


----------



## Dewdman42 (Nov 6, 2018)

I would expect that to be the case. The highest hidpi that is below your native resolution is what?


----------



## Nick Batzdorf (Nov 6, 2018)

Really, 3008 x 1892 is the smallest usable setting - not because of the quality but because it's too small otherwise.


----------



## Dewdman42 (Nov 6, 2018)

OK, but what is the best HiDPI setting you are getting? That would be something smaller than your native resolution?


----------



## Nick Batzdorf (Nov 6, 2018)

2560 x 1440.

It doesn't look better than the native 2560 x 1600 setting.


----------



## Dewdman42 (Nov 6, 2018)

OK, that's interesting, but I would have expected that, since your native setting is 2560x1600, right? How does the next one down in HiDPI look?


----------



## Nick Batzdorf (Nov 6, 2018)

Yes 2560 x 1600, next one down looks the same.

And I wouldn't have any use for it anyway, because the monitor looks outstanding at its native resolution.

Again, there's more to how good a monitor is than its resolution.


----------



## Dewdman42 (Nov 6, 2018)

No, that is still not Retina functionality. The next one down from that one, with HiDPI next to it... is what?

_This will help my upcoming purchase decision a lot, to find out what you're getting._

I would expect to see something around 1900-by-something with HiDPI, and would expect it to look like a Retina display.


----------



## Nick Batzdorf (Nov 6, 2018)

2304 x 1296.

It doesn't look any better than 2560 x.

Maybe if it had a 16:9 HiDPI setting below the native res, but it doesn't.


----------



## Dewdman42 (Nov 6, 2018)

Yes, it needs to have an (HiDPI) setting below the native one. If it doesn't, then there is something screwy about your setup, perhaps after all the experimenting you've been doing to run your monitor at higher resolutions than it natively supports; I have no idea.

There is no point whatsoever in running HiDPI at the same or higher resolution than native. It needs to be lower, let's say at least 25% lower than native, in order to gain the Retina technology advantages of HiDPI.

The whole point of HiDPI is to provide sub-pixel capabilities, which only happens when you have an actual "HiDPI" resolution that is lower than the native resolution. Any HiDPI resolutions you see that are higher than your native resolution are mistakes made by SwitchResX, I reckon; they are not relevant at all, and I certainly would not expect them to look good. In fact they might even look bad. In order to see the Retina look it would need to be lower than native, maybe by 25%, just rough guessing. The default is 50% less.

So are you saying 2304x1296 is or is not HiDPI? If not, then what is the next lower HiDPI resolution to use?

Assuming it is HiDPI: you say it doesn't look any better than native 2560x1600, but if it looks pretty much just as good, without fuzziness, then that is still something to recognize as a good thing. It means that I can get a 32" 4K monitor and run it at, say, 2560x1440 without fuzzies. In other words, compare 2304x1296 to 2304x1296 (HiDPI): which one looks cleaner?

That is the question I'm trying to figure out before I buy a new monitor: whether I would rather just get a 32" monitor at 2560x1440 native, run it there and be done with it, for a low price too, or get a 32" 4K monitor and then run it at a lower resolution in order to make the fonts bigger. I'm wanting to know whether with the RX560 I can get the HiDPI technology to work at around 2560x1440, which would then mean bigger fonts and no fuzzies, and me very happy. If you were able to get yours to work at 1920x1200 (or close to that) in HiDPI mode, then I would feel pretty confident about doing what I want to do with the 4K monitor.


----------



## Dewdman42 (Nov 6, 2018)

I think in SwitchResX that maybe if you create a so-called custom scaled resolution of around 3840x2400, then maybe a 1920x1200 (HiDPI) resolution will show up after reboot.


----------



## samphony (Nov 6, 2018)

Nick Batzdorf said:


> However, you can set any monitor to lower dot pitches, i.e. you'll certainly be happy if you set a 32" 4K monitor to 2560 x 1440 (or maybe x 1600 - that's the 16:10 native ratio on the 30" Cinema Display, vs. most are 16:9).


This!


----------



## Dewdman42 (Nov 6, 2018)

I say that if you are going to do that you should just buy a 2K monitor instead, which will be 60% cheaper and will look BETTER at its native resolution than a 4K turned down to 2K..

UNLESS........

If you can get 2560x1440 (HiDPI) to work with the 4K, then that will look even better than the 2K monitor at native.


----------



## Nick Batzdorf (Nov 7, 2018)

I'm not sure it will be cheaper, Dewdman42. Pretty much everything is 4K these days.

Costco sells a Samsung 28" or 27" 4K *computer* monitor for under $300.


----------



## Dewdman42 (Nov 7, 2018)

If you are talking about cheap Costco screens then I don't know what to tell you, but when I look at a well-reviewed monitor such as a Dell 32" 2K, it's $350-400, and a Dell 32" 4K is more than double that. Plus, in my case I'd have to update my video card in order to use it, as would many, bringing it close to $1000 for a quality 4K setup.

It is what it is. Sure wish you would tell us whether you were able to get 1900-ish working in HiDPI mode.


----------



## Nick Batzdorf (Nov 7, 2018)

They're not cheap Costco screens, they're good Samsung screens sold at Costco!

I am able to get 1920 x 1080 in HiDPI mode (and it's not an improvement). Sorry, I thought I wrote that.


----------



## Dewdman42 (Nov 7, 2018)

Thanks for that! When you say it's not an improvement, does it look as good as native? How does 1920x1080 (HiDPI) compare to normal 1920x1080?

There is definitely something a bit screwy about your setup, as I don't think you should be seeing any HiDPI bigger than native. You have a lot of extra resolutions there, so obviously you've been having some fun with SwitchResX, but that is why you see some HiDPI resolutions larger than native; they are really Frankenstein resolutions that shouldn't be there.

In order to get 1920x1200 (HiDPI), which is what would fit your 16:10 monitor, you need to define a "scaled resolution" in SwitchResX of 3840x2400. Then 1920x1200 (HiDPI) should appear on your list, and it would be interesting to hear a comparison between that one and the normal 1920x1200.
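
The rule of thumb here and in the earlier posts boils down to one line: to make a given (HiDPI) entry appear, define a custom scaled resolution of exactly double it on each axis. A trivial helper to illustrate (the function name is just for illustration, not anything in SwitchResX itself):

```python
def switchresx_scaled_for(target_hidpi):
    """Custom 'scaled resolution' to define in SwitchResX so that
    `target_hidpi` shows up as an (HiDPI) entry after reboot: 2x each axis."""
    w, h = target_hidpi
    return (w * 2, h * 2)

print(switchresx_scaled_for((1920, 1200)))  # 16:10 monitor: define 3840x2400
print(switchresx_scaled_for((2560, 1440)))  # 4K monitor:    define 5120x2880
```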


----------



## Dewdman42 (Nov 7, 2018)

Nick Batzdorf said:


> They're not cheap Costco screens, they're good Samsung screens sold at Costco!



Uhm, OK. It's all relative. Not all monitors are created equal. There are cheap ones, even from Samsung, or you can pay more for a good one. I'm not going to belabor the point much more, but I've been doing a lot of research in the past week or two and clearly there are still QHD monitors available; they tend to be in the $200-300 range, including top-rated models at that resolution. I'm not talking about the Costco special; I would expect that if you could get a Costco 2K monitor it would be $150.

Meanwhile, top-rated 4K monitors are $500-900 in price, plus needing a bigger video card. They are double or more in price when comparing similar-quality monitors against each other at the two different resolutions.

So the gist of this is that if you're going to run your monitor at 2560x1440 anyway, you may be better off buying an actual QHD monitor, saving yourself 60% of the price and running it at its native resolution, rather than degrading the picture by reducing the resolution.

I can think of only a couple reasons to go all the way to a 4k 32" screen, which may or may not be worth the extra money:

- You want a tweener resolution between 2560x1440 and 3840x2160, and you're OK with the fuzziness from bumping the resolution down.
- You can get HiDPI mode to work correctly at 2560x1440 or better, and the picture looks as good as or better than native resolution.
- You play a lot of games or watch a lot of video and want 4K playback for that.


----------



## Nick Batzdorf (Nov 7, 2018)

Dewdman42 said:


> When you say its not an improvement, does it look as good as Native? How does 1920x1080 (hidpi) compare to normal 1920x1080?



It looks as good as native. 



> There is definitely something a bit screwy about your setup as I don't think you should be seeing any hidpi bigger then native. You have a lot of extra resolutions there, so obviously you've been having some fun with SwitchResX, but that is why you see some hidpi resolutions larger then native, which are really frankenstein resolutions that shouldn't be there.



Right, I can only get those resolutions with SwitchResX - and that's what it does: unlock resolutions your graphics hardware is capable of producing but macOS doesn't offer.


----------



## Dewdman42 (Nov 7, 2018)

Nick Batzdorf said:


> It looks as good as native.


Thanks. In my mind that is crucial, and a good reason to use that mode rather than just bumping it down to a non-HiDPI lower res. HiDPI just does a smarter job of bumping down resolutions in general.

The downside of HiDPI is that it causes OS X itself to think the display is much bigger than it really is, which can potentially slow it down. But I dunno; if it's really as clear as native and marginally better than non-HiDPI 1920x1200, then it's the way to go... though... twice the price for the hardware... but I digress. You can always run it at full native resolution if you feel like putting it closer to your face.



> Right, I can only get those resolutions with SwitchResX - and that's what it does: unlock resolutions your graphics hardware is capable of producing by macOS doesn't offer.



My understanding is that SwitchResX also has to call some OS X routines that do a bunch of low level verifying. That's why, for example, I am not able to unlock those resolutions on my system with SwitchResX. You are able to because you have an RX560 which is more capable of doing that.


----------



## Nick Batzdorf (Nov 7, 2018)

Right, and also because I have a 4K monitor attached. I have to "wiggle" a little (not worth explaining) to get all those res to show up for the Cinema Display. It doesn't do that automatically when I start up.

You can unlock the System Integrity Protection and have it store that stuff permanently, but I couldn't be bothered.


----------



## Dewdman42 (Nov 7, 2018)

Nick Batzdorf said:


> Right, and also because I have a 4K monitor attached. I have to "wiggle" a little (not worth explaining) to get all those res to show up for the Cinema Display. It doesn't do that automatically when I start up.



Ah yea that makes sense.

I'd love to know if your 4k monitor can do anything tweener between 2560x1440(hidpi) and 3840x2160...and still look AS GOOD AS native in terms of sharpness. 


Found a great article that explains HiDPI, for anyone interested. Best explanation I have seen anywhere yet.

https://www.anandtech.com/show/6023/the-nextgen-macbook-pro-with-retina-display-review/6



----------



## Nick Batzdorf (Nov 7, 2018)

Okay, I discovered something interesting.

At the HiDPI resolutions, text is a lot less jagged when you zoom in on it. I mean screen zoom - Control + slide up on the mouse - as opposed to Command and +.

So that must mean it's sharper to start with. It isn't noticeable, but it must be.


----------



## Dewdman42 (Nov 7, 2018)

Yes it is. 

First off, with HiDPI/Retina you have an internal canvas behind the scenes that is 4x the size of the resolution you think you are seeing. So OS X is drawing everything on that canvas at a much finer level of detail. And of course when you zoom in, that detail becomes more visible.

If you put your display in one of the lowest HiDPI resolutions you will see the zoomed-in level of detail that OS X is rendering onto that internal canvas. And I suspect that Apple is also doing a lot of smart stuff to make corners and fonts and things look smoother, even at that finer level of detail, and probably continues to make things finer and finer the more you zoom in or the lower the HiDPI resolution you use. HiDPI just gives OS X something much better to work with in terms of smoothing edges, rounding corners and all that kind of stuff. Think "font smoothing", but much, much better because of the sub-pixels that are now available.

If we had an actual Retina display, the pref pane would look different from what we see with 3rd party monitors. But it turns out it's not actually doing anything different; it's just presenting it in user-friendly terms.

In the middle it shows the "Recommended Retina" setting, which is actually 1/2 the resolution of the actual monitor. (1/2 the resolution is actually 1/4 as many pixels.) In any case, that is when HiDPI translates with the cleanest, purest calculations. For us without Retina, we will see a resolution that is exactly 1/2 of our native resolution, and it will have (HiDPI) next to it. That corresponds exactly to that middle box on the new pref pane for Retina hardware users.

If you select boxes to the left, you get lower resolutions, which would actually have MORE fine detail shown, but the resolution would become lower and probably undesirable. Those correspond to the lower HiDPI resolutions, below the 1/2 one, that we see in our pref pane.

If you click on the two boxes to the right, it basically selects higher resolutions that are closer to the native resolution, and those correspond to the HiDPI resolutions we see on our list that are in between the 1/2 one and full native. Except to get those, we have to manually set them up using SwitchResX by creating the double-sized resolution for each one; then OS X figures out that it should give us 1/2 of the double as an additional HiDPI resolution.

The higher ones have really big internal canvases. Could be 6000-by-something or other, etc., and are probably impacting performance, but who knows. And while you wouldn't see that much difference in detail while looking at the screen compared to normal native, as you said, when you zoom in, the extra detail of that internal canvas would show you the nice edges...


----------



## Nick Batzdorf (Nov 8, 2018)

Well, it's largely academic for me, because my 16:10 monitor at 2560 x 1600 isn't a HiDPI and I don't have a Retina display.


----------



## Dewdman42 (Nov 8, 2018)

If you are using 2560x1600 HiDPI mode, then it is using a larger internal canvas, so it's not entirely academic when it comes to zooming in. But if you don't zoom, it's a moot point.


----------



## Nick Batzdorf (Nov 8, 2018)

I mean there is no 2560 x 1600 HiDPI mode available. They're all 16:9, not 16:10, and it's not worth giving up about 2" of vertical screen for that.

Also, some of the HiDPI resolutions close to the native one result in the cursor showing about an inch from where it really is. So to get back to regular res, I have to put the cursor about an inch above the SwitchResX icon.

The 4K resolutions don't do that, and not all of them do it, but it's not a happening thing.


----------



## Dewdman42 (Nov 8, 2018)

I'm pretty sure that if I were in your shoes I would just use 2560x1600 native also. 

Just to be clear: if you wanted to see a HiDPI version of it, which I don't think would have any advantage, you'd create a scaled resolution in SwitchResX of 5120x3200. Then you should see 2560x1600 (HiDPI) magically appear on your list also. It would only make a difference for zooming in, though, per your observation, and it would tax your computer more in order to manage the 5120x3200 background canvas. So not really worth it IMHO... and yeah, somewhat academic for you.

Thanks for entertaining this discussion and telling us your results, because I think this discussion is beneficial to anyone running a 4K monitor or considering it. Using 2560x1440 (HiDPI) makes a lot of sense with a 4K monitor for most people, since the native resolution of 4K is usually too tiny to read, and HiDPI will provide better scaling down to 2560x1440, particularly in the 32-inch size.

A 43" monitor can perhaps make more sense to run native, unless you move it 3 feet away, in which case you might need to do the same there. But it's good to know this is possible.


----------



## Nick Batzdorf (Nov 8, 2018)

Okay, Ima try 5120.


----------



## Nick Batzdorf (Nov 21, 2018)

Dewdman42 said:


> Thanks for entertaining this discussion and telling us your results because I think this discussion is beneficial to anyone running a 4k monitor or considering it. Using 2560x1440(hiDPI) makes a lot of sense with a 4k monitor for most people where the native resolution of 4k is too tiny to read usually, and hiDPI will provide better scaling down to 2560x1440, particularly in the 32inch size.



Update after installing Mojave and the firmware update it requires:

- The RX 560 now displays a progress bar during startup, but I don't think it's early enough to choose the startup disk by holding down Option. You'd have to use the control panel to select a different drive - not that I care, because you always could start up from the recovery partition.

- The 30" Cinema now has several HiDPI resolutions available with SwitchResX, no poking around required for them to appear. They're all 16:9 vs. its native 16:10 ratio, and doubling 2560 x 1600 doesn't give you an HiDPI option.

- 3360 x 1892 at HiDPI is a useful resolution for the 30" display. It has 3/4" black bars at the top and bottom (because it's 16:9), but it's nice being able to use it.

- I still run my Samsung at 2048 x 1080 HiDPI (rather than 1920, the standard 1080p res). Why not.


----------



## Dewdman42 (Nov 21, 2018)

Hmm that is disappointing to hear that Mojave has tightened down on the hidpi resolutions it can do.


----------



## Nick Batzdorf (Nov 22, 2018)

Are HiDPI resolutions ultimately dependent on the OS? 'cause SwitchResX lets you put in custom resolutions that don't show up as HiDPI.

Also, I'm not disappointed. I just have a 16:10 monitor in a 16:9 world. And I haven't seen a monitor that looks as good as this one, regardless of resolution.

Also also, re: the startup progress bar, what I meant is that being able to start up from the recovery partition is the only reason I care; I can deal with using the control panel to select a different startup disk once every 14 months when I need to do that.


----------



## Dewdman42 (Nov 22, 2018)

The OS is a huge part of the equation, as well as compliance by application software. On Windows, when you go into settings for the display, you choose a resolution and then you can choose a percentage for how much to bump up the size of fonts, windows, etc. That effectively puts Windows in HiDPI mode. On Windows you can say 125%, 150%, etc. for how much HiDPI magnification to use, and it's all happening in the OS.

On OS X they do something similar but present it differently. The default Retina mode is 200% magnification. If you have a Retina display, they hide all the details of what resolution the monitor actually is, but they are using 200% magnification to create the higher fidelity, handled in the OS. Retina is just a branded name for HiDPI. There is no way on OS X to specify 150% or 125%, for example; I assume it's always 200% of whatever resolution your monitor is, or whatever OS X thinks your monitor's resolution is. If with the RX 560 and SwitchResX you can fool OS X into thinking your monitor is 5K, then the Retina technology in OS X will magnify it at 200% down to 2560x1440 HiDPI. If it's not working, then either you don't have SwitchResX configured right, or Mojave is getting smarter about knowing what resolution your monitor actually is and only allowing 200% of that, which would be disappointing. The only way to get the equivalent of Windows' 125% and 150% modes is to fool OS X into thinking your display is bigger than it is and then let the OS apply 200% of that.
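The 200% relationship works out like this; a quick Python sketch of the arithmetic (the figures are the ones from this thread, nothing Apple-specific):

```python
# Sketch of the 2x HiDPI ("Retina") relationship described above:
# the OS renders into a backing store at twice the logical resolution
# in each dimension, then scales the result to the panel.

def backing_resolution(logical_w, logical_h, scale=2):
    """Return the framebuffer size the OS renders at for a HiDPI mode."""
    return (logical_w * scale, logical_h * scale)

# A 2560x1440 HiDPI mode is backed by a 5120x2880 render target,
# which is why SwitchResX needs a 5120x2880 "scaled" resolution defined.
print(backing_resolution(2560, 1440))  # -> (5120, 2880)
```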

Two things for you: when you create the larger resolution sizes, you're making them as "scaled" resolutions, right? I assume so. The other thing is that when you upgraded to Mojave, you might have turned off the mode that enables the open-ended HiDPI stuff with non-Apple-branded Retina monitors. Check out the earlier posts with links to articles on how to enable that via the command line; maybe you can get it turned back on.


----------



## Nick Batzdorf (Nov 23, 2018)

Dewdman42 said:


> 2560x1440 hiDPI



It has a 2560 x 1440 HiDPI setting.

I have a 2560 x 1600 monitor.


----------



## Dewdman42 (Nov 23, 2018)

I thought you said earlier that you didn't. Sounds like you're still good to go, then.


----------



## Nick Batzdorf (Nov 23, 2018)

Good to go at the regular standard DPI 2560 x 1600 resolution.

2560 x 1440 HiDPI requires giving up about 1-3/4" of vertical screen real estate, and the only benefit is when you zoom in. There's effectively no visible difference when you're not zoomed.

But again, 3360 x 1890 HiDPI is a usable resolution for sequencing, and I was unable to do that without the RX 560. I haven't tried working at it for extended periods, but it's the difference between seeing 32 bars/57 tracks and 44 bars/70 tracks (the way I have Logic set up). For that it might be worth living with the black bands at the top and bottom of the screen.

And it looks very good.


----------



## Dewdman42 (Nov 23, 2018)

If you’re getting 3360x1890 HiDPI, then your system is handling HiDPI fine, I guess. It must mean you have a scaled resolution of 6720x3780 defined in SwitchResX.

If that works, then you ought to be able to define other resolutions in SwitchResX that eliminate the black bands. But anyway, you have a two-monitor setup, so who knows what OS X is doing.


----------



## Nick Batzdorf (Nov 23, 2018)

Dewdman42 said:


> then your system is handling hidpi fine I guess



It is indeed, but there are also some silly resolutions as high as 7680 x 4320 - clearly intended to be halved for HiDPI - that are not scaled. So I don't think you can get SwitchResX to make HiDPI out of any old res.

What would be bitchin' is a way to fool the system into thinking it has a Retina display attached in order to scale the fonts as you described.


----------



## Dewdman42 (Nov 23, 2018)

So there is nothing special about Retina displays; they are just high-resolution displays, and Apple keeps a list of which monitor model numbers meet their criteria for being a Retina display. When it detects a compatible monitor, it enables the different control panel GUI.

SwitchResX can definitely create scaled resolutions of just about any size you want; I thought you were already doing that. It depends on what your video card supports.

In fact, when you see that 2560x1440 HiDPI, it's because somewhere in SwitchResX you have 5120x2880 defined as a "scaled" resolution. There is an actual resolution type in SwitchResX called "scaled", and when you create those you are halfway to having the HiDPI setup you want.

The other half is making sure OS X is configured to give you all those 200% HiDPI resolutions based on the super-big scaled resolutions defined in SwitchResX. There is a command-line command to enable that for you, or allegedly SwitchResX will do it for you. Once you've done those two things, you should in theory be able to set up HiDPI resolutions of any size and shape you want, but I can't say more until I get my own RX 580 to try it all out myself.

I think you should be able to get the HiDPI resolutions you want without black bars, based on everything you've said. But it's also possible that Mojave is interfering.


----------



## Dewdman42 (Nov 23, 2018)

Another thing I noticed when I was at an Apple dealer trying out their 5K Retina display: in the control panel, if you hold Option while choosing "scaled resolutions", it actually shows exactly the same stuff the rest of us see without a branded Retina monitor - a list of resolutions, some of them HiDPI.

There is nothing special about the Retina-branded monitor other than a simplified control panel GUI that is easier for most people to understand.

But we don’t really know for sure what Mojave may or may not enable with branded Retina monitors. With the Option-key trick mentioned above, I was able to take a Retina 5K entirely out of Retina mode.


----------



## Nick Batzdorf (Nov 23, 2018)

Well, before trying to get it to create 5120 x 3200 scaled I need to know what else to enter in those boxes.

This seems like an easy way to create Great Whopping Clusterfongula.


----------



## Dewdman42 (Nov 23, 2018)

There is nothing else to enter. It sounds like you are not creating your custom resolutions correctly for this particular thing. You have to make sure they are "Scaled" resolutions.

So open SwitchResX, and select your monitor on the left pane, you will see on the right, something like this:






Then look at the tabs for that panel on the right and select the one called "Custom Resolutions", which will then look something like this, except yours may have a lot more already added since you've been fiddling around:






Click the + button at the bottom to add a new resolution. Now here's the part I think you've been missing: near the top you will see a control that says "custom resolution". Change that to "scaled Resolution". Then the editor will look like this:






Just enter the new size you want and save it.

That's it.


----------



## Dewdman42 (Nov 23, 2018)

Or follow this link (and others) to see how to use SwitchResX for custom HiDPI modes:


----------



## Nick Batzdorf (Nov 24, 2018)

Right, right. The problem is that it doesn't show up as HiDPI. 5120 x 3200 is in the list of resolutions, but not as HiDPI.

Actually, this screen dump was from before I enabled it. The resolution works fine, but of course you need binoculars to use it - it's not scaled down to 2560 x 1600.

Again, I suspect the 16:10 ratio of being the issue.


----------



## Dewdman42 (Nov 25, 2018)

Nick, when you add 5120x3200, you should then see 2560x1600 HiDPI.

However, the fact that it says "inactive" on the far right means you didn't add it successfully. Perhaps you already have that resolution added as non-scaled or something. Either that, or the RX 560 doesn't support that size. I don't think it's related to the ratio, other than that you need to add the desired resolutions correctly.


----------



## Nick Batzdorf (Nov 25, 2018)

There's no way to install it successfully. I've entered the command line thing to unlock the system, etc.

That resolution isn't entered anywhere else - it just isn't happening unless there's some convoluted way to do that not in the documentation.


----------



## Dewdman42 (Nov 25, 2018)

It's possible the RX 560 does not support that scaled resolution, then, or perhaps there is a step you're missing in SwitchResX. Otherwise I'd suggest emailing the author of SwitchResX; he is very responsive. I think we've beaten this to death. Good luck.


----------



## Dewdman42 (Dec 4, 2018)

I finally got my RX580 and just wanted to report back about HiDPI with it.

I also had problems getting SwitchResX to recognize larger scaled resolutions; they simply wouldn't stick. Not sure why. I gave up on it. I found a better solution that worked like a charm, and I can create literally any HiDPI resolution I want, both larger and smaller than native resolution (though the larger ones don't look good anyway).

I used the following two tools:

https://github.com/syscl/Enable-HiDPI-OSX

http://avi.alkalay.net/software/RDM/

The first is a command-line script that prompts you for the HiDPI resolutions you want, which you add one at a time. If you run it once without adding any, it clears them all out.

The second is a free tool called RDM which is basically a simple and free alternative to SwitchResX.

I was able to get 4K resolution displayed on my 1920x1200 monitor. Can't read the fonts, but there it is... it works. I was also able to get 1920x1200 HiDPI on this monitor. I think the native 1920x1200 looks just a tad cleaner than the HiDPI version, but not by much. If you get some zoom-in advantage with the HiDPI version, then there you go. I was also able to add some other HiDPI modes below native, at 90%, 80%, 75%, etc., and they all look better than the normal scaled-down resolutions... so that's a win, and I look forward to using this with my soon-to-arrive 4K monitor.
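For reference, those sub-native modes (90%, 80%, 75% of a 1920x1200 panel) work out roughly like this; a quick sketch, where the rounding to whole pixels is my assumption rather than anything macOS documents:

```python
# Rough sketch of the sub-native HiDPI modes mentioned above for a
# 1920x1200 panel. Each mode has a logical size (what you "see") and a
# 2x backing size the OS renders at before scaling down to the panel.
# Rounding to whole pixels is an assumption, not a documented macOS rule.

def hidpi_mode(native_w, native_h, percent):
    logical = (round(native_w * percent / 100), round(native_h * percent / 100))
    backing = (logical[0] * 2, logical[1] * 2)
    return logical, backing

for pct in (100, 90, 80, 75):
    logical, backing = hidpi_mode(1920, 1200, pct)
    print(f"{pct}%: logical {logical[0]}x{logical[1]}, "
          f"backing {backing[0]}x{backing[1]}")
```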

Here are screenshots of 5k, 4k, 1920x1200hidpi, 1920x1200native


----------



## Dewdman42 (Dec 6, 2018)

My 32" 4k came in (LG 32MU99). All I can say is that 32" 4k at 3008x1692(hiDPI) is awesome. Clear as a bell and very retina-like, lots of real estate, not too small to see. I can definitely say that 32" 4k is the way to go, with HiDPI settings to zoom in just a little bit closer then native 4k, which at native resolutions is just a little too small to read at normal viewing distances. 

3200x1800 (HiDPI) is also quite good if you sit a little closer. I have to wear progressive-lens glasses, and at that resolution it's too hard to move my head around and focus in on stuff; it's a tad too small.

3360x1890 (HiDPI) also works, and that is definitely too small for me to read from 2-3 feet, but with apps that use larger fonts or hardly any fonts, it's quite clear and usable with more real estate, as is the native resolution of 3840x2160. I have compared native resolution against 3840x2160 (HiDPI), and the HiDPI version definitely works, but it's ever so slightly less clear. I'd say only use it if you are getting some of the zoom advantage that Nick mentioned, though I'm not sure how he was experiencing that. In any case, I find native resolution way too small for normal use; the fonts are just tiny for reading email, etc. But if you want to look at a lot of tracks in LPX, it certainly is clear and sharp with lots of real estate, and it's not hard to swap to different resolutions as needed.

I find that for day-to-day normal use, 3008x1692 (HiDPI) is just about right from 2-3 feet away, possibly 3200x1800 (HiDPI).

Interestingly, 3008x1692 works out to 108 ppi font sizing (after the HiDPI effect). 110 ppi is generally regarded as the magic size for normal, typical desktop use. Anything less than 110 will be just a little larger and easier to read from further away; anything more will be tinier and hard to read without putting your face close to the monitor. I would agree with that general observation about 110, and what I can say is that a 32" 4K monitor running 3008x1692 (HiDPI) comes in at around 108 and is just about right for typical desktop use. Looks great too. Very pleased.
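That 108 figure checks out with basic geometry; a quick sketch of the math (logical pixels along the diagonal, divided by the diagonal in inches):

```python
import math

def effective_ppi(width_px, height_px, diagonal_in):
    """Pixels (or logical points) per inch for a given panel diagonal."""
    return math.hypot(width_px, height_px) / diagonal_in

# Effective density of a 3008x1692 logical desktop on a 32" panel:
print(round(effective_ppi(3008, 1692, 32)))  # -> 108
```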


----------

