
Nunyabinez

macrumors 68000
Original poster
Apr 27, 2010
1,758
2,230
Provo, UT
Hey all, I hope this is the right place to post this question. First, let me give some context.

I have an iMac with Retina that I absolutely love. I have been gaming via Boot Camp, but I find that it is just too much hassle to boot back and forth. So, I'm dusting off my gaming PC, which is a pretty beefy machine.

However, I am unsure about what to do for a monitor. I love the 5K display on my iMac, but when you run Boot Camp, 4K is the maximum resolution that Windows can do. And the iMac can't really run games at 4K anyway, so I play at 1440.

I have assumed that I am still taking advantage of the iMac's pixels, since 1440 is exactly ¼ of 5K. But I am not certain that this is right. The reason I am asking is that I was assuming that if I get a 4K TV and run at 1080, I will get the same effect, namely four panel pixels for each one the video card is sending.
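To make that arithmetic concrete, here is a rough sketch in Python of what I mean by "four pixels for each one" (these are just the standard resolution figures, nothing measured on my machines):

```python
# Rough sketch: a lower resolution maps cleanly onto a panel when the panel
# has exactly twice the pixels in each dimension (4x the total), so every
# rendered pixel covers an exact 2x2 block of physical pixels.
panel_5k  = (5120, 2880)   # Retina iMac panel
panel_4k  = (3840, 2160)   # UHD panel
game_1440 = (2560, 1440)
game_1080 = (1920, 1080)

def scale_factors(game, panel):
    """Horizontal and vertical scale factors from game resolution to panel resolution."""
    return panel[0] / game[0], panel[1] / game[1]

print(scale_factors(game_1440, panel_5k))   # (2.0, 2.0) -> clean 4:1 pixel mapping
print(scale_factors(game_1080, panel_4k))   # (2.0, 2.0) -> same clean 4:1 mapping
print(scale_factors(game_1080, panel_5k))   # (2.666..., 2.666...) -> fractional, the scaler has to interpolate
```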

Is there anyone out there who can shed some light on this? Was I deluding myself that games at 1440 on my iMac were taking advantage of the pixels? And am I going to get a better-looking game running at 1080 on a 4K set than I would on a good 1080 monitor?

If you're just guessing, please don't post your guesses here; I can guess myself. But someone here likely has more technical knowledge about the pixel/resolution relationship that would help me decide whether or not to buy a UHD set.

Thanks in advance for your help.
 

xSinghx

Suspended
Oct 2, 2012
308
87
The short answer is: if you're playing at a desk, buy a monitor; if you want to move to a couch, buy a TV.

A 20-30 inch TV for a desk will likely be 720p and have a slow refresh rate. If you're gaming at a desk, a 1 ms monitor with full HD resolution is what you want to avoid screen tearing and have a smooth, visually rich experience.

As for your question:
A 1080 image has the same pixels whether it's shown on a 4K set or an HDTV. The difference between the HDTV and the 4K set will be the other features of the TV (brightness, contrast, refresh rate, etc.) that make the picture either better or worse.

As Cnet puts it, "...[4k] still costs hundreds more than non-4K LCD TVs, and the difference in extra detail between 4K and 1080p is still basically invisible at normal screen sizes and viewing distances. But unlike Samsung's flat 1080p TVs, this one incorporates the company's best picture-enhancing features, namely local dimming, leading to a better image regardless of resolution." http://www.cnet.com/products/samsung-un55hu8550/

4K is a marketing gimmick (much like 3D blu-ray was a few years ago) to get you to want to buy a new TV (perceived obsolescence). I would question anyone that thinks it's worthwhile.
 

Nunyabinez

macrumors 68000
Original poster
Apr 27, 2010
1,758
2,230
Provo, UT
*Sigh* Unfortunately, it looks like no one here actually understands the actual technology of high resolution displays any better than I do.
 

xSinghx

Suspended
Oct 2, 2012
308
87
*Sigh* Unfortunately, it looks like no one here actually understands the actual technology of high resolution displays any better than I do.

Well that's one way to show a lack of appreciation for an answer to your question.

And certainly a warning to others not to bother engaging with you.
 

jeanlain

macrumors 68020
Mar 14, 2009
2,440
936
Some say that using the retina iMac at exactly 1/4 of the resolution isn't as sharp as using a non-retina iMac, just like non-retina-aware apps look quite blurry on a retina display, much blurrier than on a standard display (I haven't tested this myself). So it's possible that Apple does some trick to "force" people to use the full resolution.
But in theory, yes, it's better to use integer scaling. I say in theory, because in practice, I'm not sure that using integer scaling really makes a visible difference. For instance, if I enable overscan on my plasma TV, I no longer have exact pixel mapping between the video source and the screen, but I don't see the image getting blurrier in any way.
So in short, this shouldn't be a factor.
 

Nunyabinez

macrumors 68000
Original poster
Apr 27, 2010
1,758
2,230
Provo, UT
Well that's one way to show a lack of appreciation for an answer to your question.

And certainly a warning to others not to bother engaging with you.

You didn't answer my question which means that you didn't understand my question. So, I'm sorry you didn't feel appreciated for answering a question that I didn't ask. But if it makes you feel better, thanks for trying.
 

Renzatic

Suspended
You pretty much have the theory down already. If you want to get the best image quality when downscaling to a lower resolution on a fixed-resolution monitor, you want to use resolutions that scale from quarter to half to full. That way, it maintains perfectly square "virtual" pixels, 4 to 1 or 8 to 1, preventing what I think of as "the crawlies". It's kinda like nearest-neighbor scaling pushed into an odd aspect ratio, where the screen tries to interpolate a square pixel into what it sees as a rectangular space, cutting off bits and pieces here and there to fit it in.

That said, some monitors do downscaling better than others. I've seen a few that can do an odd resolution, yet still look good, and others that look bad even when scaling down by quarters and halves. Like anything, you'll get the best results by spending the extra cash, and going with the best built and best designed.
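To illustrate what I mean (just a toy sketch in Python/numpy; a real display scaler is more sophisticated than plain nearest neighbour), an integer scale factor lets every source pixel become an exact block of panel pixels, while a fractional factor forces some source pixels to cover more panel pixels than others:

```python
import numpy as np

# Toy 4x4 frame: a one-pixel checkerboard (the worst case for scaling).
frame = np.indices((4, 4)).sum(axis=0) % 2

# Integer 2x upscale: repeat every pixel into an exact 2x2 block.
# Each source pixel stays a perfectly square "virtual" pixel.
integer_up = np.repeat(np.repeat(frame, 2, axis=0), 2, axis=1)

# Fractional 1.5x upscale via nearest neighbour: some source pixels end up
# 1 panel pixel wide and others 2, so the regular pattern comes out uneven.
# That unevenness is what reads as shimmer/"crawlies" on fine detail.
idx = (np.arange(6) / 1.5).astype(int)
fractional_up = frame[np.ix_(idx, idx)]

print(integer_up)     # checkerboard preserved, just bigger
print(fractional_up)  # checkerboard distorted into irregular blocks
```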

----------

Curious why you're going for a 4K TV instead of a 4K monitor

My guess would be because they're both about the same thing, other than the fact TVs tend to be a lot larger.

This isn't like the old days when TVs were stuck at 640x480 max, while a monitor could display all the way up to 1600x1200. Now, the only difference is a monitor might have some USB ports.
 

Nunyabinez

macrumors 68000
Original poster
Apr 27, 2010
1,758
2,230
Provo, UT
Some say that using the retina iMac at exactly 1/4 of the resolution isn't as sharp as using a non-retina iMac, just like non-retina-aware apps look quite blurry on a retina display, much blurrier than on a standard display (I haven't tested this myself). So it's possible that Apple does some trick to "force" people to use the full resolution.
But in theory, yes, it's better to use integer scaling. I say in theory, because in practice, I'm not sure that using integer scaling really makes a visible difference. For instance, if I enable overscan on my plasma TV, I no longer have exact pixel mapping between the video source and the screen, but I don't see the image getting blurrier in any way.
So in short, this shouldn't be a factor.

The retina iMac is somewhat of a mystery since it is a one-of-a-kind machine. When I use it at 4K in Boot Camp, it still looks fantastic. That is somewhat surprising, since it means the iMac is scaling to a fractional resolution. I guess it is actually the ATI card that is efficient at this scaling (though it could be the custom timing chip), but I would assume that 1440 would look better, because it would assign exactly four pixels for each one, which would require no difficult math and should be pretty good. However, both resolutions looked fantastic.

Some other research I did suggests that while displaying 1080 on a 4K display should look exactly the same as a native 1080 display, in practice it does not. Some claim that using four pixels rather than one results in the pixels being slightly more defined. Again, this seems counterintuitive, but it's similar to the example you gave, where logically it should be one way but it turns out not to be in reality.

One huge problem with TVs is going to be how scaling and upscaling are handled. One would assume that to get to 1080 on a 4K TV, the set would just assign four pixels to each one, but apparently not all TVs do it correctly.
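One way I could check a specific set (a rough sketch, assuming a machine with Python, numpy, and Pillow available) would be to feed it a 1080p single-pixel checkerboard and look closely at the panel: with a clean 4:1 mapping the pattern stays crisp, while an interpolating scaler smears it toward grey.

```python
# Hypothetical test-pattern generator (assumes numpy and Pillow are installed).
# Display the resulting 1080p image full-screen on the 4K set and inspect the
# panel up close: a clean 2x2-per-pixel mapping keeps the checkerboard sharp,
# while an interpolating scaler blurs it into a grey wash.
import numpy as np
from PIL import Image

width, height = 1920, 1080
checker = (np.indices((height, width)).sum(axis=0) % 2) * 255
Image.fromarray(checker.astype(np.uint8), mode="L").save("checkerboard_1080p.png")
```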

It seems like the only way to know for sure would be to get a specific UHD set and put it next to my iMac and run both at the same resolution. Or I could spend a ridiculous amount of money to SLI two monster cards together to run at 4K.

All in all, given the amount of uncertainty about using 4K at lower resolutions, it may be a little too early to jump on the bandwagon. I'll either keep using my 32" 1080p monitor or get a nice 1440 monitor until UHD is a little more mature.

----------

Curious why you're going for a 4K TV instead of a 4K monitor

You pretty much have the theory down already. If you want to get the best image quality when downscaling to a lower resolution on a fixed-resolution monitor, you want to use resolutions that scale from quarter to half to full. That way, it maintains perfectly square "virtual" pixels, 4 to 1 or 8 to 1, preventing what I think of as "the crawlies". It's kinda like nearest-neighbor scaling pushed into an odd aspect ratio, where the screen tries to interpolate a square pixel into what it sees as a rectangular space, cutting off bits and pieces here and there to fit it in.

That said, some monitors do downscaling better than others. I've seen a few that can do an odd resolution, yet still look good, and others that look bad even when scaling down by quarters and halves. Like anything, you'll get the best results by spending the extra cash, and going with the best built and best designed.

----------



My guess would be because they're both about the same thing, other than the fact TVs tend to be a lot larger.

This isn't like the old days when TVs were stuck at 640x480 max, while a monitor could display all the way up to 1600x1200. Now, the only difference is a monitor might have some USB ports.

The main issue for me is price. TVs continue to be cheaper than comparable monitors, although obviously you are making some sacrifices, since a device dedicated to one function is typically going to do it better than one that serves multiple functions.
 

Renzatic

Suspended
The main issue for me is price. TVs continue to be cheaper than comparable monitors, although obviously you are making some sacrifices, since a device dedicated to one function is typically going to do it better than one that serves multiple functions.

To get elementary for a second, you really need to consider the size of the screen, and the distance you'll be sitting away from it to get the most out of it.

Think of it in Retina terms. I've got a 27" 1080p monitor, and in day-to-day use it looks pretty good. I sit about 3-4 feet away from it, give or take. Now if I were to get a 46" 1080p TV and set it the same distance away from me, all my text and icons would look considerably blurrier. That's because, quite simply, I'm sitting close to a larger display that uses larger pixels. By opting for the bigger TV, I'm losing the pixel density I once had.
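As a back-of-the-envelope sketch of that trade-off (illustrative numbers only, and real perception depends on more than geometry), you can compare pixel density and how many pixels land in each degree of your view at a given distance:

```python
import math

def ppi(diagonal_in, width_px, height_px):
    """Pixels per inch for a given diagonal size and resolution."""
    return math.hypot(width_px, height_px) / diagonal_in

def pixels_per_degree(ppi_value, distance_in):
    """Approximate pixels covered by one degree of view at a given distance."""
    return ppi_value * distance_in * math.tan(math.radians(1))

# Illustrative comparison at the same ~3.5 ft (42 in) viewing distance.
for name, diag, (w, h) in [("27in 1080p", 27, (1920, 1080)),
                           ("46in 1080p", 46, (1920, 1080)),
                           ("40in 4K",    40, (3840, 2160))]:
    density = ppi(diag, w, h)
    print(f"{name}: {density:.0f} PPI, ~{pixels_per_degree(density, 42):.0f} px/degree at 42 in")
```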

The same thing applies to 4k TVs. I don't think you can currently get a 4k TV under 40". Unless you're planning on sitting 8-10 feet away from it, it might be better just to keep the monitor you have, or spend the extra cash, and get a 4k monitor in the 24"-32" range.

Edit: let me take this back a bit. Now that I think about it, a 40" 4K TV would be on the cusp, and it probably would make for a decent monitor.

Even if your primary concern is gaming (which I'm assuming it is, since you're posting here), the same theory applies in reverse. You're going to be spending a goodly chunk of cash on a high-resolution display that you're not going to take advantage of. You might as well get a 1080p TV, especially if you plan on using it in your living room, where the viewing distance somewhat moots the differences between 1080p and 4K.
 
Last edited:

fuchsdh

macrumors 68020
Jun 19, 2014
2,020
1,819
Speaking from experience gaming on a Dell P2415Q: while you'd think that gaming at 1/4 of the retina resolution (what the "normal" resolution would be for that size on a non-retina display) would result in an essentially indistinguishable experience from a 1080p/1440p monitor, that's not the case. There's a small amount of blurring going on, although you're really only going to notice it in a select number of scenarios (if you did an A/B on Dota 2, for instance, you'd be able to tell the menus and text are sharper at native 1080p versus 1080p on my 4K monitor).

My suggestion is just to continue gaming on your very nice screen that you have. The pixels aren't "wasted", because other than gaming you're still getting an incredibly nice retina OS. It's certainly the cheapest option available to you.
 

Nunyabinez

macrumors 68000
Original poster
Apr 27, 2010
1,758
2,230
Provo, UT
Speaking from experience gaming on a Dell P2415Q: while you'd think that gaming at 1/4 of the retina resolution (what the "normal" resolution would be for that size on a non-retina display) would result in an essentially indistinguishable experience from a 1080p/1440p monitor, that's not the case. There's a small amount of blurring going on, although you're really only going to notice it in a select number of scenarios (if you did an A/B on Dota 2, for instance, you'd be able to tell the menus and text are sharper at native 1080p versus 1080p on my 4K monitor).

My suggestion is just to continue gaming on your very nice screen that you have. The pixels aren't "wasted", because other than gaming you're still getting an incredibly nice retina OS. It's certainly the cheapest option available to you.

Thanks for this. Anecdotal evidence from someone who has the equipment is really useful. Given that I already have the iMac with Retina if I want to watch 4K video, I'm thinking that a solid 1440 monitor is going to be the sweet spot, given the video card I have and the fact that this will be used almost exclusively for gaming.

----------

Even if your primary concern is gaming (which I'm assuming it is, since you're posting here), the same theory applies in reverse. You're going to be spending a goodly chunk of cash on a high-resolution display that you're not going to take advantage of. You might as well get a 1080p TV, especially if you plan on using it in your living room, where the viewing distance somewhat moots the differences between 1080p and 4K.

Good insights, thanks.
 

Liquorpuki

macrumors 68020
Jun 18, 2009
2,286
8
City of Angels
My guess would be because they're both about the same thing, other than the fact TVs tend to be a lot larger.

This isn't like the old days when TVs were stuck at 640x480 max, while a monitor could display all the way up to 1600x1200. Now, the only difference is a monitor might have some USB ports.

The main issue for me is price. TVs continue to be cheaper than comparable monitors, although obviously you are making some sacrifices, since a device dedicated to one function is typically going to do it better than one that serves multiple functions.

I'm planning to make the 4K jump for gaming in a year, and the main reason I'd go for a monitor instead of a TV is tech like G-Sync or FreeSync, where the refresh rate is controlled by the GPU. You end up eliminating stuff like tearing.

But yeah, right now everything is too expensive. A G-Sync monitor is $800. A graphics card setup is another $1500-$2000.

A $1000 Titan X won't even do 60 FPS at 4K. Unless you wanna deal with SLI issues, the tech still has some ways to go
 

Renzatic

Suspended
A $1000 Titan X won't even do 60 FPS at 4K. Unless you wanna deal with SLI issues, the tech still has some ways to go

If you want the absolute best of the best performance without any worries whatsoever, you will need those dual Titan X's, no doubt. But most higher-end GPUs are able to push 4K at between 25-45 FPS on high these days. I'd almost be willing to bet that two SLI'd 970s would net you more than acceptable performance, provided you do a bit of tweaking to maintain your framerates above 60 FPS.
 

xSinghx

Suspended
Oct 2, 2012
308
87
You didn't answer my question which means that you didn't understand my question.

There are a few other possibilities; namely, one could view your question as muddled or inarticulate. Considering you didn't even specify how you intend to play said games (at a desk or on a couch, at what distance, etc.), and still haven't, it's a pretty pointless discussion to begin with.

*Sigh* Unfortunately, it looks like no one here actually understands the actual technology of high resolution displays any better than I do.

So, I'm sorry you didn't feel appreciated for answering a question that I didn't ask...

Maybe you could just stop layering your responses with attitude and no offense would be taken - then you wouldn't need to make disingenuous apologies.
 

antonis

macrumors 68020
Jun 10, 2011
2,085
1,009
I say in theory, because in practice, I'm not sure that using integer scaling really makes a visible difference.

It seems that in some cases it does. During my iMac days (27" 2010 model), since the machine's performance was degrading, I had to run below the native resolution more and more often. I always went for exactly half the resolution, thinking it would give optimal display quality (same pixel ratio and all that).

In reality, the displayed quality differed greatly from game to game, and there were cases where a different resolution (e.g. one that wasn't an integer scale) gave better results. Obviously, this is highly dependent on each game's graphics engine.
 

cluthz

macrumors 68040
Jun 15, 2004
3,118
4
Norway
It's not true that you need a Titan X for 4K.
The latest games at max settings might, but most games don't need a Titan X.

I run Nvidia Surround, and I always run at 60 FPS, always.

Games like Metro: LL run at 60 FPS on high/very high settings with a single GTX 780 (OC'd to 1142 MHz); older games like Mass Effect 3 run at 100+, same with Deus Ex: HR. GTA V, with no MSAA, runs at very high at 60 FPS. Tomb Raider, Skyrim, and Borderlands: 60+ with no issues too. GRID 2 and DiRT Showdown are both in the 100+ range.

Of my 500 games, the only one I could not keep at 60 FPS is Dragon Age: Inquisition, which had to be turned down to medium to keep 60+, but it ran at ~40 at very high settings.

I know a triple-wide setup is a few fewer pixels than 4K, but you are pretty close.

So if you wanna play the latest Battlefield at 4K, sure, get a $1000 Titan X, but 99% of games will run fine at 4K with a ~$300 card like the GTX 970, unless you have to max every slider there is.

Anyway, a 4K TV most likely has an input lag of 50 ms+, which makes it unsuitable for 60 FPS gaming.
 

Liquorpuki

macrumors 68020
Jun 18, 2009
2,286
8
City of Angels
It's not true that you need a Titan X for 4K.
The latest games at max settings might, but most games don't need a Titan X.

I run Nvidia Surround, and I always run at 60 FPS, always.

Games like Metro: LL run at 60 FPS on high/very high settings with a single GTX 780 (OC'd to 1142 MHz); older games like Mass Effect 3 run at 100+, same with Deus Ex: HR. GTA V, with no MSAA, runs at very high at 60 FPS. Tomb Raider, Skyrim, and Borderlands: 60+ with no issues too. GRID 2 and DiRT Showdown are both in the 100+ range.

Assuming high quality, I really don't know how you're getting 4K 60 FPS with only one GTX780



Even a Titan X isn't hitting that at max settings

[Metro: Last Light Redux 4K benchmark chart]


Pretty much every benchmark I've seen says you have to use dual-SLI to guarantee 60 FPS across all games. I'd love to be proven wrong though because then I'll upgrade my computer after the Pacquiao fight
 

Irishman

macrumors 68040
Nov 2, 2006
3,401
845
You didn't answer my question which means that you didn't understand my question. So, I'm sorry you didn't feel appreciated for answering a question that I didn't ask. But if it makes you feel better, thanks for trying.

General advice for life - if you don't like the answers, don't ask the question.

You can't control those answers. :)
 

Nunyabinez

macrumors 68000
Original poster
Apr 27, 2010
1,758
2,230
Provo, UT
General advice for life - if you don't like the answers, don't ask the question.

You can't control those answers. :)

What you quoted from me clearly points out that he didn't answer my question.

Let me understand your logic. If I ask what time it is and somebody says "Monday," I can't say "that's not what I asked"? Got it.

General advice: Don't give someone advice when they don't ask for it.
 

xSinghx

Suspended
Oct 2, 2012
308
87
What you quoted from me clearly points out that he didn't answer my question.

Well, you asked several questions, including some indirect ones, but one of them is here:
"Am I going to get a better looking game running at 1080 on a 4K than I would a good 1080 monitor?"

So let's put the fiction you keep repeating to bed.

My answer was here:

The short answer is: if you're playing at a desk, buy a monitor; if you want to move to a couch, buy a TV.

Notice that my answer was the first line of my response, and it didn't take five paragraphs to get to. A concise and clearly phrased question usually gets a quick and complete answer. Something to keep in mind before you make readers slog through another blog-length post to find a question again.

Let me understand your logic. If I ask what time it is and somebody says "Monday," I can't say "that's not what I asked"? Got it.

General advice: Don't give someone advice when they don't ask for it.

His logic is fine; yours could use some work distinguishing singular from plural (questions). However, it's your lack of gratitude that's really at issue, especially given the speculative (I think most of us will assume completely hypothetical) nature of the thread, and the absurdity of wanting to own a 4K TV to begin with. The spoiled entitlement (not getting exactly what you wanted, exactly how you wanted it) and the condescension littered throughout your responses raise the question: why should anyone engage you with an answer?

:rolleyes:
 

alphaswift

macrumors 6502
Aug 26, 2014
412
1,183
4K is a marketing gimmick (much like 3D blu-ray was a few years ago) to get you to want to buy a new TV (perceived obsolescence). I would question anyone that thinks it's worthwhile.

Ummm no. It's hard to find 4K content, but anyone taking a brief glance can appreciate the sharper image.
 

Renzatic

Suspended
So you're telling us a sharper image for content that doesn't really exist is worthwhile?

Feel free to explain how that's not a gimmick exactly.

Until then, you've given us the very definition of asinine.

There will be eventually.

And though you don't gain any bonus in sharpness by running a lower-resolution image on a higher-resolution display, you do gain some richness in color and better saturation due to the higher pixel density.

I wouldn't call 4k a gimmick. Especially not for computer displays, where you'll really appreciate that higher resolution the closer you sit to the screen. Though I will say that, for the moment at least, it's not 100% necessary. The jump to UHD from HD isn't quite as dramatic as the jump from SD to HD was, but it's still really nice.
 