
Huntn

macrumors Core
Original poster
May 5, 2008
23,589
26,706
The Misty Mountains
I just got a ViewSonic XG3220 32" 4K (3840x2160) monitor. It was a hundred dollars more expensive than the ViewSonic VX3276, but it has a better, beefier, more compact base and is more adjustable. My office desk sits in front of two windows, so a wall mount is not an option. That leaves limited real estate on my desk for a monitor to sit on, and a small rectangular base is preferable to one with feet that angle out.

The graphics are amazing on this monitor, as I imagine they are on all 4K monitors. However, there are some adjustments to lighting and gamma I'm still working through. If I just start up a game I had been playing on my last monitor (also a ViewSonic), the game is significantly darker, with much deeper shadows. I have been adjusting games on a case-by-case basis, trying to strike a balance between the monitor settings and the in-game settings.
Of note, the settings on this monitor are easy to access via buttons on its side, and it allows custom settings to be saved. So it would be possible to save a profile tailored to a specific game and switch easily between one game's profile, another's, and a default setting. I just have to get up to speed on which changes to make on the monitor versus in-game.

Does anyone know of a good guide on how to go about this?

I'll mention three games I've been playing a lot of:
  • World of Warships - dying looks awesome, and I do that often. ;)
  • Red Dead Redemption 2 - looks freaking amazing.
  • Fallout 4 - a 2015 game, not that old, but it inexplicably has no resolution settings in-game. You can find a post, maybe several, in the Steam forums about editing the Fallout4Prefs file to change the game's resolution (a rough sketch of that edit follows this list). I tried this and got a funky result: as mentioned in one of the posts I looked at, the UI did not scale properly. But then I found another thread mentioning that if you have an Nvidia graphics card, you can use GeForce Experience, an application that lets you adjust the video settings of your games. I knew about this, but in the past I had a bad experience where I optimized a game using GeForce Experience and got a terrible result. That was years ago. This time, I found Fallout 4 in the list of games that GeForce Experience recognizes, and it showed which settings it would change if the Optimize button were selected. I took a chance and got a great result, with the resolution set to 3840x2160. The game looks normal, though darker than it did on the old monitor. I'll be tweaking settings, which will probably include a custom monitor setting.
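For anyone who wants to try the ini route instead, here is a minimal, untested sketch of the edit those Steam posts describe. The file path and the "iSize W"/"iSize H" keys under [Display] are the commonly cited defaults, not something I've verified on this machine, so check your own install and keep a backup:

```python
# Hedged sketch of the Fallout4Prefs resolution edit described in the Steam posts.
# The path and the "iSize W"/"iSize H" key names are assumptions based on the
# commonly cited defaults; check your own install and back the file up first.
from pathlib import Path

PREFS = Path.home() / "Documents" / "My Games" / "Fallout4" / "Fallout4Prefs.ini"
NEW_W, NEW_H = 3840, 2160

original = PREFS.read_text(encoding="utf-8")
PREFS.with_name(PREFS.name + ".bak").write_text(original, encoding="utf-8")  # backup first

updated = []
for line in original.splitlines():
    key = line.split("=", 1)[0].strip().lower()
    if key == "isize w":
        updated.append(f"iSize W={NEW_W}")      # horizontal resolution
    elif key == "isize h":
        updated.append(f"iSize H={NEW_H}")      # vertical resolution
    else:
        updated.append(line)                    # leave everything else untouched

PREFS.write_text("\n".join(updated) + "\n", encoding="utf-8")
```

That said, the GeForce Experience route described above did the same job for me without the UI scaling problem, so it's probably the easier option.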
 
Last edited:
  • Like
Reactions: velocityg4

Huntn

macrumors Core
Original poster
May 5, 2008
23,589
26,706
The Misty Mountains
Just figured out something else: a DisplayPort cable is not equal to HDMI if you have a graphics card like mine. My Nvidia RTX 2070 will not do HDR (High Dynamic Range) over the DisplayPort cable; it has to be an HDMI cable for the HDR option to appear. My understanding is that DisplayPort should allow HDR, but apparently Nvidia did not include that in the 2070's firmware.
 

garnerx

macrumors 6502a
Nov 9, 2012
623
382
Just figured out something else: a DisplayPort cable is not equal to HDMI if you have a graphics card like mine. My Nvidia RTX 2070 will not do HDR (High Dynamic Range) over the DisplayPort cable; it has to be an HDMI cable for the HDR option to appear. My understanding is that DisplayPort should allow HDR, but apparently Nvidia did not include that in the 2070's firmware.
I thought it was the other way around: you need DisplayPort for HDR. At least on my card, anyway (it's from the generation before yours).

HDR on these gaming monitors is massively underwhelming, I think. Most of them aren't true 10-bit panels, and none of them are bright enough or have sufficient local backlighting zones to do 'real' HDR anyway. They sort of fake it. As far as I'm aware, high-end TVs (particularly OLEDs) are the only places you'll find decent HDR, short of pro-level video production monitors.

Variable refresh rate, on the other hand, is fantastic. I rarely bother with HDR on mine now because it's so hard to tell the difference, but switching off v-sync in games and relying on the monitor's Freesync feature is a revelation. Mine's a different brand but yours also has that feature (as I'm sure you're aware).

edit: reading reviews and stuff, I think the HDR/HDMI thing is specified by your monitor (because it's DP 1.2) and not by the graphics card. But if you want Freesync you will need DP for that; I don't think Nvidia has ever supported it over HDMI.
 
Last edited:

Huntn

macrumors Core
Original poster
May 5, 2008
23,589
26,706
The Misty Mountains
I thought it was the other way around: you need DisplayPort for HDR. At least on my card, anyway (it's from the generation before yours).

HDR on these gaming monitors is massively underwhelming, I think. Most of them aren't true 10-bit panels, and none of them are bright enough or have sufficient local backlighting zones to do 'real' HDR anyway. They sort of fake it. As far as I'm aware, high-end TVs (particularly OLEDs) are the only places you'll find decent HDR, short of pro-level video production monitors.

Variable refresh rate, on the other hand, is fantastic. I rarely bother with HDR on mine now because it's so hard to tell the difference, but switching off v-sync in games and relying on the monitor's Freesync feature is a revelation. Mine's a different brand but yours also has that feature (as I'm sure you're aware).

edit: reading reviews and stuff, I think the HDR/HDMI thing is specified by your monitor (because it's DP 1.2) and not by the graphics card. But if you want Freesync you will need DP for that; I don't think Nvidia has ever supported it over HDMI.
I would have expected DisplayPort to include HDR. I have an RTX 2070. With the DisplayPort cable that came with the monitor, there was no HDR option under Display Properties. I researched online, and it said some Nvidia cards do not have HDR enabled over DisplayPort connections.

The HDMI cable I was using with the previous 2K monitor did not work with the new monitor, but when I swapped it for a newer HDMI cable, I got an image and an option for HDR. It is possible that the DisplayPort cable that came with the monitor was defective; I did not troubleshoot it.

On my monitor, enabling HDR made a significant difference in night scenes.
 

garnerx

macrumors 6502a
Nov 9, 2012
623
382
On my monitor, enabling HDR made a significant difference in night scenes.
Well, as long as you can tell the difference. It's definitely an 8-bit panel though - the HDR10 spec calls for 10-bit panels with very high peak brightness, so a wider RGB range and about 3x the brightness.
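For a rough sense of what that jump means, here's some back-of-the-envelope arithmetic; the brightness figures are assumed ballpark values for a typical gaming monitor versus a DisplayHDR 1000-class panel, not measurements of this one:

```python
# Rough arithmetic only; the nit values below are assumed typical figures.
levels_8bit = 2 ** 8       # 256 shades per colour channel
levels_10bit = 2 ** 10     # 1024 shades per colour channel

colours_8bit = levels_8bit ** 3      # ~16.7 million colours
colours_10bit = levels_10bit ** 3    # ~1.07 billion colours

typical_sdr_peak_nits = 350    # assumed common gaming-monitor peak brightness
hdr10_target_peak_nits = 1000  # assumed DisplayHDR 1000-class peak brightness

print(f"8-bit:  {levels_8bit} levels/channel, {colours_8bit:,} colours")
print(f"10-bit: {levels_10bit} levels/channel, {colours_10bit:,} colours")
print(f"peak brightness ratio: ~{hdr10_target_peak_nits / typical_sdr_peak_nits:.1f}x")
```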

If you have a DP cable kicking around you should try the adaptive refresh rate thing, see which of the two features you prefer. Shame you can't have both, really.
 
  • Like
Reactions: Huntn

faust

macrumors 6502
Sep 11, 2007
382
173
Los Angeles, CA
4K gaming isn't really cost effective right now. I had a 2080 Ti that I sold when it became apparent I wouldn't be playing games at 4K anywhere near 60 FPS unless on low-medium settings.
 

garnerx

macrumors 6502a
Nov 9, 2012
623
382
4K gaming isn't really cost effective right now. I had a 2080 Ti that I sold when it became apparent I wouldn't be playing games at 4K anywhere near 60 FPS unless on low-medium settings.
That's where the adaptive refresh tech comes in. This particular monitor is g-sync compatible, so I think the range for that is 48-60 Hz. As long as a game doesn't drop below 48 fps it will look very smooth indeed.
 

velocityg4

macrumors 604
Dec 19, 2004
7,330
4,719
Georgia
I just got a ViewSonic XG3220 32" 4K (3840x2160) monitor. It was a hundred dollars more expensive than the ViewSonic VX3276, but it has a better, beefier, more compact base and is more adjustable. My office desk sits in front of two windows, so a wall mount is not an option. That leaves limited real estate on my desk for a monitor to sit on, and a small rectangular base is preferable to one with feet that angle out.

If you want to minimize the footprint of the screen, you should consider a desk-mounted VESA stand. They bolt straight through the desk or clamp to the edge, so they require minimal space. There are many options, including articulated arms, multi-monitor stands, and a wide range of heights. I just use a simple triple-monitor stand, which gives me about 10" of clearance under my monitors and a single mounting point on the desk that's about 3" wide.
 
  • Like
Reactions: Huntn

Huntn

macrumors Core
Original poster
May 5, 2008
23,589
26,706
The Misty Mountains
If you want to minimize the footprint of the screen, you should consider a desk-mounted VESA stand. They bolt straight through the desk or clamp to the edge, so they require minimal space. There are many options, including articulated arms, multi-monitor stands, and a wide range of heights. I just use a simple triple-monitor stand, which gives me about 10" of clearance under my monitors and a single mounting point on the desk that's about 3" wide.
I found that the base of this stand is satisfactory. :)
 

cube

Suspended
May 10, 2004
17,011
4,972
I would have expected DisplayPort to include HDR. I have an RTX 2070. With the DisplayPort cable that came with the monitor, there was no HDR option under Display Properties. I researched online, and it said some Nvidia cards do not have HDR enabled over DisplayPort connections.

The HDMI cable I was using with the previous 2K monitor did not work with the new monitor, but when I swapped it for a newer HDMI cable, I got an image and an option for HDR. It is possible that the DisplayPort cable that came with the monitor was defective; I did not troubleshoot it.

On my monitor, enabling HDR made a significant difference in night scenes.
You need at least DisplayPort 1.4 for HDR.

Your monitor supports only 1.2 (old).

HDMI 2.0 does not support 4K@60Hz 4:4:4 10-bit; DisplayPort 1.2 already can.
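A rough bandwidth check illustrates why; the pixel-clock and effective link-rate numbers below are the commonly quoted approximations, not exact figures for this particular monitor:

```python
# Back-of-the-envelope link bandwidth check (approximate, commonly quoted figures).
HDMI_2_0_GBPS = 14.4   # ~18.0 Gbit/s TMDS minus 8b/10b coding overhead
DP_1_2_GBPS = 17.28    # HBR2: ~21.6 Gbit/s minus 8b/10b coding overhead

def video_gbps(pixel_clock_mhz, bits_per_pixel):
    """Raw video data rate for a given timing, ignoring audio/aux overhead."""
    return pixel_clock_mhz * 1e6 * bits_per_pixel / 1e9

# 4K60 pixel clocks: ~594 MHz with standard CTA blanking (typical over HDMI),
# ~533 MHz with reduced blanking (commonly used over DisplayPort).
print(video_gbps(594, 30))  # ~17.8 Gbit/s: 10-bit 4:4:4 exceeds HDMI 2.0's 14.4
print(video_gbps(594, 24))  # ~14.3 Gbit/s: 8-bit 4:4:4 just fits HDMI 2.0
print(video_gbps(533, 30))  # ~16.0 Gbit/s: 10-bit 4:4:4 fits within DP 1.2's 17.28
```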

Consumer cards have supported 10-bit only in DirectX under Windows.

NVIDIA now also supports 10-bit OpenGL on consumer cards (GTX 1050 and up) with the Studio Driver.

If possible, I would return the monitor and get one with DP 1.4.
 
Last edited:
  • Like
Reactions: T'hain Esh Kelch

Huntn

macrumors Core
Original poster
May 5, 2008
23,589
26,706
The Misty Mountains
You need at least DisplayPort 1.4 for HDR.

Your monitor supports only 1.2 (old).

HDMI 2.0 does not support 4K@60Hz 4:4:4 10-bit; DisplayPort 1.2 already can.

Consumer cards have supported 10-bit only in DirectX under Windows.

NVIDIA now also supports 10-bit OpenGL on consumer cards (GTX 1050 and up) with the Studio Driver.

If possible, I would return the monitor and get one with DP 1.4.
The image is beautiful on this monitor. If I have a concern, it is why the DisplayPort cable that came with it can't produce an HDR-capable image. I see three possibilities: a limitation of my RTX 2070, a problem with the monitor, or a problem with the DisplayPort cable that came with it. Nvidia has a firmware tool that I ran, which told me it was not applicable to my hardware. This week, I'll talk to both ViewSonic and Nvidia.
 

cube

Suspended
May 10, 2004
17,011
4,972
The image is beautiful on this monitor. If I have a concern, it is why the DisplayPort cable that came with it can't produce an HDR-capable image. I see three possibilities: a limitation of my RTX 2070, a problem with the monitor, or a problem with the DisplayPort cable that came with it. Nvidia has a firmware tool that I ran, which told me it was not applicable to my hardware. This week, I'll talk to both ViewSonic and Nvidia.
It would not be the cable. I saw the monitor supports only DP 1.2 (buyer error).

I saw a BenQ with DP 1.4 that is cheaper. Adaptive sync compatibility with NVIDIA depends on the firmware (not field updatable).

Same panel, but there are differences. Neither is DisplayHDR certified.

Neither supports HDR with adaptive sync (that would cost at least 900 euro - 27").
 
Last edited:
  • Like
Reactions: Huntn

garnerx

macrumors 6502a
Nov 9, 2012
623
382
It would not be the cable. I saw the monitor supports only DP 1.2 (buyer error).

I saw a BenQ with DP 1.4 that is cheaper. Adaptive sync compatibility with NVIDIA depends on the firmware (not field updatable).

Same panel, but there are differences. Neither is DisplayHDR certified.

Neither supports HDR with adaptive sync (that would cost at least 900 euro - 27").
^ This is correct.

What you're getting when you turn on HDR for that monitor is basically a jazzed-up set of picture settings - much like when you turn on sports mode or cinema mode on a TV. Which is why I suggested g-sync, as you'll probably get a lot more benefit out of having that switched on rather than HDR.
 
  • Like
Reactions: Huntn

cube

Suspended
May 10, 2004
17,011
4,972
^ This is correct.

What you're getting when you turn on HDR for that monitor is basically a jazzed-up set of picture settings - much like when you turn on sports mode or cinema mode on a TV. Which is why I suggested g-sync, as you'll probably get a lot more benefit out of having that switched on rather than HDR.
Note that he is using HDMI 2.0, so he is getting HDR10. But he is missing out on proper 10-bit then.

I don't know if the FreeSync is working with his NVIDIA card, but as I said, that would be without HDR.

G-SYNC Ultimate would cost at least 1100 euro - 27". Not much more expensive than the FreeSync2 HDR mentioned. And they all have DP 1.4.

When looking at any resolution, the cheapest Ultimate is that one, but the cheapest FreeSync2 is 350 euro (1440p).

FreeSync might work with NVIDIA depending on the monitor, but FreeSync2 does not make sense in this case.
 
Last edited:

Huntn

macrumors Core
Original poster
May 5, 2008
23,589
26,706
The Misty Mountains
It would not be the cable. I saw the monitor supports only DP 1.2 (buyer error).

I saw a BenQ with DP 1.4 that is cheaper. Adaptive sync compatibility with NVIDIA depends on the firmware (not field updatable).

Same panel, but there are differences. Neither is DisplayHDR certified.

Neither supports HDR with adaptive sync (that would cost at least 900 euro - 27").
^ This is correct.

What you're getting when you turn on HDR for that monitor is basically a jazzed-up set of picture settings - much like when you turn on sports mode or cinema mode on a TV. Which is why I suggested g-sync, as you'll probably get a lot more benefit out of having that switched on rather than HDR.
Thanks guys for the info. Clearly I'm behind the curve on monitor specs. At this point I'll be sticking with this monitor. I have not had it long, so I might be able to return it, but for me it's a clear upgrade at the right price. I paid $440 on sale and had a $100 credit to apply to it, for a 32" monitor. I don't want to get into the $600 price range or higher, and I've never seen a more beautiful image in a game - Red Dead Redemption 2 is stunning compared with what I've previously experienced.

So my question is how much more stunning would a monitor with HDR be?
 

garnerx

macrumors 6502a
Nov 9, 2012
623
382
I don't know if the FreeSync is working with his NVIDIA card, but as I said, that would be without HDR.
According to the specs page it works (without HDR), and it's been included in the Nvidia drivers for a while now. Connect via displayport, then turn on g-sync in the Nvidia control panel.
So my question is how much more stunning would a monitor with HDR be?
Honestly, there's always something better out there, and next year there'll be something even better than that. The world of tech is an endless pursuit of minimal gains in the interest of throwaway consumerism.

If you have a store that sells high-end TVs near you, go there and pretend to be interested in their biggest, most expensive OLED screens.
 
  • Like
Reactions: Huntn

cube

Suspended
May 10, 2004
17,011
4,972
Thanks guys for the info. Clearly I'm behind the curve on monitor specs. At this point I'll be sticking with this monitor. I have not had it long, so I might be able to return it, but for me it's a clear upgrade at the right price. I paid $440 on sale and had a $100 credit to apply to it, for a 32" monitor. I don't want to get into the $600 price range or higher, and I've never seen a more beautiful image in a game - Red Dead Redemption 2 is stunning compared with what I've previously experienced.

So my question is how much more stunning would a monitor with HDR be?
Your monitor does have (some would say fake) HDR.

There's a 43" 4K FreeSync Philips (DisplayHDR 1000), listed as G-Sync "compatible" for 500 euro, but it seems it is slow.

750 euro for 32" basic G-SYNC (DisplayHDR 1000).

Check if the FreeSync works on the ViewSonic.

If you return it you might get something better for less money, but if FreeSync does not check out I would give it back too:

 

Huntn

macrumors Core
Original poster
May 5, 2008
23,589
26,706
The Misty Mountains
According to the specs page it works (without HDR), and it's been included in the Nvidia drivers for a while now. Connect via displayport, then turn on g-sync in the Nvidia control panel.

Honestly, there's always something better out there, and next year there'll be something even better than that. The world of tech is an endless pursuit of minimal gains in the interest of throwaway consumerism.

If you have a store that sells high-end TVs near you, go there and pretend to be interested in their biggest, most expensive OLED screens.
I’ll have to figure out how to do that. I update the card with GeForce Experience. Can you point me in the right direction? Update: I found an article posted below:
Your monitor does have (some would say fake) HDR.

There's a 43" 4K FreeSync Philips (DisplayHDR 1000), listed as G-Sync "compatible" for 500 euro, but it seems it is slow.

750 euro for 32" basic G-SYNC (DisplayHDR 1000).

Check if the FreeSync works on the ViewSonic.

If you return it you might get something better for less money, but if FreeSync does not check out I would give it back too:

G-Sync or FreeSync?
Btw, I found this article and it mentions screen tearing: https://www.howtogeek.com/270672/how-to-enable-optimize-and-tweak-nvidia-g-sync/

I've not noticed this at all; I have a nice smooth display. So would this be for the future, when my card is getting long in the tooth? Is this why I would want to turn on G-Sync or FreeSync? I have yet to see what the difference between those two terms is (G-Sync, FreeSync).

Just found it - different terms for different cards: https://www.howtogeek.com/228735/g-sync-and-freesync-explained-variable-refresh-rates-for-gaming/
 
Last edited:

cube

Suspended
May 10, 2004
17,011
4,972
I’ll have to figure out how to do that. I update the card with GeForce Experience. Can you point me in the right direction?

G-Sync or FreeSync?
Btw, I found this article and it mentions screen tearing: https://www.howtogeek.com/270672/how-to-enable-optimize-and-tweak-nvidia-g-sync/

I've not noticed this at all; I have a nice smooth display. So would this be for the future, when my card is getting long in the tooth? Is this why I would want to turn on G-Sync or FreeSync? I have yet to see what the difference between those two terms is (G-Sync, FreeSync).

Just found it - different terms for different cards: https://www.howtogeek.com/228735/g-sync-and-freesync-explained-variable-refresh-rates-for-gaming/
If you have an NVIDIA card, it is preferable to get G-SYNC, as it is certified. FreeSync2 is certified for AMD.

That is the inexpensive Benq I was talking about (FreeSync).

Your card is not powerful enough to play all games at 4K above 60fps, so you would get tearing sometimes without adaptive sync.

You can still get tearing with adaptive sync if you fall below the minimum rate for the monitor (eg: 40fps).

If you want to see some tearing, I would try Battlefield V maxed out (especially raytracing).
 
Last edited:

Huntn

macrumors Core
Original poster
May 5, 2008
23,589
26,706
The Misty Mountains
If you have an NVIDIA card, it is preferable to get G-SYNC, as it is certified. FreeSync2 is certified for AMD.

That is the inexpensive Benq I was talking about (FreeSync).

Your card is not powerful enough to play all games at 4K above 60fps, so you would get tearing sometimes without adaptive sync.

You can still get tearing with adaptive sync if you fall below the minimum rate for the monitor (eg: 40fps).

If you want to see some tearing, I would try Battlefield V maxed out (especially raytracing).
I've noticed that a lot more of the monitors I've looked at have FreeSync than G-Sync. This monitor supports FreeSync. I found an article about how to enable G-Sync on a FreeSync monitor: https://www.esportstales.com/tech-tips/how-to-enable-gsync-compatible-on-freesync-monitor

...however, in my Nvidia Control Panel, G-Sync is not even listed as an option. I don't know whether my control panel is out of date or current.

I'll check that out and talk to ViewSonic and Nvidia this afternoon after I finish my workout. The bottom line is I'm very happy with the display as it currently is. It is distinctly better than the 2K monitor I was using before. It was released last year and does not have the latest technology (my bad), but the price is definitely right. I imagine at some point, as games push the cutting edge, screen tearing may become an issue.
 

cube

Suspended
May 10, 2004
17,011
4,972
I've noticed that a lot more of the monitors I've looked at have FreeSync than G-Sync. This monitor supports FreeSync. I found an article about how to enable G-Sync on a FreeSync monitor: https://www.esportstales.com/tech-tips/how-to-enable-gsync-compatible-on-freesync-monitor

...however, in my Nvidia Control Panel, G-Sync is not even listed as an option. I don't know whether my control panel is out of date or current.

I'll check that out and talk to ViewSonic and Nvidia this afternoon after I finish my workout. The bottom line is I'm very happy with the display as it currently is. It is distinctly better than the 2K monitor I was using before. It was released last year and does not have the latest technology (my bad), but the price is definitely right. I imagine at some point, as games push the cutting edge, screen tearing may become an issue.
Did you activate FreeSync on the monitor itself first?
 

garnerx

macrumors 6502a
Nov 9, 2012
623
382
...however, in my Nvidia Control Panel, G-Sync is not even listed as an option. I don't know whether my control panel is out of date or current.
If you've still got the HDMI cable plugged in, it won't show up, but it should be there if you use DisplayPort instead.
This website says your monitor is compatible (most Freesync monitors are now).

The option is on this page:
[screenshot of the Nvidia Control Panel page with the G-SYNC option]
 

Huntn

macrumors Core
Original poster
May 5, 2008
23,589
26,706
The Misty Mountains
Did you activate FreeSync on the monitor itself first?

I’ve not tried that yet. Heading out to swim, will report back later.
If you've still got the HDMI cable plugged in, it won't show up, but it should be there if you use DisplayPort instead.
This website says your monitor is compatible (most Freesync monitors are now).

The option is on this page:
[screenshot of the Nvidia Control Panel page with the G-SYNC option]
Thanks for all your help. I'll investigate your suggestions and report back after I play with it and talk to the manufacturers.

Regarding HDR, it seemed that after I swapped the DP cable for the HDMI cable and turned HDR on, the blacks in the game were more manageable. In other words, in Red Dead Redemption 2 the night environment went from so dark I could not see anything in shadow to no longer pitch black, without my having to turn the gamma up way too much.
 

cube

Suspended
May 10, 2004
17,011
4,972
If you've still got the HDMI cable plugged in, it won't show up, but it should be there if you use DisplayPort instead.
Then he would lose HDR, because the monitor's DisplayPort is 1.2.

NVIDIA does not support FreeSync through HDMI 2.0 (AMD proprietary).

Time to try to return the monitor.

Or maybe one can use 2 cables and have each feature always activated on "each display".

Then DisplayPort for FreeSync and 10-bit SDR.

HDMI 2.0 for 4K 60Hz 4:2:2 HDR.

But one would still miss 4K 60Hz 4:4:4 HDR, which needs DP 1.4.

1100 euro to add adaptive sync to HDR with NVIDIA.
 
Last edited:

garnerx

macrumors 6502a
Nov 9, 2012
623
382
I've not noticed this at all; I have a nice smooth display. So would this be for the future, when my card is getting long in the tooth? Is this why I would want to turn on G-Sync or FreeSync? I have yet to see what the difference between those two terms is (G-Sync, FreeSync).
G-sync was the first one but Nvidia charged a fortune to license it, so few monitors used it. Then AMD came up with their own version, Freesync, which is basically identical but anyone can use it for nothing, so it's in a huge number of monitors now. Nvidia essentially admitted defeat and adopted Freesync too, although they call it 'g-sync compatible'.

The benefit isn't so much about screen tearing, since that is most common when the game updates faster than your monitor, which isn't likely at 4K. With Freesync turned off the screen refreshes 60 times per second, regardless of whether a new frame has been sent by the graphics card. So if the game is varying between, say, 50 and 60 frames per second, you'll have some frames that stay on screen for one cycle (1/60 s), others that stay on screen for two cycles, and others that might miss their 'slot' and not get displayed at all. Which will manifest as visible choppiness or stuttering.

You can test it by turning off v-sync in the game's settings, then moving the camera across scenes of varying complexity. For example, you pan across the sky and it's very smooth, then you pan across a busy street and it's jerky.

Freesync sorts that out by updating the monitor only when there's a new frame sent from the graphics card, so if the game varies between 50-60 fps, the monitor will change frequency accordingly. It looks really good, it's compatible with pretty much any game, and it means you don't have to drop so many settings to stop the game stuttering.
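To put rough numbers on that, here's a toy frame-pacing model; the render times are made up, and the fixed-refresh branch is a simplified double-buffered v-sync assumption rather than how any particular driver actually schedules frames:

```python
import math

REFRESH_HZ = 60
SLOT_MS = 1000.0 / REFRESH_HZ            # one refresh cycle, ~16.7 ms

# Hypothetical render times (ms) for a game wobbling roughly between 50-70 fps.
render_ms = [14, 19, 15, 20, 16, 18, 14, 21]

# Fixed 60 Hz with v-sync (simplified double buffering): a finished frame is
# held back until the next refresh, so each frame ends up on screen for a
# whole number of 16.7 ms cycles - sometimes one, sometimes two.
flip_times = []
t = 0.0
for rt in render_ms:
    t = math.ceil((t + rt) / SLOT_MS) * SLOT_MS   # next refresh at/after frame ready
    flip_times.append(t)
fixed_shown = [b - a for a, b in zip(flip_times, flip_times[1:])]

# VRR (Freesync / "G-Sync compatible"): the panel refreshes when the frame
# arrives, so on-screen time simply tracks the render time (within the
# monitor's supported range).
vrr_shown = render_ms[1:]

print("fixed 60 Hz, ms per frame:", [round(d, 1) for d in fixed_shown])
print("VRR,        ms per frame:", [round(d, 1) for d in vrr_shown])
```

The fixed-refresh row alternates between one and two cycles (16.7 ms / 33.3 ms), which is the stutter described above, while the VRR row just follows the render times.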

Anyway, good luck with it! Some people are more sensitive to this stuff than others, there are plenty who prefer to just turn up the settings and have the game run at a steady 30 fps.
 
  • Like
Reactions: Huntn