
Romanesco

macrumors regular
Original poster
Hi there,

I'm wondering if I can drive an Apple Pro Display XDR with a single Nvidia 2080 Ti hooked up through the Gigabyte Titan Ridge add-in card. I currently have two LG UltraFine 4K (2019) monitors working in this setup. Also, the video output for the Nvidia card (see its tech spec page) lists a max resolution of 7680x4320, and the add-in card supports DisplayPort 1.4 with 8K video throughput.

For those with extended knowledge on these, care to advise?
 

joevt

Contributor
You are talking about Windows 10?

It should work unless there's a problem with the drivers or Apple did something weird to the Apple Pro XDR display to make it only work with macOS. I don't think they would do that because it needs to work in Boot Camp.

The Dell UP3218K works in Boot Camp at 8K60Hz. It uses two DisplayPort 1.4 cables. So 6K should work with the XDR.

The XDR requires two DisplayPort 1.4 signals over Thunderbolt 3. Connect two DisplayPort 1.4 outputs of the 2080 Ti to the GC-TITAN RIDGE, then connect the XDR to a Thunderbolt 3 port of the GC-TITAN RIDGE. This has worked with the LG UltraFine 5K, but I don't know if it will work with the XDR. Some USB features will be missing (brightness, etc.). Brightness control might work using DDC/CI over DisplayPort.
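As a rough sanity check of why one cable isn't enough, here's a sketch assuming 10 bpc RGB and the ~1286 MHz reduced-blanking pixel clock of the 6K mode:

Code:
# Why 6K needs two DisplayPort 1.4 streams: a back-of-the-envelope check.
# Assumes 10 bpc RGB and a ~1286 MHz pixel clock for 6016x3384@60.
pixel_clock_hz = 1286e6
bits_per_pixel = 30                       # 10 bpc x 3 components (RGB)
needed = pixel_clock_hz * bits_per_pixel  # ~38.6 Gbps of pixel data

# One 4-lane HBR3 link: 8.1 Gbps/lane, minus 8b/10b coding overhead.
one_link = 8.1e9 * 4 * 0.8                # 25.92 Gbps payload

print(needed / 1e9)            # ~38.6
print(needed <= one_link)      # False: a single link can't carry it
print(needed <= 2 * one_link)  # True: two links can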
 

Romanesco

macrumors regular
Original poster
You are talking about Windows 10?

It should work unless there's a problem with the drivers or Apple did something weird to the Apple Pro XDR display to make it only work with macOS. I don't think they would do that because it needs to work in Boot Camp.

The Dell UP3218K works in Boot Camp at 8K60Hz. It uses two DisplayPort 1.4 cables. So 6K should work with the XDR.

The XDR requires two DisplayPort 1.4 signals over Thunderbolt 3. Connect two DisplayPort 1.4 outputs of the 2080 Ti to the GC-TITAN RIDGE, then connect the XDR to a Thunderbolt 3 port of the GC-TITAN RIDGE. This has worked with the LG UltraFine 5K, but I don't know if it will work with the XDR. Some USB features will be missing (brightness, etc.). Brightness control might work using DDC/CI over DisplayPort.

That’s exactly what I’m expecting. Will follow up with an update once I get my unit.
 

joevt

Contributor
That’s exactly what I’m expecting. Will follow up with an update once I get my unit.
I would be interested in seeing the EDIDs for the display. There should be at least two (one for each half of the display).

macOS has 6 override files for the display, each with a different product ID.
ae21
ae22
ae23
ae2d
ae2e
ae2f

One of them hints at a 12 bit per component mode (5K only). Others are 10 bpc with 5K and 6K timings. The 5K timings might allow it to work with dual DisplayPort 1.2 outputs.

I wonder which product ID you'll get, and how the other product IDs would appear. I know that some displays can have different product IDs depending on the port you connect them to, or if you change a mode.
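Those override files are plists, so they're easy to inspect. A sketch (the path is the real override location; any keys beyond the vendor/product IDs are my assumption):

Code:
import plistlib

# Vendor 610 (0x610) is Apple's EDID manufacturer code.
path = ("/System/Library/Displays/Contents/Resources/Overrides/"
        "DisplayVendorID-610/DisplayProductID-ae2e")

with open(path, "rb") as f:
    override = plistlib.load(f)

print(override.get("DisplayProductName"))        # e.g. "Pro Display XDR"
print(hex(override.get("DisplayProductID", 0)))  # expect 0xae2e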
 

TheRealAlex

macrumors 68030
Hi there,

I'm wondering if I can drive an Apple Pro Display XDR with a single Nvidia 2080 Ti hooked up through the Gigabyte Titan Ridge add-in card. I currently have two LG UltraFine 4K (2019) monitors working in this setup. Also, the video output for the Nvidia card (see its tech spec page) lists a max resolution of 7680x4320, and the add-in card supports DisplayPort 1.4 with 8K video throughput.

For those with extended knowledge on these, care to advise?
It might work, but all the internal software settings and controls are macOS-exclusive. You may get some basic Nvidia Control Panel settings, but the real pro settings are all in macOS.
 

joevt

Contributor
I got the Pro Display XDR working at 6016x3384@60Hz SDR 8 bit color with a 2080 Ti on a Windows PC using the Moshi USB-C to DisplayPort Cable available from the Apple Store.
That means the display doesn't require USB at all.

Does the GPU-Z Advanced tab show anything about Display Stream Compression (DSC) or does anywhere else in Windows show DSC?
 

joevt

Contributor
That means the display doesn't require USB at all.

Does the GPU-Z Advanced tab show anything about Display Stream Compression (DSC) or does anywhere else in Windows show DSC?
If it's not DSC, it could be YPbPr 4:2:2. Can you check the Nvidia Control Panel to see if it's RGB or 4:4:4?

If it were YPbPr 4:2:2 then it could possibly do 10 bpc.
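The arithmetic behind that, as a quick sketch: 4:2:2 carries two samples per pixel instead of three, so 10 bpc 4:2:2 actually needs fewer bits per pixel than 8 bpc RGB:

Code:
def bits_per_pixel(bpc, samples_per_pixel):
    # RGB and YCbCr 4:4:4 carry 3 samples per pixel; 4:2:2 averages 2
    return bpc * samples_per_pixel

print(bits_per_pixel(8, 3))   # 24 bpp for 8 bpc RGB
print(bits_per_pixel(10, 2))  # 20 bpp for 10 bpc YCbCr 4:2:2 -- less than 8 bpc RGB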
 

Lyot

macrumors newbie
I couldn't find anything about DSC in GPU-Z.

Here is the Monitor info from Advanced - General:

Monitor 1
Type DisplayPort
Link Rate (current) 5.4 Gbps
Link Rate (max) 5.4 Gbps
Lanes (current) 4
Lanes (max) 4
Color Format RGB
Dynamic Range VESA
Bit-per-Color 10
xvYCC601 N/A
xvYCC709 N/A
sYCC601 N/A
AdobeYCC601 N/A
AdobeRGB N/A
BT2020RGB N/A
BT2020YCC N/A
BT2020cYCC N/A
8-bit Color Supported
10-bit Color Supported
12-bit Color Supported
16-bit Color N/A

In the Nvidia Control Panel the only option is RGB, but I can choose 8-, 10- and 12-bit color depth. I changed it to 10-bit, but there are still no HDR options in Windows HD Color.

[Attachment: XDR Nvidia.png]
 

Lyot

macrumors newbie
Using a USB-C cable (not Thunderbolt) from the VirtualLink USB-C port on the card, the USB-C ports on the back of the display actually work. They run at USB 2.0 speeds; I get 34 MB/s transfer speeds to an external SSD. I used the USB-C cable that came with the LG UltraFine 4K (21.5-inch version).

I also tried 12 bpc mode. It seems to work, but I didn't notice any difference.
 

joevt

Contributor
Using a USB-C cable (not Thunderbolt) from the VirtualLink USB-C port on the card, the USB-C ports on the back of the display actually work. They run at USB 2.0 speeds; I get 34 MB/s transfer speeds to an external SSD. I used the USB-C cable that came with the LG UltraFine 4K (21.5-inch version).

I also tried 12 bpc mode. It seems to work, but I didn't notice any difference.
That all seems to be as expected. 5.4 Gbps is HBR2 which is what we saw on a MacBook Pro 16" running 6K 12bpc with DSC (single DisplayPort 1.4 connection).

Since it's using 4 lanes for HBR2 and not Thunderbolt, then there can only be a USB 2.0 connection for the hub in the display.

I don't think anyone's confirmed a USB 2.0-only connection with Thunderbolt 3 and DSC disabled yet. I mean, the speed might be less than USB 3.x, but it could still be connected as USB 3.x and just not have enough bandwidth for 5 Gb/s USB. The connection type will show in System Information.app in macOS under USB for each USB-connected device.
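Rough numbers for why HBR2 plus DSC is plausible at 6K (a sketch; the ~12 bpp wire rate is my assumption about the DSC target, not a measured value):

Code:
pixel_clock_hz = 1286e6          # ~6016x3384@60; see the timing list later in the thread
hbr2_payload = 5.4e9 * 4 * 0.8   # 17.28 Gbps over 4 lanes after 8b/10b coding

print(pixel_clock_hz * 36 / 1e9)  # ~46.3 Gbps uncompressed at 12 bpc RGB: impossible
print(pixel_clock_hz * 12 / 1e9)  # ~15.4 Gbps if DSC targets ~12 bpp: fits in HBR2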
 

Menno87

macrumors member
-edit- A bit offtopic... wrong thread, hehe. Though I can confirm it also works with a 2080 Ti eGPU box on a MacBook Pro 16". Same situation as Lyot: I can choose the 6K resolution with 8/10/12 bit in the Nvidia panel, but no HDR. (All modes work; whether there is a visible difference, I don't know.)
 

Bradleyone

macrumors regular
I got the Pro Display XDR working at 6016x3384@60Hz SDR 8 bit color with a 2080 Ti on a Windows PC using the Moshi USB-C to DisplayPort Cable available from the Apple Store.

Did you have to do anything special to get this working? Was it using the standard Windows 10 Plug n Play video driver?

I had a quick try using this cable on a 1080 Ti and I couldn't get it to display anything other than 640x480 on the XDR.

Unless there are extra steps, I wonder what's different about the 2080 Ti over the 1080 Ti. According to the Gigabyte specs page, the 1080 Ti supports DisplayPort 1.4 with a max resolution of 7680x4320 @ 60Hz.
 

joevt

Contributor
Unless there are extra steps, I wonder what's different about the 2080 Ti over the 1080 Ti. According to the Gigabyte specs page, the 1080 Ti supports DisplayPort 1.4 with a max resolution of 7680x4320 @ 60Hz.
The specs don't say if the 1080 Ti supports Display Stream Compression (DSC).
The specs don't say if the 8K60 mode is using a single cable (requires DSC) or two cables (like the Dell UP3218K).

Even if DSC is not supported, you should be able to get 4K or 5K. What does Monitor Asset Manager (moninfo.exe) say about the EDID? Try making a custom resolution with Nvidia Control Panel? Check GPU-Z Advanced tab for connection information (DisplayPort version, speed, lanes).
 

Bradleyone

macrumors regular
Even if DSC is not supported, you should be able to get 4K or 5K. What does Monitor Asset Manager (moninfo.exe) say about the EDID? Try making a custom resolution with Nvidia Control Panel? Check GPU-Z Advanced tab for connection information (DisplayPort version, speed, lanes).

Thank you for the tips. They've put me on the right track, even though I haven't solved it yet.

I managed to create a custom 4K resolution in the Nvidia Control Panel, which works, but the timing is wrong, as shown by the lines on my photo below.

[Attachment: E69B1C37-3313-4706-97B9-98FC517C7481.jpeg]


I think the card may actually be fine (GPU-Z info on left, before the custom resolution), but it's the EDID that's the problem, or rather the complete lack of one.

Monitor Asset Manager says there is just the basic generic one active - the correct one hasn't been read from the Pro Display XDR for some reason.

[Attachment: 2C45D5DA-C5C7-403A-AE59-E43A229AE481.jpeg]


My (limited) understanding is that the EDID should have been sent by the monitor to Windows when it was first detected.

For some reason that hasn't happened, which is why neither Windows nor the Nvidia Control Panel knows anything about changing resolutions on the XDR.

Before I have another session, I'm going to see how the XDR is configured under Boot Camp on a Mac Pro, and possibly copy the EDID from there to somehow manually enter into Windows on the PC.
 

joevt

Contributor
I managed to create a custom 4K resolution in the Nvidia Control Panel, which works, but the timing is wrong, as shown by the lines on my photo below.

I think the card may actually be fine (GPU-Z info on left, before the custom resolution),
Why does Link Rate show only 1.6 Gbps? That's not a normal DisplayPort Link Rate. DisplayPort 1.4 is 8.1 Gbps (HBR3). DisplayPort 1.2 is 5.4 Gbps (HBR2). DisplayPort 1.1 is 2.7 Gbps (HBR). 1.6 Gbps might be 1.62 Gbps (RBR).
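For reference, the standard per-lane rates and what a link actually carries after 8b/10b coding (a quick sketch for sanity-checking GPU-Z readings):

Code:
# Standard DisplayPort per-lane rates (Gbps) and their names
DP_RATES = {1.62: "RBR", 2.70: "HBR", 5.40: "HBR2", 8.10: "HBR3"}

def payload_gbps(rate_gbps, lanes):
    # 8b/10b channel coding: only 80% of the raw rate carries pixel data
    return rate_gbps * lanes * 0.8

print(payload_gbps(1.62, 1))  # ~1.3 Gbps: enough for 640x480 and little else
print(payload_gbps(8.10, 4))  # 25.92 Gbps: a full 4-lane HBR3 link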

Why does it show only 1 lane? There should be four lanes.

Maybe you are showing info for a low resolution? Or for a different DisplayPort connection for a different display?

How many displays are connected?

How are you connecting the display? What GPU? What port? What cable?

When creating timings for LCD displays, use CVT-RB or CVT-RB2 timings. GTF or CVT are for old CRT displays (but it's interesting you got a picture with GTF). The usual DisplayPort pixel clock for 4K is 593 MHz (CVT-RB) (HDMI 2.0 uses 594 MHz).

Does the GPU-Z Advanced Tab show different info when you switch to 4K resolution?

but it's the EDID that's the problem, or rather the complete lack of one.

Monitor Asset Manager says there is just the basic generic one active - the correct one hasn't been read from the Pro Display XDR for some reason.

My (limited) understanding is that the EDID should have been sent by the monitor to Windows when it was first detected.

For some reason that hasn't happened, which is why neither Windows nor the Nvidia Control Panel knows anything about changing resolutions on the XDR.

Before I have another session, I'm going to see how the XDR is configured under Boot Camp on a Mac Pro, and possibly copy the EDID from there to somehow manually enter into Windows on the PC.
Yes, the EDID is strange. The manufacturer should be APP for Apple, the color bit depth should be 10 or 12, not 6, etc. Did you check all the other EDIDs listed?

Anyway, a 5120x2880 60Hz RGB 8bpc (CVT-RB) timing should work if you can get an HBR3 connection from the GPU. But maybe single-cable HBR3 is not an option for the XDR? I haven't seen AGDCDiagnose output from macOS for this mode or for the dual-cable HBR3 mode (Thunderbolt 6K).
 

Bradleyone

macrumors regular
Well, I got it working, but it wasn't pretty. I'll describe what I did in case it sheds any light on what went wrong.

As I understand it, every model of digital monitor has a hardware id (7 characters) with an EDID (128 bytes), which is communicated to the OS when the cable is plugged in. The EDID provides enough information about resolutions and timings that the OS can immediately make use of the new monitor with no driver required.

In the case of the Apple Pro Display XDR, its hardware id is APPAE2E. My PC's normal Dell is DEL40B6, etc. All these should be communicated as soon as the cable is plugged in.
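The 7-character id is just the first few ID bytes of the EDID: a packed 3-letter manufacturer code plus a 16-bit product code (a sketch decoder, assuming a standard 128-byte base block):

Code:
def hardware_id(edid: bytes) -> str:
    # Manufacturer ID: bytes 8-9, big-endian; three 5-bit letters, 'A' = 1
    m = (edid[8] << 8) | edid[9]
    letters = "".join(chr(((m >> s) & 0x1F) + ord("A") - 1) for s in (10, 5, 0))
    # Product code: bytes 10-11, little-endian
    product = edid[10] | (edid[11] << 8)
    return f"{letters}{product:04X}"

# For the XDR, bytes 8-11 are 0x06 0x10 0x2E 0xAE -> "APPAE2E"
# (0x610 is also why the macOS override folder is DisplayVendorID-610)
print(hardware_id(bytes([0] * 8 + [0x06, 0x10, 0x2E, 0xAE] + [0] * 116)))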

This is where the rabbit hole starts: when the Pro Display was plugged in, it wasn't being identified as APPAE2E and the appropriate EDID was not sent. Instead, it was identified as NVD0000, which seems to be some sort of generic Nvidia identifier that specifies 640x480 as its only display mode.

So, I had to get the Pro Display's EDID manually into Windows. My first thought was to go to Boot Camp on a Mac Pro, which I started doing, but then I had a hunch that maybe it was the Nvidia card playing tricks. Instead, I plugged the Pro Display into the PC motherboard's integrated-graphics DisplayPort.

Perfect. Plug n Play working exactly as it should. Windows identified the Pro Display immediately and offered all resolutions (which maxed out at 3840x2160 on the Intel 630 integrated graphics).

Back on the Nvidia 1080 Ti, it was the same as before, (un)identified as NVD0000, but at least Monitor Asset Manager showed the real EDID had been stored in the Registry.

(Side tunnel of the rabbit hole #1: Monitor Asset Manager offers a save-.INF option which effectively lets a user install an EDID for that monitor. Except, if attempted, Windows 10 shows a "The third-party INF does not contain digital signature information." error and won't continue. Now, there are at least 10 solutions on the web claiming to get around this. 5 of them have the same command-line misprint: bcdedit /set loadoptions DDISABLE_INTEGRITY_CHECKS. 1 of them actually works. It didn't solve the problem though.)

(Side tunnel of the rabbit hole #2: it's unpleasant to futz around in Windows at a bad resolution, so at some point I entered the correct values for the Pro Display's 4K mode. The Nvidia Control Panel allows custom resolutions to be created, except you need to know the timings, which are of course contained in the EDID. So a simple viewer to extract front porch and other timings from the EDID should be available, right? A search turned up 5. One refused to install without .NET 2.0 (and only that version). One was a dead link. One gave strong indications it was malware-infested. One required forum registration. One actually worked, so my thanks to AW EDID Editor, which spat out the right numbers and created a sharp resolution.)
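For anyone stuck at the same point: the timings live in the EDID's 18-byte Detailed Timing Descriptors, and the unpacking is simple enough to do yourself. A sketch covering just the fields a custom-resolution dialog asks for:

Code:
def parse_dtd(d: bytes) -> dict:
    # 18-byte EDID Detailed Timing Descriptor (the first one is at offset 54)
    return {
        "pixel_clock_hz": (d[0] | d[1] << 8) * 10_000,
        "h_active":      d[2] | (d[4] & 0xF0) << 4,
        "h_blank":       d[3] | (d[4] & 0x0F) << 8,
        "v_active":      d[5] | (d[7] & 0xF0) << 4,
        "v_blank":       d[6] | (d[7] & 0x0F) << 8,
        "h_front_porch": d[8] | (d[11] & 0xC0) << 2,
        "h_sync_width":  d[9] | (d[11] & 0x30) << 4,
        "v_front_porch": d[10] >> 4 | (d[11] & 0x0C) << 2,
        "v_sync_width":  d[10] & 0x0F | (d[11] & 0x03) << 4,
    }

# Usage: read the raw EDID (e.g. exported from Monitor Asset Manager),
# then parse_dtd(edid[54:72]) gives the preferred timing.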

Nothing I seemed to do could get the Pro Display to be identified as anything other than NVD0000, so my last resort was to use Registry Editor to manually copy the EDID data from the real APPAE2E entry over to this useless generic entry that seemed bound to the Pro Display when it was plugged into the 1080 Ti.

This unsightly foolish hack actually sort of worked. Monitor Asset Manager showed the active EDID was now the Pro Display's.

Finally I had a working Pro Display plugged into my 1080 Ti, correctly identified and showing all resolutions. The 1080 Ti maxes out at 5K (which is fine for me, since I just wanted 4K to see what it would be like for gaming).

[Attachment: 01F5FD39-EFA0-4753-9BF7-4CF9FF9BC70A.jpeg]


I wasted a lot of time and feel very stupid for not knowing why the Pro Display didn't pass through its EDID when connected to the 1080 Ti, but did when connected to the motherboard's integrated graphics.

Any hints?
 

joevt

Contributor
One actually worked, so my thanks to AW EDID Editor, which spat out the right numbers and created a sharp resolution.)
AW EDID Editor only supports the EDID base block and CTA extension block; it's missing support for DisplayID. edid-decode is a Unix program that will dump all the information in an EDID (I mean everything, except some AMD, Nvidia, or Apple specific stuff for which there is no documentation). I don't know a good editor. I made a script that can modify some stuff.

Windows identified the Pro Display immediately and offered all resolutions (which maxed out at 3840x2160 on the Intel 630 integrated graphics).
In Windows, Intel graphics can do wider than 4K with a lower height: 5120x1440 (but you can't make a custom resolution like that unless such a display is connected?). macOS won't allow resolutions greater than 4K wide. Ice Lake CPUs probably won't have this limitation (they support HBR3 and DSC, so they should allow 6K).

Is the max pixel depth at 5K 8 bpc? Is 4K max 10 bpc or 12 bpc? Are there YCbCr and chroma subsampling options? If there were chroma subsampling options, then you could get 6K: HBR3 should allow 6K 60Hz YCbCr 10bpc 4:2:0. Since you're already using a custom EDID, you could try adding chroma subsampling options. My script has options to remove YCbCr and chroma subsampling features. It has no options to add them, but it shouldn't be too difficult to change. Maybe the display won't support YCbCr, but it would be interesting to try.
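For the EDID edit itself, the relevant bits in the base block are easy to flip. A minimal sketch (assumes a 128-byte base block; real support also lives in CTA/DisplayID extension blocks, and the display still has to honor it):

Code:
def add_ycbcr_support(edid: bytearray) -> bytearray:
    # Feature-support byte at offset 24: for digital displays, bits 4-3
    # advertise color encodings; 0b11 = RGB 4:4:4 + YCbCr 4:4:4 + YCbCr 4:2:2
    edid[24] |= 0b00011000
    # Base block checksum: all 128 bytes must sum to 0 mod 256
    edid[127] = (-sum(edid[:127])) % 256
    return edid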

Does GPU-Z show the proper link rate (8.1 Gbps for 5K 8bpc, 5.4 Gbps for 4K 10bpc) and lanes (4 lanes)?

I wasted a lot of time and feel very stupid for not knowing why the Pro Display didn't pass through its EDID when connected to the 1080 Ti, but did when connected to the motherboard's integrated graphics.
Did you try all the ports? Switch the ports used by the Dell and XDR displays? If you had a Hackintosh (running High Sierra because Nvidia cards newer than Kepler don't work with later macOS), I wonder if the EDID would come through? What about Linux? BIOS settings (disable CSM)?
 

Bradleyone

macrumors regular
Thank you for all your comments and tips and things to try!

They've helped get me to where I'm more or less content with the XDR monitor as a temp stand-in for a high res PC gaming monitor until my ASUS PG27UQX arrives, if that model is ever released (and people gave Apple a hard time for taking time to deliver on an announced monitor).

The Pro Display gives a great image for games in 4K, sharp as a tack even spread over 32 inches.

Why the 1080 Ti acted as a gatekeeper for the XDR and "intercepted" its EDID is still a mystery. Yes, I'd tried each of the 3 DisplayPorts (at first I thought NVD0000 might refer to port 1, but they were all the same). I do have a Hackintosh but it's complicated in that the Nvidia card is part of a watercooling loop and removing it to put in another machine would be a major operation. If I can find a spare M.2 I might install a Linux to see what it reads the monitor as.

But I really don't want to waste too much more time now that it's working and that it won't be a long term solution anyway. I'm putting it down to either (a) a bug on Nvidia's part where an unexpected DisplayPort->Thunderbolt hybrid monitor beast caught it off guard, or (b) a freak accident on my part where I'd pulled a cable at a crucial id'ing stage and somehow locked in a bad read?

[Attachment: 4C3B30CF-F8FB-445A-9AF3-AE399ACDFEDA.jpeg]
 

joevt

Contributor
Thank you for all your comments and tips and things to try!

They've helped get me to where I'm more or less content with the XDR monitor as a temp stand-in for a high res PC gaming monitor until my ASUS PG27UQX arrives, if that model is ever released (and people gave Apple a hard time for taking time to deliver on an announced monitor).

The Pro Display gives a great image for games in 4K, sharp as a tack even spread over 32 inches.

Why the 1080 Ti acted as a gatekeeper for the XDR and "intercepted" its EDID is still a mystery. Yes, I'd tried each of the 3 DisplayPorts (at first I thought NVD0000 might refer to port 1, but they were all the same). I do have a Hackintosh but it's complicated in that the Nvidia card is part of a watercooling loop and removing it to put in another machine would be a major operation. If I can find a spare M.2 I might install a Linux to see what it reads the monitor as.

But I really don't want to waste too much more time now that it's working and that it won't be a long term solution anyway. I'm putting it down to either (a) a bug on Nvidia's part where an unexpected DisplayPort->Thunderbolt hybrid monitor beast caught it off guard, or (b) a freak accident on my part where I'd pulled a cable at a crucial id'ing stage and somehow locked in a bad read?
Are there any options under "Output color format" other than "RGB"? I think Lyot only had RGB for 6K. So I don't expect to see YCbCr.

What's the max bit depth for 5K? I think it will be 8bpc. What's the link rate for 5K? I think it should be 8.1 Gbps.

How does the ASUS PG27UQX do 4K at 144Hz? Dual cable? Overclocking? Chroma subsampling? Or DSC? ASUS did dual-cable 144 Hz before; it didn't support G-Sync in dual-cable mode. DSC would reduce the number of compatible GPUs. Does overclocking just mean reduced blanking? With no blanking, 3840x2160 x 144Hz x 24bpp = 28.67 Gbps, which is more than the HBR3 max of 25.92 Gbps, so that can't be it. I don't think you can output a higher link rate than HBR3 8.1 Gbps, can you? I saw a YouTube video that showed 3840x2160@144Hz 10bpc YCbCr 4:2:2 (average 20bpp), which means the answer is chroma subsampling (plus a minor amount of extra reduced blanking, since CVT-RB2 has too much blanking for 4K 144Hz 20bpp using HBR3 with no DSC).
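Spelling that arithmetic out (zero-blanking lower bounds, so real timings need slightly more):

Code:
def min_gbps(w, h, hz, bpp):
    # zero-blanking lower bound on the required link payload
    return w * h * hz * bpp / 1e9

hbr3_x4 = 8.1 * 4 * 0.8               # 25.92 Gbps usable after 8b/10b coding
print(min_gbps(3840, 2160, 144, 24))  # ~28.67: RGB 8bpc can't fit, even with no blanking
print(min_gbps(3840, 2160, 144, 20))  # ~23.89: YCbCr 4:2:2 10bpc fits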

I'm still curious if the XDR could support YCbCr with or without chroma subsampling. I could make an EDID for someone to try.
 

Bradleyone

macrumors regular
Are there any options under "Output color format" other than "RGB"? I think Lyot only had RGB for 6K. So I don't expect to see YCbCr.

What's the max bit depth for 5K? I think it will be 8bpc. What's the link rate for 5K? I think it should be 8.1 Gbps.

On this setup, the link rate maxes out at 5.4 Gbps and all capabilities are reduced to fit. The bit depth becomes 6 bpc and the output color format entirely disappears from the Nvidia box when I set it to 5K.

[Attachment: 824B044E-B3BA-485B-AA82-C0A611A738BB.jpeg]


How does the ASUS PG27UQX do 4K at 144Hz? Dual cable? Overclocking? Chroma subsampling? Or DSC?

The PG27UQX, when announced last July, was meant to be a slightly enhanced version of the PG27UQ, with slightly more FALD zones (same as the Pro Display XDR I believe) and a redesigned fan to reduce noise. However, it's been so long I wonder if they're waiting to include more.

Anyway, the PG27UQ does 4K at 120Hz, can be overclocked to 144, via a single DisplayPort 1.4 cable.

SDR:
98Hz, RGB (4:4:4), 10-bit color depth
120Hz, RGB (4:4:4), 8-bit color depth
144Hz, YCbCr 4:2:2, 8-bit color depth

HDR:
98Hz, RGB (4:4:4), 10-bit color depth
120Hz, YCbCr 4:2:2, 10-bit color depth
144Hz, YCbCr 4:2:2, 10-bit color depth

I'm still curious if the XDR could support YCbCr with or without chroma subsampling. I could make an EDID for someone to try.

I could try it on a Mac Pro + Radeon Vega II + Pro Display XDR if you "remind me" how to get the EDID into a fresh install of Catalina and test the outcome. (I last messed around with EDIDs trying to get an early Dell 4K to work under Mavericks on a Hackintosh in 2014.)
 

joevt

Contributor
On this setup, the link rate maxes out at 5.4 Gbps and all capabilities are reduced to fit. The bit depth becomes 6 bpc and the output color format entirely disappears from the Nvidia box when I set it to 5K (only RGB was available in 4K).
These are all the timings that the display should support:
Code:
2560x1440@59.999Hz 89.818kHz 237.12MHz  h(8 32 40 +)  v(43 8 6 -)

3840x2160@47.952Hz 134.696kHz 528.01MHz  h(8 32 40 +)  v(59 8 582 -)
3840x2160@47.999Hz 134.686kHz 527.97MHz  h(8 32 40 +)  v(56 8 582 -)
3840x2160@50.000Hz 134.699kHz 528.02MHz  h(8 32 40 +)  v(8 8 518 -)
3840x2160@59.939Hz 134.684kHz 527.96MHz  h(8 32 40 +)  v(9 8 70 -)
3840x2160@60.000Hz 134.699kHz 528.02MHz  h(8 32 40 +)  v(7 8 70 -)

5120x2880@47.952Hz 179.531kHz 933.56MHz  h(8 32 40 +)  v(850 8 6 -) 16:9
5120x2880@48.000Hz 179.567kHz 933.75MHz  h(8 32 40 +)  v(847 8 6 -) 16:9
5120x2880@50.000Hz 179.550kHz 933.66MHz  h(8 32 40 +)  v(697 8 6 -) 16:9
5120x2880@59.940Hz 179.519kHz 933.50MHz  h(8 32 40 +)  v(101 8 6 -) 16:9
5120x2880@60.000Hz 179.579kHz 933.81MHz  h(8 32 40 +)  v(99 8 6 -) 16:9 preferred

6016x3384@47.952Hz 210.940kHz 1285.89MHz  h(8 32 40 +)  v(1001 8 6 -) 16:9
6016x3384@48.000Hz 210.960kHz 1286.01MHz  h(8 32 40 +)  v(997 8 6 -) 16:9
6016x3384@50.000Hz 210.950kHz 1285.95MHz  h(8 32 40 +)  v(821 8 6 -) 16:9
6016x3384@59.940Hz 210.928kHz 1285.82MHz  h(8 32 40 +)  v(121 8 6 -) 16:9
6016x3384@60.000Hz 210.960kHz 1286.01MHz  h(8 32 40 +)  v(118 8 6 -) 16:9 preferred
933 MHz for 5K 60Hz RGB 6bpc is possible for HBR2 (the max pixel clock for this pixel format and link rate is 960 MHz).
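To check any of those timings against a link rate, a quick sketch using the h/v numbers above (front porch, sync width, back porch):

Code:
def pixel_clock_mhz(hact, hfp, hsync, hbp, vact, vfp, vsync, vbp, hz):
    # total = active + front porch + sync + back porch, for each axis
    return (hact + hfp + hsync + hbp) * (vact + vfp + vsync + vbp) * hz / 1e6

# 5120x2880@60 from the list above: h(8 32 40), v(99 8 6)
print(pixel_clock_mhz(5120, 8, 32, 40, 2880, 99, 8, 6, 60))  # ~933.8 MHz

# Max pixel clock a 4-lane HBR2 link can carry at 18 bpp (6 bpc RGB)
print(5.4e9 * 4 * 0.8 / 18 / 1e6)  # 960 MHz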

You might be the first to show a single cable link rate limit of HBR2 for the XDR display. I would like to see an AGDCDiagnose output with the DisplayPort DPCD registers showing this limit. That would come from macOS using a GPU that doesn't support DSC, preferably a GPU that doesn't have Thunderbolt ports (though I would like to see output for a Thunderbolt 3 connection as well).

Does your 1080 Ti have the latest firmware? Has it worked at 8.1 Gbps with a DisplayPort 1.3 or 1.4 display before? This linked update might be just a UEFI firmware update which probably doesn't affect operation in Windows. But you might want to make sure driver and firmware are up to date anyway.

Anyway, the PG27UQ does 4K at 120Hz, can be overclocked to 144, via a single DisplayPort 1.4 cable.

SDR:
98Hz, RGB (4:4:4), 10-bit color depth
120Hz, RGB (4:4:4), 8-bit color depth
144Hz, YCbCr 4:2:2, 8-bit color depth

HDR:
98Hz, RGB (4:4:4), 10-bit color depth
120Hz, YCbCr 4:2:2, 10-bit color depth
144Hz, YCbCr 4:2:2, 10-bit color depth
If it can do 144Hz YCbCr422 10bpc HDR then it should be able to do the same for SDR?

I could try it on a Mac Pro + Radeon Vega II + Pro Display XDR if you "remind me" how to get the EDID into a fresh install of Catalina and test the outcome. (I last messed around with EDIDs trying to get an early Dell 4K to work under Mavericks on a Hackintosh in 2014.)
It would involve modifying an override file at: /System/Library/Displays/Contents/Resources/Overrides/DisplayVendorID-610/DisplayProductID-ae2e
But it's more complicated than that because it has a corresponding mtdd file.

Anyway, macOS doesn't have an option to change Output Color Format. I suppose you can tell it's working in macOS if you can get higher timings working with a single cable than you would expect from RGB or 4:4:4.

To force a single-link connection with a Vega II, you need to use a dock or combine the Moshi cable with a USB-C to DisplayPort adapter (one supporting HBR3, such as the Cable Matters adapter).

With the 1080 Ti in Windows, you already know how to override the EDID, and the NVIDIA Control Panel has options to change the color format and depth. That would be the best way to test chroma sub sampling support.

I'll make an EDID anyway.
 

Bradleyone

macrumors regular
I ran the Nvidia firmware updater and it did indeed find it needed to update, which I let it do. There didn't seem to be any change in Windows afterwards; the link rate is still capped at 5.4 Gbps.

I'll see how viable turning this PC into a Hackintosh would be, to run those AGDCDiagnose tools.

I have started gaming on the Pro Display XDR in 4K 10-bit and the colors are deliciously rich, even in the 10 year old game engine of the MMO I play. It's a very immersive monitor, even for a task outside its design spec.

If you think this crotchety 1080 Ti setup would be useful for the custom EDID, feel free.
 