
mattspace

macrumors 68040
Original poster
Jun 5, 2013
3,161
2,865
Australia
Just checking to see how anyone using a W6600 is finding it, as I toy with ideas for setting up a new project machine to experiment with OpenCore and Monterey, and look for a reliable card with at least 3x DP outputs.

8GB VRAM, same as my RX 580, seems to be better performance on paper, and it only requires a single 6-pin power connector... it's PCIe 4.0 x8, but the cMP slot is PCIe 2.0 x16, so equivalent to 3.0 x8, or 4.0 x4... so I assume the card is going to be slot constrained?

Anyone who has one running, and would like to chip in with how it's been?
 

h9826790

macrumors P6
Apr 3, 2014
16,614
8,546
Hong Kong
Just checking to see how anyone using a W6600 is finding it, as I toy with ideas for setting up a new project machine to experiment with OpenCore and Monterey, and look for a reliable card with at least 3x DP outputs.

8GB VRAM, same as my RX 580, seems to be better performance on paper, and it only requires a single 6-pin power connector... it's PCIe 4.0 x8, but the cMP slot is PCIe 2.0 x16, so equivalent to 3.0 x8, or 4.0 x4... so I assume the card is going to be slot constrained?

Anyone who has one running, and would like to chip in with how it's been?
The cMP only has PCIe 2.0 x16 and 2.0 x4 slots. No 3.0 x8 or 4.0 x4.

If you install a PCIe 4.0 x8 card into the 2.0 x16 slot, the end result is that the card will link at PCIe 2.0 x8. That's it.
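For anyone who wants the raw numbers, here's a back-of-the-envelope sketch of why that negotiation matters. The per-lane figures are the approximate usable rates from the PCIe specs (after encoding overhead); the function is just an illustration:

```python
# Approximate usable bandwidth per lane in GB/s, one direction,
# after encoding overhead (8b/10b for PCIe 1.x/2.0, 128b/130b for 3.0+).
PER_LANE_GBPS = {1: 0.25, 2: 0.5, 3: 0.985, 4: 1.969}

def pcie_bandwidth(gen: int, lanes: int) -> float:
    """One-direction link bandwidth in GB/s for a given PCIe generation and lane count."""
    return PER_LANE_GBPS[gen] * lanes

# A PCIe 4.0 x8 card (like the W6600) in a cMP PCIe 2.0 x16 slot
# links at 2.0 x8: both ends fall back to the lowest common
# generation, and the card only has 8 lanes wired.
native = pcie_bandwidth(4, 8)   # ~15.8 GB/s
in_cmp = pcie_bandwidth(2, 8)   # ~4.0 GB/s
print(f"native: {native:.1f} GB/s, in cMP: {in_cmp:.1f} GB/s")
```

So the card keeps roughly a quarter of its native link bandwidth in the cMP; lane count and generation never "trade off" against each other without a PCIe switch in between.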
 

mattspace

macrumors 68040
Original poster
Jun 5, 2013
3,161
2,865
Australia
The cMP only has PCIe 2.0 x16 and 2.0 x4 slots. No 3.0 x8 or 4.0 x4.

If you install a PCIe 4.0 x8 card into the 2.0 x16 slot, the end result is that the card will link at PCIe 2.0 x8. That's it.
Ahh OK, I misunderstood - I assumed it was an expression of the overall bandwidth, so a 3.0 x8 card won't provide the same throughput, or lane up to x16 when connecting to the lower PCIe spec, given they're all x16 mechanical.

*sigh* well that rather rains on an otherwise fine plan. 😢
 

mode11

macrumors 65816
Jul 14, 2015
1,309
972
London
I'm running a 6600XT in my machine, which is presumably similar?

From what I understand, slot bandwidth has a marginal effect on GPU performance, especially with a midrange card like this, and being driven by slow CPUs. I wouldn't worry about it. These cards are supposed to be 2x the performance of an RX580, and use about 50W less power. Haven't benchmarked it, but it's been reliable and IIRC has better performance in Unreal Editor than the 580. Does need the BIOS flashing in a more modern PC first though, using Syncretic's script.
 
  • Like
Reactions: mattspace

h9826790

macrumors P6
Apr 3, 2014
16,614
8,546
Hong Kong
Ahh OK, I misunderstood - I assumed it was an expression of the overall bandwidth, so a 3.0 x8 card won't provide the same throughput, or lane up to x16 when connecting to the lower PCIe spec, given they're all x16 mechanical.

*sigh* well that rather rains on an otherwise fine plan. 😢
PCIe 2.0 x16 has the same bandwidth as PCIe 3.0 x8. However, a PCIe switch would be required in the connection to do the conversion.

And there is no such switch between the W6600 and the slot. Therefore, PCIe 2.0 x8 is what you will get.

But as mode11 said, that only limits the connection bandwidth, not necessarily the GPU performance. If a large amount of data needs to be sent back and forth to the card, then of course the limited bandwidth is an issue.

However, if your workload is more GPU compute limited, then maybe the performance penalty isn't that noticeable.
 
  • Like
Reactions: mattspace

mattspace

macrumors 68040
Original poster
Jun 5, 2013
3,161
2,865
Australia
Thanks folks - the workload is running multiple (3-4) displays in 10 bit colour for photo & graphics work, so I'm not sure where that fits in the "using lots of bandwidth" stakes.

for one moment I looked at the 6700 series, but of course they're not supported...

It really seems like the choke point is Monterey won't get any newer GPU support, and GPUs Monterey supports have gone / will go out of production. Becoming unsure as to where to turn... a 6800 with Pixlas mod?
 

mattspace

macrumors 68040
Original poster
Jun 5, 2013
3,161
2,865
Australia
Just while I think about it... a PCIe 2.0 x8 card is going to have the same bandwidth available as a card in a TB3 eGPU, isn't it (PCIe 3.0 x4)?
 

mode11

macrumors 65816
Jul 14, 2015
1,309
972
London
Thanks folks - the workload is running multiple (3-4) displays in 10 bit colour for photo & graphics work, so I'm not sure where that fits in the "using lots of bandwidth" stakes.

for one moment I looked at the 6700 series, but of course they're not supported...

It really seems like the choke point is Monterey won't get any newer GPU support, and GPUs Monterey supports have gone / will go out of production. Becoming unsure as to where to turn... a 6800 with Pixlas mod?
6800 doesn’t need Pixlas - only the XT version does.
 

mode11

macrumors 65816
Jul 14, 2015
1,309
972
London
I believe it’s safe, though good idea to use an EVGA PowerLink to make sure.

Others will be able to confirm, but pretty sure the RX6800 doesn’t need Pixlas. I was surprised myself. The XT and 6900 do, though.

Edit: MacVidCards say it's safe: https://www.macvidcards.eu/products/amd-radeon-rx-6800-16-gb-gddr6

"The fastest ever graphics card for Mac Pro 4,1/5,1 exclusively from MacVidCards Europe."

"Works on internal power using two 8-pin PCIe connectors. No power supply modifications or SATA power needed. Real power usage is about 225W well within Mac Pro power limits."

YT video discussing all this:
 
Last edited:
  • Like
Reactions: Gustav Holdoff

h9826790

macrumors P6
Apr 3, 2014
16,614
8,546
Hong Kong
Just while I think about it... a PCIe 2.0 x8 card is going to have the same bandwidth available as a card in a TB3 eGPU, isn't it (PCIe 3.0 x4)?
That's correct indeed. And this is one of the reasons why an eGPU can't quite perform the same as an internal dGPU.
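As a back-of-the-envelope check, using the same approximate per-lane figures as the PCIe specs, and assuming TB3 really does expose a full PCIe 3.0 x4 link to the GPU:

```python
# One-direction bandwidth in GB/s, after encoding overhead.
cmp_slot = 0.5 * 8     # PCIe 2.0 x8 in the cMP's x16 slot: ~4.0 GB/s
tb3_egpu = 0.985 * 4   # PCIe 3.0 x4 tunnelled over TB3:    ~3.9 GB/s
print(cmp_slot, tb3_egpu)
```

In practice a TB3 eGPU usually does a bit worse than this, since the controller adds latency and the PCIe tunnel shares the Thunderbolt link with other traffic.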

Thanks folks - the workload is running multiple (3-4) displays in 10 bit colour for photo & graphics work, so I'm not sure where that fits in the "using lots of bandwidth" stakes.

for one moment I looked at the 6700 series, but of course they're not supported...

It really seems like the choke point is Monterey won't get any newer GPU support, and GPUs Monterey supports have gone / will go out of production. Becoming unsure as to where to turn... a 6800 with Pixlas mod?
If it's just driving the monitors, that should be a piece of cake. But I am not sure how demanding your graphics work is. It's really hard to tell if you will have a good experience until you try it.

For the 6800, of course, the dual 8-pin looks scary. However, it's the actual power draw pattern that determines whether the Pixlas mod is required.

e.g. my Radeon VII is also a dual 8-pin card, but if I connect the cables correctly and limit its max power draw, I can power it from the mini 6-pins alone, even running FurMark.

And since the 6800's power draw (for versions that aren't heavily factory overclocked) is quite a bit lower than the Radeon VII's, by default it can be powered by the mini 6-pins alone, as long as you balance the power draw between the 8-pins (e.g. via an EVGA PowerLink, as mode11 mentioned).

That doesn't mean the power draw will always stay within the official 75W limit for each mini 6-pin. However, from a practical point of view, it has proven to work without any trouble.
 
  • Like
Reactions: mattspace

mattspace

macrumors 68040
Original poster
Jun 5, 2013
3,161
2,865
Australia
That's correct indeed. And this is one of the reasons why an eGPU can't quite perform the same as an internal dGPU.

Yeah, I was aware of that, but I guess my thoughts were: project Mac Pro vs. a final-gen Intel mini with an eGPU.

If it's just driving the monitors, that should be a piece of cake. But I am not sure how demanding your graphics work is. It's really hard to tell if you will have a good experience until you try it.

It seems to be coping with an RX 580, though that is on an x16 connection.

For the 6800, of course, the dual 8-pin looks scary. However, it's the actual power draw pattern that determines whether the Pixlas mod is required.

I have a spare power supply - I'm highly tempted to take it to a professional to do the version of Pixlas that solders the cables in at the motherboard connector, rather than using vampire taps.

e.g. my Radeon VII is also a dual 8-pin card, but if I connect the cables correctly and limit its max power draw, I can power it from the mini 6-pins alone, even running FurMark.

And since the 6800's power draw (for versions that aren't heavily factory overclocked) is quite a bit lower than the Radeon VII's, by default it can be powered by the mini 6-pins alone, as long as you balance the power draw between the 8-pins (e.g. via an EVGA PowerLink, as mode11 mentioned).

I've been looking at Sapphire-branded cards - the Pulse 580 I have seems to have been pretty stable, so I'm happy going with them again. What's the key figure to look at in the specs for determining the power use relative to the available potential power?

That doesn't mean the power draw will always stay within the official 75W limit for each mini 6-pin. However, from a practical point of view, it has proven to work without any trouble.

Good to know, I'll keep it in mind.
 

startergo

macrumors 601
Sep 20, 2018
4,786
2,190
Just checking to see how anyone using a W6600 is finding it, as I toy with ideas for setting up a new project machine to experiment with OpenCore and Monterey, and look for a reliable card with at least 3x DP outputs.

8GB VRAM, same as my rx580, seems to be better performance on paper, only requires a single 6pin power... PCI4x8... but the cMP slot is 2x16, so 3x8, or 4x4... so I assume the card is going to be slot constrained?

Anyone who has one running, and would like to chip in with how it's been?
I have it installed in my hack:
Radeon PRO W6600X:

Chipset Model: AMD Radeon PRO W6600X
Type: GPU
Bus: PCIe
Slot: Slot-3
PCIe Lane Width: x16
VRAM (Total): 8 GB
Vendor: AMD (0x1002)
Device ID: 0x73e3
Revision ID: 0x0000
ROM Revision: 113-D5330200-100
Metal Support: Metal 3
 

h9826790

macrumors P6
Apr 3, 2014
16,614
8,546
Hong Kong
I've been looking at Sapphire-branded cards - the Pulse 580 I have seems to have been pretty stable, so I'm happy going with them again. What's the key figure to look at in the specs for determining the power use relative to the available potential power?
After you install the card into the cMP, you can check the power draw (or current) of the PCIe slot, PCIe Booster A, and PCIe Booster B.

From the link in my last post, you can see how I analyzed the power draw of these three items.

Ideally, you want all of them to stay below 75W. In fact, the PCIe slot must stay within 75W; so far, there is no evidence that the slot can safely provide more than 75W for long periods of time.

But in most cases, the power draw isn't that balanced between the two 8-pins. Maybe one of them draws 100W while the other draws just 50W. That's why we recommend a "power balancing connection" (e.g. via an EVGA PowerLink, or simply joining the cables via a 6-pin / 8-pin connector).

Furthermore, make sure each booster's power draw is within 120W. TBH, this is very hard to find out: the SMC can only report up to 7.99A for each PCIe booster, so 12V x 8A = 96W. If you see anything stuck at 96W, you'd better stop the test, because that may mean the card is actually drawing 115W from a mini 6-pin, etc. That 120W is an estimate; users discovered that beyond it the SMC may command a hard shutdown, which is why we suggest not going anywhere near that number.

Considering you can only read up to 96W, as long as you keep the power draw below that, it should be safe. I have "overdrawn" the mini 6-pins (~90W) for many years with no damage. And if we assume Apple followed the PCIe standard, then the mini 6-pin should be able to handle way more than 75W; overdrawing a little from there shouldn't be a problem.

Anyway, if you go that route, please post whatever you observe (the most useful numbers are captured in your real workflow, under max stress on the GPU; if you can't do that, use benchmarks to simulate high demand), and I can help tell if there is any immediate danger.
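To make those rules of thumb concrete, here's a minimal sketch of the sanity check in code. The thresholds are the ones described above (75W slot limit, the 7.99A / ~96W SMC reporting cap, draw balanced between the 8-pins); the function itself is just an illustration, not an actual monitoring tool:

```python
SMC_CURRENT_CAP_A = 7.99   # SMC can't report booster current above this
RAIL_V = 12.0              # PCIe power is delivered on the 12V rail

def booster_watts(amps: float) -> float:
    """Convert an SMC booster current reading to watts."""
    return RAIL_V * amps

def power_check(slot_w: float, booster_a_amps: float, booster_b_amps: float) -> str:
    """Interpret cMP power readings per the guidelines above."""
    if slot_w > 75:
        return "STOP: slot over 75W"
    for amps in (booster_a_amps, booster_b_amps):
        if amps >= SMC_CURRENT_CAP_A:
            # Reading is pegged at ~96W; real draw could be 115W+.
            return "STOP: booster reading pegged at the ~96W cap"
    if abs(booster_a_amps - booster_b_amps) > 2.0:
        return "warning: unbalanced 8-pin draw, consider an EVGA PowerLink"
    return "ok"
```

So a run showing 60W at the slot and roughly 6A (~72W) on each booster would pass, while anything pegged at 7.99A should end the test immediately.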
 

pattielipp

macrumors member
Apr 2, 2014
30
15
Charleston, South Carolina
I've been running a W5700 for a while now. It works without issue. I am running Pixlas though. Side note: the USB-C port works for display out and/or USB, including passing through USB 2.0 when being used for display.
 

mattspace

macrumors 68040
Original poster
Jun 5, 2013
3,161
2,865
Australia
I've been running a W5700 for a while now. It works without issue. I am running Pixlas though. Side note: the USB-C port works for display out and/or USB, including passing through USB 2.0 when being used for display.

That's interesting - are you seeing the same issues @Gustav Holdoff reported with unexplained kernel tasks?

I'm curious, do displays with USB-C inputs connected to that USB-C port do the disconnecting from the system thing (and its associated screwing up of application windows) that DisplayPort does?
 

Gustav Holdoff

macrumors regular
Oct 23, 2020
187
77
It is likely that the behaviour of the W5700 is down to something specific to my hardware - probably a defect of my particular AMD Radeon Pro W5700. I no longer pay attention to this kernel task; it degrades the CPU Geekbench score and slightly increases the CPU temperature. And yes, this process only occurs when I use the computer with the W5700 - it disappears if I install any other GPU: RX 570, RX 580, W5500 or W6800.

Startergo tried to help me fix this bug and wrote some SSDTs for me, and I even managed to get one day of work without the kernel task - fps when working with Twinmotion improved, and Geekbench became 10% better, as it should be. But after the computer was switched off for the night, the kernel task appeared again in the morning. Probably the multiple NVRAM and SMC resets gave some kind of temporary result.

Regarding the performance difference: the W5500 is slightly better than the 580, at only 130W consumption (but you can't work in Mojave - or maybe I just didn't set up the config deeply enough; I don't need Mojave anyway - only as a rescue system). And with the W5500 you can put a RAID adapter for 4x NVMe SSDs into the second x16 slot. If you put in a W5700 or W6800, that adapter will block the airflow cooling the GPU - in that case I used a special 30cm riser cable to connect the adapter so that it does not interfere with the GPU fan.

The 6xxx cards obviously have advantages over the 5xxx, at least because they have path tracing enabled in UE and Twinmotion for Windows. And all the 5xxx cards have 8GB, while modern minimum requirements are already 12 or even 16GB; the W6800 with 32GB is too expensive though. There is not much difference for working in macOS, since macOS does not use all the resources of the GPU anyway (at least in my case).
 

Attachments

  • IMG_4843.jpg
  • IMG_4844.jpg
Last edited:
  • Like
Reactions: mattspace

avro707

macrumors 68000
Dec 13, 2010
1,754
971
What’s the power consumption of the W6800? Definitely requires pixlas mod?

I’ve also seen the W6600 for sale locally, but no X version; the X is the Apple MPX module version, as far as I know.
 

badferday

macrumors newbie
Apr 16, 2024
1
0
I have it installed in my hack:

This is of interest to me, as I have two Radeon Pro W6600 cards (non-XTs), and I have not been able to get either one working on my cMP 5,1 (which is a flashed 4,1).

Did you have to do anything to get yours working? When I have one installed (haven't even bothered trying with both), I don't even get to the happy Mac chime.

Some more information: I tried with my RX560 test card (which works alone no problems) and one W6600, and I got locked in a happy Mac chime reboot cycle. At least the computer sounded extremely happy. Haha.

This is the exact model of my card: https://www.amd.com/en/products/professional-graphics/amd-radeon-pro-w6600

Both cards are verified working in my PC. I got the boot screen with both, so they def work. I didn't connect either to a monitor, but I figure that isn't needed.

Any thoughts on this?
 
Last edited:

startergo

macrumors 601
Sep 20, 2018
4,786
2,190
This is of interest to me, as I have two Radeon Pro W6600 cards (non-XTs), and I have not been able to get either one working on my cMP 5,1 (which is a flashed 4,1).

Did you have to do anything to get yours working? When I have one installed (haven't even bothered trying with both), I don't even get to the happy Mac chime.

Some more information: I tried with my RX560 test card (which works alone no problems) and one W6600, and I got locked in a happy Mac chime reboot cycle. At least the computer sounded extremely happy. Haha.

This is the exact model of my card: https://www.amd.com/en/products/professional-graphics/amd-radeon-pro-w6600

Both cards are verified working in my PC. I got the boot screen with both, so they def work. I didn't connect either to a monitor, but I figure that isn't needed.

Any thoughts on this?
You need to flash the card's vBIOS with @Syncretic's patch to even be able to boot on the cMP. For this you need a PC, or a Thunderbolt-to-PCIe expansion chassis (plus a PC or a MacBook Pro with Boot Camp), to flash the vBIOS.
 