
vir2l2k

macrumors member
Original poster
Apr 2, 2019
60
48
Will Nvidia GPUs, like the Radeon VII, work with the Mac Pro, or is there just the ridiculous 580X vs. the even more ridiculous Vega II?
 

SecuritySteve

macrumors 6502a
Jul 6, 2017
943
1,070
California
NVIDIA drivers are still not out for Catalina, which will be the OS that ships with the new Mac Pro. It is possible you will never see NVIDIA on the Mac again until their squabble with Apple is settled.
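
(A rough way to check this yourself, assuming you're on macOS where `kextstat` lists loaded kernel extensions: look for any NVIDIA kexts. On Mojave/Catalina without a web driver, the filtered list should come back empty.)

```python
# Rough sketch, assuming macOS (where `kextstat` lists loaded kernel extensions).
# Filters for NVIDIA kexts; with no web driver installed on Mojave/Catalina,
# the filtered list is expected to be empty.
import subprocess

out = subprocess.run(["kextstat"], capture_output=True, text=True).stdout
nvidia = [line for line in out.splitlines() if "nvidia" in line.lower()]
print("\n".join(nvidia) if nvidia else "No NVIDIA kexts loaded.")
```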
 

h9826790

macrumors P6
Apr 3, 2014
16,617
8,549
Hong Kong
Macs do not support Nvidia GPUs AT ALL because Apple blocked the Nvidia drivers.

I am quite sure the GTX 680 and K5000, etc., can still work on the 7,1.

Of course, that's not what we want. However, some Nvidia GPUs are supported. At least, the drivers still exist in the latest Catalina.

But CUDA 100% won't work on the 7,1 in macOS. There is definitely no CUDA driver available for the 7,1 (again, in macOS only).
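
(If you want to see this for yourself, here's a minimal sketch; it assumes PyTorch is installed, purely as a convenient CUDA-aware library. With no CUDA driver/runtime on macOS, it should report CUDA as unavailable.)

```python
# Minimal sketch, assuming PyTorch is installed (used only as a convenient
# CUDA-aware library). With no CUDA driver/runtime on macOS, this is expected
# to report that CUDA is unavailable.
import torch

if torch.cuda.is_available():
    print(f"CUDA devices visible: {torch.cuda.device_count()}")
else:
    print("No CUDA support on this platform.")
```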
 

Coyote2006

macrumors 6502a
Apr 16, 2006
512
233
I just wonder why there are NO GPUs on the market with TB3 output. Is there any reason for that? Might it be that Nvidia wanted to release them and Apple blocked this by ending Nvidia support on macOS?

If there were TB3 GPUs on the market, the 5,1s out there would get another boost to survive some more years.
 

deconstruct60

macrumors G5
Mar 10, 2009
12,368
3,936
My impression, unfortunately, is that the OP is exactly right...

The truth is most likely somewhere in the middle. Apple has some rules for graphics/computation drivers/hardware that Nvidia doesn't follow, so Apple doesn't sign the drivers (just like when a developer breaks an App Store rule, Apple doesn't sign the app either).

The two camps are split. Nvidia points fingers at Apple, stating that they wrote drivers they think will work but Apple won't sign them. Nvidia also engages in a public finger-pointing exercise to frame it as "Apple's fault" (let the social media army wear Apple down).

On the other side, Apple isn't budging on whatever the rule is (e.g., Metal not being a second-class citizen, kernel security restrictions... whatever it is, something tied to Apple's key strategic directions). Nvidia publicly pointing fingers only digs a deeper hole in that context, especially if it is an overall Apple strategic objective (not simply a Mac Pro product one).

Add-in GPU cards (eGPU or "in box") are not a primary driver of the Mac market (let alone of Apple overall), and Mac add-in cards are not a primary driver for Nvidia either. Two companies, both flush with cash, are in a shoving match over a submarket that isn't strategic for either. Nvidia is probably not a boy scout helping old ladies across the road here (they have a track record of "embrace, extend, extinguish"-style tactics and screw-your-partner moves). Apple is also probably shifting substantial risk and/or workload onto Nvidia.

Intel looking to become a significant player in the discrete GPU market means Apple won't be left without choices for a second dGPU vendor in design bake-off competitions. Apple getting into the GPU business themselves reduces that "have to deal with Nvidia" pressure even more. Apple is going to have options even if it doesn't sign Nvidia drivers anymore.
 
  • Like
Reactions: AMP12345 and n0-0ne

Pro7913

Cancelled
Sep 28, 2019
345
102
That's very unlikely to be right.

Then do you really think that Nvidia stopped supporting macOS? Of course not. Only Apple can block the Nvidia driver from working with macOS.
I am quite sure the GTX 680 and K5000, etc., can still work on the 7,1.

Of course, that's not what we want. However, some Nvidia GPUs are supported. At least, the drivers still exist in the latest Catalina.

But CUDA 100% won't work on the 7,1 in macOS. There is definitely no CUDA driver available for the 7,1 (again, in macOS only).

I highly doubt that anyone wants to use an old Nvidia GPU on the Mac Pro 2019.
 

deconstruct60

macrumors G5
Mar 10, 2009
12,368
3,936
I just wonder why there are NO GPUs on the market with TB3 output. Is there any reason for that?

While Thunderbolt was primarily controlled by Intel, and Intel had a dominant portion of the overall GPU market (extremely skewed to the iGPU side, but quite high in terms of numbers/volume), the smaller players really didn't want to give Intel more leverage. AMD engaged in a couple of "anti-Thunderbolt" campaigns, so there was about zero chance they'd add it to their base reference designs. Nvidia was basically in the same boat (their smartphone-focused GPUs really had about zero need for Thunderbolt).


Additionally, it also raised the bandwidth requirements to the card (e.g., pragmatically you would need to trim some x4 subset of PCIe off the x8-x16 feed to the card).

Finally, there was the rigid dogma that add-in cards had to be constrained to 1980s-vintage connector physical form factors.

Three reversals are coming over the next several years.

1. More bandwidth to the card: PCIe v4/v5 leaves options for more to share.

2. Much of the Thunderbolt infrastructure has been handed off to the USB-IF, so Intel (and Apple) isn't the only primary driver. (USB4 is coming. Probably not for mainstream discount cards, but wired headsets would get a broader standard, and broader iGPU adoption eventually.)

3. DisplayPort picked up the infrastructure to create a close derivative:
https://vesa.org/press/vesa-publish...-for-4k-hdr-and-virtual-reality-applications/


Might it be that Nvidia wanted to release them and Apple blocked this by ending Nvidia support on macOS?

ROTFLMAO... errr, no. Nvidia was about last on most of the DisplayPort iterations. They aren't trying to do it all. They were behind a semi-proprietary link called VirtualLink that also 'forked' off of the USB Type-C physical connector, all while the work that Intel and Apple primarily did for the USB-IF was already plainly underway.

Nvidia isn't ahead of the TB-on-card curve here at all. They've been busy throwing curve balls at it.
[ Which, again, isn't going to get them a friendly "best partner of the year" Christmas card from Apple. It is more so just digging a deeper hole. ]




If there were TB3 GPUs on the market, the 5,1s out there would get another boost to survive some more years.

Outside of cute hacks that "happen to work" as long as you don't blow on them too hard, the 5,1 lacks the basic support for the three inputs a fully supported TB add-in card needs (GPIO, PCIe, and Display).

Booting into Windows and then into macOS isn't something Apple is going to sign off drivers for. Nor is stuffing the card in but never generating any hot plug/unplug (TB network reconfiguration) events... Apple isn't going to sign off on those either. "As long as you don't blow on it too hard" hacks aren't likely to get elevated to supported configurations.
 
  • Like
Reactions: Coyote2006

tommy chen

macrumors 6502a
Oct 1, 2018
907
388
Then do you really think that Nvidia stopped supporting macOS? Of course not. Only Apple can block the Nvidia driver from working with macOS.

NVIDIA has a full Apple master certificate for its drivers!
Without it, they couldn't update the drivers for Sierra and High Sierra.

For Mojave and up, the drivers would have to be completely rewritten,

and that is too much work for NVIDIA given the low sales on macOS.

My 2 cents!
 

bsbeamer

macrumors 601
Sep 19, 2012
4,311
2,704
Wait for MP7,1 release. I'm sure someone will throw the I/O card from MP7,1 into an MP5,1 to see what is and isn't possible.
 
  • Like
Reactions: H2SO4

Pro7913

Cancelled
Sep 28, 2019
345
102
NVIDIA has a full Apple master certificate for its drivers!
Without it, they couldn't update the drivers for Sierra and High Sierra.

For Mojave and up, the drivers would have to be completely rewritten,

and that is too much work for NVIDIA given the low sales on macOS.

My 2 cents!

They already made a driver for macOS Mojave, and yet they didn't release it because of Apple.
 

goMac

Contributor
Apr 15, 2004
7,662
1,694
Wait for MP7,1 release. I'm sure someone will throw the I/O card from MP7,1 into an MP5,1 to see what is and isn't possible.

The IO card on the 2019 Mac Pro looks to have a custom pin layout, so good luck with that.

There are pins along the edge of the card slot.

It's likely so things like DisplayPort can be routed onto the card.

That's also why multiple IO cards don't seem to be an option. Slot 1 and the IO card are a special pair.
 
  • Like
Reactions: OkiRun