
kvic

macrumors 6502a
Sep 10, 2015
516
459
It's no longer the 1980s or 1990s. Hardware drivers, especially for high-profile hardware like GPUs, are designed to be very portable across different ISAs. Radeon Pros have been running on ARM64 PCs for a couple of years at least.

At the end of the day, it's that Apple doesn't want AMD GPUs on its ARM Macs (yet).

[Image: eMAG ARM64 workstation — source: AnandTech review]
 

mattspace

macrumors 68040
Jun 5, 2013
3,179
2,879
Australia
It's no longer the 1980s or 1990s. Hardware drivers, especially for high-profile hardware like GPUs, are designed to be very portable across different ISAs. Radeon Pros have been running on ARM64 PCs for a couple of years at least.

Last I looked, AMD only offered driver downloads for x86. Are you saying the x86 driver package also works turnkey as an ARM driver?

*edit* From AMD's official posts on their support forums, an ARM driver is only available upon application to AMD, with details of the company, project, and revenue.

So, it's only custom drivers, on a case-by-case basis.

Which, again, could have AMD putting Apple over a barrel, facetiously suggesting that 30% of the retail price of the device their GPUs are used in would be a reasonable precedent for AMD to follow when pricing their driver IP. 😆
 
Last edited:

kvic

macrumors 6502a
Sep 10, 2015
516
459
Last I looked, AMD only offered driver downloads for x86. Are you saying the x86 driver package also works turnkey as an ARM driver?

The Linux driver can be compiled for both x86_64 and 64-bit ARM.

Take a step back. Why do AMD and Nvidia provide GPU driver downloads in the first place? It's a convention of the x86 Wintel PC world, in place for decades.

It's not that AMD won't provide driver downloads for ARM computers; there's simply no consumer demand for them yet. The better question is what Microsoft is going to do with Windows on ARM PCs.

I believe Microsoft learned a lesson from Apple. Microsoft wants tight control over Windows ARM devices. They also want to create a tightly guarded walled garden where they can milk an even larger user base more profitably than Apple's ecosystem.

So I believe that when Microsoft popularises ARM PCs (if that day ever comes), AMD won't be offering GPU driver downloads either. Instead, you'll get them from Microsoft Update.
 

Joe The Dragon

macrumors 65816
Jul 26, 2006
1,025
474
The Linux driver can be compiled for both x86_64 and 64-bit ARM.

Take a step back. Why do AMD and Nvidia provide GPU driver downloads in the first place? It's a convention of the x86 Wintel PC world, in place for decades.

It's not that AMD won't provide driver downloads for ARM computers; there's simply no consumer demand for them yet. The better question is what Microsoft is going to do with Windows on ARM PCs.

I believe Microsoft learned a lesson from Apple. Microsoft wants tight control over Windows ARM devices. They also want to create a tightly guarded walled garden where they can milk an even larger user base more profitably than Apple's ecosystem.

So I believe that when Microsoft popularises ARM PCs (if that day ever comes), AMD won't be offering GPU driver downloads either. Instead, you'll get them from Microsoft Update.
The app-store-only approach already failed on Windows, and they even gave up trying to force games to use it. Look, Microsoft's own games are on Steam.
Also, on Windows, trying to lock out Steam would end in antitrust issues.

And Apple's walled garden is going to be knocked down in the EU soon.
 
  • Like
Reactions: JMacHack

mattspace

macrumors 68040
Jun 5, 2013
3,179
2,879
Australia
It's not that AMD won't provide driver downloads for ARM computers; there's simply no consumer demand for them yet. The better question is what Microsoft is going to do with Windows on ARM PCs.

Exactly the same as they're doing with Intel. I suspect Microsoft will treat ARM exactly as they treat any other CPU: they might move some Surface products to ARM, they might make a non-gaming-focussed equivalent of the Xbox, or an unsubsidised one with multiple display connections, but right now Microsoft's clear strategy is that any device paradigm can become any other device paradigm, just by plugging in the appropriate peripherals.

Far from being "like Apple", I suspect you'll see them do the polar opposite of what Apple does. Apple expects you to buy a tablet and an iMac, with a janky user session state mediated via an internet connection. Microsoft expects you to buy a Surface and an eGPU / dock connected to (multiple) displays, and the same user session just moves to the bigger display area, even if the internet is out.

I believe Microsoft learned a lesson from Apple. Microsoft wants tight control over Windows ARM devices. They also want to create a tightly guarded walled garden where they can milk an even larger user base more profitably than Apple's ecosystem.

Walled gardens are yesterday's paradigm. The future of computing is coming from regulators and antitrust prosecutions, not companies. Computing, and the way lock-in and platform exclusivity are allowed to proceed, is going to be as tightly regulated and standardised as construction materials and architecture.

Microsoft is going to ride that future by being "any processor, any tech, any paradigm, any sales model you like, we've got a home for you" - they said it themselves: "the platform for platform creators". Apple has said it very clearly with their actions and (court-revealed) emails regarding 3rd party devs: "you owe us your existence, and you only exist to enrich and extend OUR platform".

Apple is now an isolated, technological hermit kingdom. Everything they base their business upon rests on the whim of regulatory authorities. They have never been more vulnerable, which is why they're diversifying into everything and anything BUT making and selling products. The future of their fundamental business model of lock-in is no longer in their hands.

So I believe that when Microsoft popularises ARM PCs (if that day ever comes), AMD won't be offering GPU driver downloads either. Instead, you'll get them from Microsoft Update.

Yes, it's possible Microsoft will make Microsoft-branded ARM PCs, but Windows for self-built and branded PCs will still exist, will still be the majority of the Windows market, and will have Intel, AMD and ARM processor options, and you'll be able to use your Game Pass Ultimate subscription and Office 365 on all of them. AMD will still be selling pro cards for Intel / AMD workstations, and probably doing ARM drivers on a company-by-company basis (perhaps even licencing them to motherboard makers directly).

Whether they prioritise doing so for Apple, or for a price Apple is willing to pay, will be an "enjoy the box of popcorn" moment.

But right now, AMD can tell Apple "no, we're not making you a driver" and Apple have no real way around that.
 
Last edited:

mattspace

macrumors 68040
Jun 5, 2013
3,179
2,879
Australia
The app-store-only approach already failed on Windows, and they even gave up trying to force games to use it. Look, Microsoft's own games are on Steam.
Also, on Windows, trying to lock out Steam would end in antitrust issues.

Their first attempt failed, but with the revamped one allowing any payment processor, devs keeping 100% of the sale price if they use a non-Microsoft payment processor, and the Xbox on Windows store allowing crossplay with Xbox games on Xbox (which Steam etc. versions sometimes don't), I suspect the Windows 11 app store is going to be yet another nail in Apple's business model coffin.

And Apple's walled garden is going to be knocked down in the EU soon.

Kinda makes you wonder how much of WWDC is actually going to end up being relevant.
 

deconstruct60

macrumors G5
Mar 10, 2009
12,309
3,902
too early but

Threadripper 3


About 3 years later than this post and ....

https://www.tomshardware.com/news/dell-workstation-threadripper-5000-pro


Which is another not-so-good sign for any Xeon W-3300 replacement coming. Both Lenovo and now Dell have basically skipped doing a Xeon W-3300 (Ice Lake) workstation offering. Boxx, Puget, and some other narrow, non-top-5 players are shipping the W-3300, but the majors are skipping it. Dell has been a pretty loyal Intel system builder in the workstation space, and even they are skipping the W-3300. That is a statement as to how "not so competitive" a W-3300 Mac Pro might be over its service lifecycle across the next 1-2 years.

So it is somewhat questionable why Apple would take it up at this point. There are low-level and firmware differences that don't make sense to take on now that we're past the two-year transition announcement point. If the end of x86_64 is in sight, then jumping onto a new "horse" at the last minute is a questionable allocation of resources.

Neither AMD nor Intel delivered a timely solution. If it had to go to volume a year ago, there would have been some merit in doing one last iteration. Intel and AMD pushed their die allocations into the server space (better profits), and this upper-end workstation space has drifted a bit. That probably gave whatever Apple Silicon solution Apple was working on the window it needed to slam the door shut on any substantive new foundational work on the legacy x86-64 side of the Mac Pro.
 

deconstruct60

macrumors G5
Mar 10, 2009
12,309
3,902
and what is the plan for the high-end workstation that needs multi-screen video? Using the TB bus can kill I/O for non-video needs.
People who may need multi-screen, more basic video out and don't want USB-based video (did Apple lock that out yet?)

The TB bus isn't a particularly relevant constraint. The DisplayPort signal goes into the TB controller on the DP bus.

The Ultra Studio has six TB ports. Even if you have to 'sacrifice' four of those solely for DP 1.4 DSC output, that still leaves two TBv4 ports for I/O.

"...
  • Support for up to four Pro Display XDRs (6K resolution at 60Hz and over a billion colors) over USB-C and one 4K display (4K resolution at 60Hz and over a billion colors) over HDMI
..."

If the new Mac Pro starts off with the Ultra Studio specs as the baseline and then moves up from there in display output (e.g., two HDMI 2.1 ports and a DisplayPort 2.0 port), how many folks are really going to be doomed by being limited to four 6K displays and three more 4-5K ones? Relatively few.
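As a rough sanity check on why DSC enters the picture for those 6K outputs at all, here's a back-of-the-envelope sketch (assuming the published Pro Display XDR resolution and DP 1.4 HBR3 link rates; blanking overhead ignored):

import Foundation

// Back-of-the-envelope numbers only (blanking overhead ignored):
let width = 6016.0, height = 3384.0   // Pro Display XDR, 6K
let refreshHz = 60.0
let bitsPerPixel = 30.0               // 10 bits per colour channel

let uncompressedGbps = width * height * refreshHz * bitsPerPixel / 1e9
// ~36.6 Gbps of raw pixel data

// DP 1.4 HBR3: 4 lanes x 8.1 Gbps raw, ~80% usable after 8b/10b coding
let dp14PayloadGbps = 4.0 * 8.1 * 0.8 // ~25.92 Gbps

print(String(format: "6K@60 needs ~%.1f Gbps, DP 1.4 carries ~%.2f Gbps",
             uncompressedGbps, dp14PayloadGbps))
// DSC's roughly 3:1 visually lossless compression is what closes that gap.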

The part that would be slacking in an M2/M3 Mac Pro SoC isn't the number of video ports out, but compliance of those ports with the current leading specifications. Three points.

A. HDMI 2.1a (stop being capped by whatever part Apple buys for the Apple TV 4K to provision HDMI)
B. DisplayPort 2.0 (going to be provisioned on AMD GPUs by the end of 2022; Apple should be in the mix also. If the Mac Pro is pragmatically going to be a 2023 product, not having DP 2.0 is slacking.)
C. Better DP multi-stream support (you don't necessarily need to blow all of DP 2.0's bandwidth on just one humongous screen. If you can daisy-chain 4-5K displays, you don't need as many output ports to drive multiple displays.)

It should be relatively easy to do if Apple isn't using exact twins of the MBP 16" dies to build the Ultra and the "double Ultra-class" part. Dial back the Thunderbolt on one of the dies (shouldn't need more than six, and could get by with four). In the space freed up by 2-4 TB controller complexes, provision PCI-e v4 and/or DisplayPort. 90+% of the rest of the die stays the same (for the twin and quad die packages).

A discrete card would let them work around their slacking standards compliance later, but why pay for a slacking card now? (Shipping with the 580 in late 2019 was kind of weak. Doing that again in 2022 would be setting a pattern of being weak.)

The trend line in monitors is heading toward fewer, bigger, higher-resolution monitors driven by one card, rather than some kind of discrete-card explosion to drive lots of medium-resolution monitors.


high end workstation with big GPU compute needs?

That doesn't necessarily need to come with a display stack attached to it. A member of the device driver "family" that plugs into PCI and provides just compute probably should be part of Apple's line-up. There is already an M-series family class for PCI devices. That would just be a more refined subclass that certain Metal compute jobs get sent to.

As long as there is a single x16 PCI-e v4 (or better) slot, you could couple a PCI-e breakout enclosure to a new Mac Pro and provision it with 2-3 triple-wide slots and another 1,400W of power. Would be pulling in MI250 and A

There are some apps that need the display out on the same card that does the compute, but there are others that don't. (For the latter, the Infinity Fabric links point toward distributed loads being OK for some workloads.)


Right now Apple sells the Mac Pro with a max GPU config of twin 64GB video cards. What is the plan to deal with workloads that really need a big RAM pool just for video plus a big system RAM pool?

This niche plays directly into the "Unified", uniform memory approach that Apple is on. In contrast to discrete GPU cards, the integrated GPU can use a huge fraction of main RAM. If the Ultra class goes to 192 (or 144) GB (16 x 12GB, or 16 x 9GB with ECC) and the quad-die goes to 384 (or 288) GB, then fitting 128GB "VRAM" workloads on those systems works just fine. That would be a net 100% increase in VRAM workload headroom. If a workload's VRAM model only grows by 16GB per year, the 64 to 128 progression takes 4 years.
If you're on a 4-5 year depreciation schedule, that isn't going to be a huge issue; you just buy another one on that cycle.

These abnormally large VRAM footprints are exactly the corner case that Apple is going to point to as showing that their approach works extremely well. It is not a huge upside for discrete GPU cards.
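For what it's worth, Metal exposes this directly; a minimal sketch (macOS Swift script, illustrative only):

import Foundation
import Metal

// Minimal sketch: ask Metal how much memory the GPU can realistically address.
// On Apple Silicon this reflects the unified memory pool rather than a fixed
// VRAM size, which is the point about large "VRAM" workloads above.
if let device = MTLCreateSystemDefaultDevice() {
    let gib = Double(device.recommendedMaxWorkingSetSize) / 1_073_741_824.0
    print("GPU: \(device.name)")
    print("Unified memory: \(device.hasUnifiedMemory)")
    print(String(format: "Recommended max working set: %.1f GiB", gib))
}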
 
  • Like
Reactions: JMacHack

deconstruct60

macrumors G5
Mar 10, 2009
12,309
3,902
6 ports, but how many buses? 3? 4? 6?
If only 3 buses, then that video can eat the non-video I/O bandwidth.

Pragmatically, six. The TB controller is built inside the SoC now. It is hard to even credibly call it a "TB bus" when it is completely inside the same silicon die as the other cores. There isn't a labeled die floorplan for the details of the Thunderbolt section, but it wouldn't be surprising if the DisplayPort engine is in the same cluster as the TB controller. The PCI-e "bus" portion of the TB controller cluster is pulling off the internal bus of the chip, and that PCI-e controller isn't delivering to anything other than that one Thunderbolt controller.

Apple's "TB bus" was a somewhat made-up term. The discrete TB controller chips took multiple inputs: a x4 PCI-e input, one (or in later generations two) DisplayPort inputs, and GPIO inputs. Clustering those three things together is what Apple slapped the "TB bus" label on. Once all of that is inside the die, that kind of clustering doesn't make much sense.


This is how Intel put the TB controllers into Gen 11 Tiger Lake SoCs several years ago, before Apple released anything of their own. From 2019:

[Image: Intel slide showing Thunderbolt integration in the Tiger Lake SoC — source: AnandTech, 2019]
That fabric (at the top) is the internal inter-component communication "bus" for the subsystems of the whole "CPU and more" die. Intel is sharing a DP controller across two TB controllers here, but if you built a DP engine per "TB cluster", you could have two such clusters above. The "TB clusters" are more accurately described as subsystems on the die hooked to the main inter-subsystem communication fabric. The "bus" is all internal to the subsystem, so the connotation of a discrete x-to-y bus doesn't really apply.


Apple's implementation might deviate slightly from that, but probably not by much. Apple's is a bit gimped on the plain Mx, since it can't do the four-DisplayPort-out part (and hence fails Thunderbolt 4 qualification). To get Thunderbolt 4 qualification you pretty much need to follow what Intel did: if you don't do essentially all of the above diagram, you won't meet Thunderbolt 4's mandatory requirements.

Thunderbolt has to have DisplayPort "pass-through". That has been a feature since the first iteration, when TB was "sharing" a port standard with DP (mini-DP). So yes, if you plug in a DP-only monitor, there is a big chunk of silicon behind that port that isn't doing anything, as the DP-only signal bypasses most of it on the way out to the port.

Pragmatically that is a reasonable trade-off, because frankly a very high proportion of user setups never go past two monitors. The suppression impact is not that high even with DP-only monitors. With six ports, the impact is even more negligible, since you're really on the edge of having "too many" TB ports.
 
  • Like
Reactions: ZombiePhysicist

goMac

Contributor
Apr 15, 2004
7,662
1,694
AMD was selling every single GPU they could manufacture. What is their motivation to spend time and money on an ARM reference driver, when they're an x86-64 CPU manufacturer, there's no significant non-Apple, non-server ARM computing ecosystem, fabrication worldwide is already constrained, and they're selling everything they already make?

ARM macOS includes a different frame buffer system. It's likely not an ARM issue. It's an ARM-macOS-is-different issue. An ARM build of the driver probably wouldn't be difficult if it wasn't for the underlying window server changes.

macOS ARM devices also don't support EFI, so they won't be able to load card firmware. That's another issue. Generic ARM PCs actually tend to support EFI. But not ARM Macs.

Reading the tea leaves at WWDC this year, there were a few comments that pointed to Apple really not wanting AMD back. Craig Federighi even had an aside on The Talk Show that implied the only reason they were considering Mac games again was that they no longer have to deal with third-party GPU drivers or hardware.
 

mattspace

macrumors 68040
Jun 5, 2013
3,179
2,879
Australia
Reading the tea leaves at WWDC this year, there were a few comments that pointed to Apple really not wanting AMD back. Craig Federighi even had an aside on The Talk Show that implied the only reason they were considering Mac games again was that they no longer have to deal with third-party GPU drivers or hardware.

Federighi is a haircut and a PR face, and The Talk Show is a propaganda outlet, so personally I wouldn't put any more faith in what's said there than I would in any other Apple TV commercial.

It is of course hilarious that Apple says GPUs and drivers get in the way of games, yet the games Apple literally commissions as exclusives for its own gaming service are overwhelmingly developed on Windows PCs with (generally Nvidia) GPUs.
 

ZombiePhysicist

macrumors 68030
May 22, 2014
2,795
2,700
Federighi is a haircut and a PR face, and The Talk Show is a propaganda outlet, so personally I wouldn't put any more faith in what's said there than I would in any other Apple TV commercial.

It is of course hilarious that Apple says GPUs and drivers get in the way of games, yet the games Apple literally commissions as exclusives for its own gaming service are overwhelmingly developed on Windows PCs with (generally Nvidia) GPUs.
Agree. Apple is delusional.

Apple: We treat video card makers with open hostility and contempt.
Game industry: We build our games on card makers' latest tech to make the games more awesome.
Apple: Hey game makers, we treated you with open hostility and contempt for most of our existence, and we treat video card makers that way too, but why don't you totally love us and support games on our platform?
[GIF: Lucy pulling the football away from Charlie Brown]

Apple: If you don't, we totally won't change our attitude, and we'll continue to treat you with open hostility and contempt.

Yea, with that amazing pitch, how can the game makers refuse!? 🙄
 
  • Haha
Reactions: prefuse07

goMac

Contributor
Apr 15, 2004
7,662
1,694
Federighi is a haircut and a PR face, and The Talk Show is a propaganda outlet, so personally I wouldn't put any more faith in what's said there than I would in any other Apple TV commercial.

He's also the guy in charge of GPU drivers on macOS. So you may think he is an idiot. But in that case, he would be the key idiot in question here. His haircut is the one deciding the driver strategy.

He'd be the one negotiating with AMD. And he's certainly going to know Apple's plans for GPU drivers.

If he thinks AMD drivers should not happen, they're not happening. It's his call.

(Also why it was a totally appropriate question for him to answer. GPU drivers and game support are his responsibility.)

And I'm not even getting into all the Metal sessions at WWDC that didn't really seem to care about AMD GPUs at all and only cared about Apple Silicon GPUs. It's pretty clear AMD is on the way out, with only a few high-end AMD GPUs getting some scraps this year.
 
Last edited:

goMac

Contributor
Apr 15, 2004
7,662
1,694
Apple: Hey game makers, we treated you with open hostility and contempt for most of our existence, and we treat video card makers that way too, but why don't you totally love us and support games on our platform?

To be fair: He's not wrong that the Intel Metal drivers have been trash, the Nvidia Metal drivers have been real iffy, and the AMD Metal drivers have been mostly-ok. Same was true even in the OpenGL days (before someone blames Metal.)

The third party drivers have actually been a big issue with bringing games to Mac. So he's not wrong that only having Apple drivers will be a serious simplification. No more testing across multiple vendors with their own weird bugs.

The thing about Metal is that every driver vendor basically supplies their own version of Metal. There's not one Metal. There are four. (Apple, AMD, Nvidia, Intel.)
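You can see that split directly from the API side; a quick sketch (macOS Swift, illustrative only) that enumerates the installed GPUs, each of which answers Metal's feature-family queries according to its own vendor's driver:

import Metal

// On an Intel Mac with an integrated Intel GPU, a discrete AMD GPU and/or an
// eGPU, each shows up as its own MTLDevice backed by that vendor's driver,
// each answering feature-family queries differently.
for device in MTLCopyAllDevices() {
    print(device.name)
    print("  low power (integrated):", device.isLowPower)
    print("  unified memory:", device.hasUnifiedMemory)
    print("  supports Mac2 family:", device.supportsFamily(.mac2))
    print("  supports Apple7 family:", device.supportsFamily(.apple7))
}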
 
  • Like
Reactions: ZombiePhysicist

mattspace

macrumors 68040
Jun 5, 2013
3,179
2,879
Australia
He's also the guy in charge of GPU drivers on macOS. So you may think he is an idiot. But in that case, he would be the key idiot in question here. His haircut is the one deciding the driver strategy.

He'd be the one negotiating with AMD. And he's certainly going to know Apple's plans for GPU drivers.

If he thinks AMD drivers should not happen, they're not happening. It's his call.

(Also why it was a totally appropriate question for him to answer. GPU drivers and game support are his responsibility.)

What would be an appropriate question for him to answer is why the Mac Studio was launched with, let's face it, fraudulent claims about graphics performance. What would be an appropriate question for him to answer is how it looks when obsolete equipment from over a decade ago wipes the floor with their newest, highest-end Apple Silicon GPUs. What would be an appropriate question for him to answer is why Apple's customers should put up with going years backwards in capability, for Apple's benefit.

BUT Gruber is a bought man, and The Haircut appearing at The Talk Show Live at WWDC is part of that payment, so the questions will be pre-approved, etc.
 

mattspace

macrumors 68040
Jun 5, 2013
3,179
2,879
Australia
To be fair: He's not wrong that the Intel Metal drivers have been trash, the Nvidia Metal drivers have been real iffy, and the AMD Metal drivers have been mostly-ok. Same was true even in the OpenGL days (before someone blames Metal.)
And yet, even with the trashy 3rd party drivers, Apple's GPUs are slower, and more expensive for the delta upgrade prices, than third party GPUs.

It's really hard to see the upside here... "the food's terrible, but at least the portions are big"

The third party drivers have actually been a big issue with bringing games to Mac. So he's not wrong that only having Apple drivers will be a serious simplification. No more testing across multiple vendors with their own weird bugs.

The big issue for games developers is Apple providing under-powered hardware, or demanding, in the case of Apple Arcade, that games run on Apple TV, which is a dumpster fire of a device for games if you ask people who aren't trying to sell you one.

No one cares about dealing with different GPUs and drivers - they already do that, it's a part of the development process. That's literally a core competency for them and their tools.

But you can't magically create GPU capability, or magically stop customers buying your Mac version and saying "eww this looks and runs like arse compared to my friend's PC version".

Lack of GPU capability is what limits the games the Mac can have, not an over-abundance of drivers.

The thing about Metal is that every driver vendor basically supplies their own version of Metal. There's not one Metal. There are four. (Apple, AMD, Nvidia, Intel.)

It's almost as if the One Metal vision was a stupid idea from the start, and the Mac was at its best when it was the best host for industry standards, rather than a Black-Swan-Fragile example of NIH Island Dwarfism.
 

deconstruct60

macrumors G5
Mar 10, 2009
12,309
3,902
Aren't Radeon drivers for Mac written by Apple mainly? That's why they are not criticized a lot like their Windows counterparts.

They are distributed by Apple. Reportedly there are AMD embedded personnel at Apple (which would probably help immensely**). Metal is a "shorter", "smaller" API stack than OpenGL. But when you get down to the binary compilers, bare-metal specifics, or more directly the firmware specifics, what does "mainly" mean? The more abstract end of the API stack has been Apple's (Metal makes them the bigger chef in a smaller kitchen), but performance glitches and optimizations don't always have their origins at the top.

I suspect the Mac drivers are much closer to the "Pro" drivers that AMD has than to the "chase every random game possible" general Windows ones. Chasing games that take a broad variety of approaches to optimization can be destabilizing. Do the drivers on Xbox and PlayStation 4/5 have problems? Generally no. A very large contributor there is that it is more of a "you fit your game to our fixed hardware target" relationship instead of vice versa (i.e., "we'll warp our drivers to fit whatever rendering techniques your game takes a romp through").

Apple has usually done a subset of GPUs from each AMD generation. They skip features (or make them uniform by not going so far down the product line that they get stripped out for segmentation reasons). They're not into allowing overclocking or quirky boost-mode "features" from board makers.


P.S. (**) When AMD was a small, boxed-in player looking for a steady bill-of-materials purchase order, this made more sense. Their willingness to be bossed around by Apple was probably much, much higher. Now that AMD is not so "poor" and Apple is on a warpath to evict them from the vast majority (if not all) of the Mac product line-up, that context is likely changing. Who is paying for the embedded workers? With no steady bill-of-materials stream (or at least one an order or two of magnitude smaller), it isn't as if AMD needs to invest money there as though they had few other alternatives. Apple pinches pennies enough that they probably don't want to pay (at least upfront and directly).
 
Last edited:
  • Like
Reactions: Bustycat

goMac

Contributor
Apr 15, 2004
7,662
1,694
And yet, even with the trashy 3rd party drivers, Apple's GPUs are slower, and more expensive for the delta upgrade prices, than third party GPUs.
As someone who has worked on optimizing Metal code for both AMD and Apple Silicon GPUs, I would disagree. Well optimized, I would put the M1 Max somewhere between a desktop Radeon 5700 and a Radeon 6800. I haven't worked with the M1 Ultra, but the M1 Max is already in the right place.
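In practice a lot of that optimization boils down to branching on what kind of GPU you got, since Apple's tile-based, unified-memory parts and AMD's discrete parts want different resource strategies. A rough sketch of the kind of check involved (a hypothetical helper, not a claim about any particular codebase):

import Metal

// Illustrative only (hypothetical helper): the kind of per-GPU branch Mac
// ports end up carrying. Apple GPUs are tile-based with unified memory;
// discrete AMD GPUs have their own VRAM, so storage and upload strategies differ.
func preferredStorageMode(for device: MTLDevice) -> MTLResourceOptions {
    if device.hasUnifiedMemory {
        // Apple Silicon: CPU and GPU share memory, so shared storage avoids copies.
        return .storageModeShared
    } else {
        // Discrete AMD GPU: keep static resources in VRAM and blit-upload
        // from a shared staging buffer instead.
        return .storageModePrivate
    }
}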

No one cares about dealing with different GPUs and drivers - they already do that, it's a part of the development process. That's literally a core competency for them and their tools.

Dealing with multiple drivers adds cost. You have to at least have test rigs for Intel, Apple, AMD, and Nvidia if you're feeling like going that far. Some Mac games just immediately cut Intel integrated graphics as a cost-saving measure (for good reason; those drivers are garbage).

Then you have to take all those devices and test all your different possible shader combinations across all your different possible macOS versions. That's enough time and cost that your already very-thin-margin Mac game may simply become unprofitable.

On Windows, it's just done because you can make a lot more money. Yes, you have to test your game for Radeon 500 series cards, but there are probably more Windows users on a Radeon 500 series card than there are Mac gamers combined. The scale of sales on Windows makes it a lot easier to afford that individual config-level testing.

So... to make Mac gaming succeed you either have to make Mac games earn way more money, or make Mac games cheaper to make. Because no one knows how to make Mac games magically earn money at Windows scale, the option you're left with is making Mac games cheaper to produce. And a great way to do that? Narrow the driver targets down to one. That collapses a testing matrix that could sprawl across dozens of configurations down to a few. Developers are happy because they have to do less testing and fewer fixes. Users are happy because it's more likely the game is actually going to work on their Mac.

The situation with graphics drivers on Mac has produced some really weird outcomes. For example - Feral's recent Total War: Warhammer release is an Intel app. But it only runs on Apple Silicon GPUs. That's because they completed the game for Apple's version of Metal, but haven't finished certifying it on AMD's version.

It's almost as if the One Metal vision was a stupid idea from the start, and the Mac was at its best when it was the best host for industry standards, rather than a Black-Swan-Fragile example of NIH Island Dwarfism.

Yeah, this is a bad take.

Let's start with "there is no single standard for graphics in games." We can talk about OpenGL or Vulkan, but those are not really the industry standard either. DirectX is the industry standard; it has the widest install base across Windows and Xbox. The PS5 has its own proprietary graphics API. So does the Switch. The only place OpenGL and Vulkan are the top-tier graphics APIs is on Linux. So simply using Vulkan or OpenGL doesn't really ensure anything. Apple throwing another proprietary API into the mix is relatively meaningless when developers already have to target a half-dozen different APIs for wide distribution anyway.

Even when the Mac supported OpenGL, the number of games written in OpenGL was still very small. Most games still had to be ported from DirectX to OpenGL just for their Mac versions.

Second: OpenGL on macOS always had the exact same problems. OpenGL is a spec, not a library. Your OpenGL experience is still defined by how well AMD/Nvidia/Intel implemented the OpenGL spec. That's true on any platform. But on the Mac their drivers were not equal and had the same quality issues.

The OpenGL spec is also old and crusty and has a lot of ambiguity. Windows/PS5/Switch/Xbox have not adopted the OpenGL spec because it's kind of a mess. Metal at least cleans house and simplifies a lot of things. And we could talk about Vulkan, but no one is really supporting Vulkan either. How many games are actually written in Vulkan? Not many.

Finally: Apple hardware is not designed like traditional GPUs, and it does take a bit of elbow grease to get it to sing. Metal has the APIs to do that. OpenGL doesn't. Vulkan adds them but tends to be a bit behind. Metal gives Apple the flexibility to do things OpenGL just flat out isn't designed to do, without having to wait on Vulkan to catch up.
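One concrete example of that "elbow grease": on Apple's tile-based GPUs, Metal lets you mark transient render targets as memoryless so they live entirely in on-chip tile memory, something OpenGL has no concept of. A minimal sketch (Swift, hypothetical helper):

import Metal

// Sketch (hypothetical helper): a depth buffer that only needs to exist for the
// duration of a render pass can be marked memoryless on Apple's tile-based
// GPUs, so it never touches DRAM. OpenGL has no way to express this.
func makeTransientDepthTexture(device: MTLDevice, width: Int, height: Int) -> MTLTexture? {
    let desc = MTLTextureDescriptor.texture2DDescriptor(pixelFormat: .depth32Float,
                                                        width: width,
                                                        height: height,
                                                        mipmapped: false)
    desc.usage = .renderTarget
    // Memoryless storage is only valid on tile-based (Apple) GPUs.
    desc.storageMode = device.supportsFamily(.apple2) ? .memoryless : .private
    return device.makeTexture(descriptor: desc)
}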
 
Last edited:

fuchsdh

macrumors 68020
Jun 19, 2014
2,020
1,819
Apple could 100% do a turnaround on their games and graphics strategy—allowing eGPUs on AS Macs, supporting OpenGL and Vulkan, offering an affordable xMac, allowing Nvidia GPUs again—and it would make Mac gaming much better for Mac gamers, but it still wouldn't actually make a substantial difference to the Mac gaming market. Apple has inflicted many wounds on itself, but the Mac is also just a minuscule gaming market, with needs very different from those of most of Apple's customers. Apple has only found success in casual gaming basically by accident, because mobile and casual gaming align much better with its product focus.

I play certain games on my Mac that are supported, and for everything else use an Xbox. It's just a saner strategy.

I think the games question is kind of immaterial to what form the 8,1 takes, though. Even in the Mac Pro's "heyday" it was only ever low single-digit marketshare on Steam.
 

goMac

Contributor
Apr 15, 2004
7,662
1,694
Apple could 100% do a turnaround on their games and graphics strategy—allowing eGPUs on AS Macs, supporting OpenGL and Vulkan, offering an affordable xMac, allowing Nvidia GPUs again—and it would make Mac gaming much better for Mac gamers, but it still wouldn't actually make a substantial difference to the Mac gaming market.

Everyone also grabs onto OpenGL and Vulkan as if those were the critical pieces. The Mac has different input APIs. It has different networking APIs. It has different sound APIs. It has different controller APIs. It has a different windowing API. It has different CPU acceleration libraries, even on Intel (no DirectXMath). And that's also true of every other platform. Heck, you might depend on core simulation or physics libraries that aren't even on the Mac.

OpenGL and Vulkan are not some magic wand that gets waved over Mac gaming and makes everything magically compatible. They're just one piece of a much larger ecosystem of tooling. Vulkan gets outsized attention because of Proton and hopes that Proton could come to the Mac. But that's just running games suboptimally through WINE; it doesn't have much to do with real ports.
 

kvic

macrumors 6502a
Sep 10, 2015
516
459
  • These days game developers grab a game engine (developed by the very few and the very best in the industry) and code their games from there. One benefit of a game engine is precisely that it isolates game developers (the vast majority of games are built this way) from the different APIs (input, accelerators, sound, etc.) on different OSes, and saves them that time. I don't see how sharing a common set of graphics APIs would help Apple, or how the current situation hurts Apple.
  • Again, it's not that game developers have to test their games rigorously on every combination of Intel, AMD, Apple, and Nvidia GPUs for Mac. Not that they don't have to do some tests on the GPUs they plan to run their games on, but the majority of the heavy lifting in test effort is already done by the game engine developers, which these days are usually big corporations with money.
  • Metal is akin to DirectX. Does Intel create its own DirectX? Does AMD create another DirectX? Does Nvidia create yet another? No. There is only one DirectX. Sure, Intel/AMD/Nvidia have to code their drivers to conform to the functional requirements of the DirectX APIs, and ensuring conformity is Microsoft's job for DirectX. The same goes for Metal. It's not that 3rd party GPU vendors have to invent their own Metal and thereby create problems for Apple. The people writing GPU drivers at AMD/Nvidia/Intel are probably as smart (or as stupid) as their counterparts at Apple.
Whether Apple allows 3rd party GPUs on Apple silicon Macs or not is a political/marketing decision. Trying to look at the problem from a purely technical perspective and justifying Apple's current decision with minute technical details won't work, in my opinion.
 