
deconstruct60

macrumors G5
Mar 10, 2009
12,309
3,902
Reading the tea leaves at WWDC this year - there were a few comments that pointed to Apple really not wanting AMD back. Craig Federighi even had an aside at The Talk Show that implied the only reason they were considering Mac games again was because they no longer had to deal with third-party GPU drivers or hardware.

"The only reason" is a bit of a stretch. (about 57 minutes in)


Federighi eventually mentioned explicitly in that segment that some game engine developers/providers are porting over to the iOS iPhone/iPad platform. If iOS gaming hadn't been a large revenue generator over the last 3-4 years, I'm skeptical that Apple would be pushing this gaming angle as hard on the Mac this year (at least in this unidimensional GPU hardware fashion). Also, to a lesser extent, if they didn't have a huge sunk cost dumped into the AR/VR headset, there would probably be a bit less focus. (Some custom GPU software/hardware work will likely eventually leak back over to the Mac side from there, and vice versa.)

There is real revenue uplift potential here if they can 'bleed' some of that gaming revenue pool over to the macOS portion of the ecosystem. That doesn't mean a very high fraction of that revenue has to come from the most "big budget" gaming titles that are exceptionally hard on GPU resources.

So when he says something to the effect that "all (Apple Silicon) Macs have awesome graphics, capable of playing the best games", the super hardcore gamers are going to balk at that (cranking things down to 1080p and not having every whiz-bang setting set at maximum). Gaming consoles do decent revenue without having the most fire-breathing hot system set-ups. I suspect Apple would be happy enough if they pulled in even a fraction of that kind of revenue on the Mac. They are not out to be the ultimate gaming hardware option. Just a good one that is competitive with consoles (without having to match them on price).

As long as the "GPU consistency" line-up has to span all the way down to the iPhone, that is implicitly the bigger driver pushing the 3rd-party GPUs out. Apple shifting from licensing the GPU (Imagination Tech) to building their own is partially driving the 'scope creep' here. For Macs overall, the biggest GPU vendor by units over the last 10 years (or so) was Intel. What the Mac Pro could bring in with cards "sucked in" from the PC gaming market really wasn't the point. The point was the bulk of the Mac product range, where thermal and space constraints were far more pressing. The primary point was to kill off not just iGPUs but also dGPUs in the laptop space. 70+% of all Macs sold are laptops, so that just sets the whole tone for the graphics stack for the platform. (Once you throw on top all the desktops that use laptop SoCs (Mini, iMac 24", entry Studio, etc.), there is relatively little left that isn't Apple GPU based.)


Start tossing plain Mx SoCs into mid-to-upper-range iPads and the 'black hole' in graphics policy is even bigger. [Indeed, at the WWDC session on what is new in the "drivers" space, the primary headline grabber was that the modern DriverKit API is portable to the iPads ("Mac" drivers on iPad). You can see more clearly why there is no "Graphics" subclass in the DriverKit API space now: if portability to the iPad is a core requirement, then there are not going to be eGPUs or PCIe slots there. The legacy IOKit had a "graphics" object class. That is not being mapped over.]


The M1 -> M2 transition gives a higher transistor budget allocation to the GPU. That lowers the pressure to provision eGPUs for the new laptops (and plain Mx desktops) even more. When the plain Mx goes to TSMC N3 ... rinse and repeat. Apple is also playing the "long game" here. The primary focus for the Mac product line-up isn't quite where the M-series is now but where it is going to be in 2-3 years.



So WWDC 2022 had lots of stuff on how to optimize for the Apple GPU (just like you would see at an Xbox or PlayStation developer convention). Lots of "do more with the hardware you've got" rather than "port to the latest fire-breathing dragon hardware" (e.g., an Nvidia developer convention).



P.S. Where Apple does still appear to be painting themselves into a corner, though, is on the GPGPU front. For graphics, you can shave off the upper 10% of the hardware and survive (e.g., Xbox/PlayStation). For more general GPGPU compute, where the computation and the results display don't necessarily have to be hosted on the same card, there is a big gap. Especially once you get into the high-end single-user workstation space.

Apple is still playing "catch-up" on PyTorch and AI/ML training. A compute accelerator doesn't have to have drivers that live down in the "Apple only" kernel space.
 

mattspace

macrumors 68040
Jun 5, 2013
3,179
2,879
Australia
So... to make Mac gaming succeed you either have to make Mac games make way more money, or make Mac games cheaper to make. Because no one knows how to magically make Mac games earn money on a Windows scale, the option you're left with is to make Mac games cheaper to produce. And a great way to do that? Narrow the driver targets down to one. That collapses a testing matrix that could sprawl across dozens of configurations down to a few.

And in doing so, they've made Mac gaming an inferior gameplay experience to console gaming. The only point of playing games on a computer is to get a BETTER graphical experience than you get on a console.

IF you can afford the "better" GPU options, you're not short of the cash to buy an Xbox or Switch.

We've seen all this before - QuickDraw 3D was allegedly better than other options, Game Sprockets was allegedly better than other options. Until the Mac has the price / performance for game graphics, fetch ain't happening.

Developers are happy because they have to do less testing and fixes.

Certainly "the hardware is not powerful enough to realise our vision, so we're not supporting the Mac at this stage" is cheaper and less work than actually making a game.

BUT, let's be clear here: every single version of Apple Silicon / Apple hardware is a different GPU that has to be tested, no differently than testing a different brand of GPU. The sheer number of devices that developers have to keep on hand to cover Apple platforms is ridiculous. For Apple Arcade, they have to test every single model of every platform from the Apple TV HD through to the current top-of-the-range Mac Studio.

Users are happy because it's more likely the game is actually going to work on their Mac.

Yes, they're certainly not burdened with the agony of choice, or the pressure to play the games everyone else is playing. Maybe they can enjoy their "new" 5-year-old game for the retro appeal.

But yes, it'll "work" - as long as Apple doesn't ship the one and only driver buggy and non-performant... and we know Apple would never do that, right?
 

deconstruct60

macrumors G5
Mar 10, 2009
12,309
3,902
Federighi is a haircut and a PR face, and The Talk Show is a propaganda outlet, so personally I wouldn't put any more faith in what's said there than I would in any other Apple TV commercial.

While Gruber does toss slow-paced "softball" pitches over the plate for Apple to hit at The Talk Show, that doesn't diminish the fact that some of this stuff is information Apple wants to get out that doesn't fit in their dog-and-pony presentations.

In 2020 Federighi said Boot Camp wasn't coming to Apple Silicon. Two years later ... it hasn't. (Despite folks on some of these forum threads trying to twist an Ars Technica article into a pretzel to say that it is coming. It probably isn't coming, and it hasn't so far.) Apple is spending more time on virtualization-focused solutions (e.g., the Rosetta service inside of a Linux VM, 2D graphics support in Linux VMs, etc.).

Apple leadership team

https://www.apple.com/leadership/

Executive officers are not inconsequential PR props. These folks have real impact on resource allocations (money , personnel , etc) and project prioritization levels.

Similar with Greg Joswiak. He is the head of marketing now. If he thinks that some market segment isn't worth Apple putting effort into, then future products in that segment are in deep trouble. He basically has substantial input into whether something gets 'axed' or not.

Federighi, same issue. Zero out any budget allocation for DriverKit to have any 3rd-party GPU driver capability, and that is basically a dead-end segment. (Even more so if there is no 'get out of jail free' card for weaving into the Apple kernel, as pragmatically it is not a second-tier "plug-in".) Apple announced several WWDCs ago that kernel extensions are deprecated and will be going away. If no new path pops up for 3rd-party graphics drivers to plug into, then that segment is done. Mutating the kernel isn't going to work at some point (so there are not going to be back-door hacks through that vector).


It is of course hilarious that Apple says GPUs and drivers get in the way of games,

Tell that to Intel and their rollout of their brand new discrete graphics product line. It is a real stumbling block for them that Apple has mostly avoided (in part because Apple hasn't changed the memory model/implementation radically). The scope of modification Apple needs to get something working OK is limited. Intel is off chasing every gaming engine quirk out there, and it has been a long slog that is still going on.



yet the games that Apple literally commissions to have made as exclusives for their own gaming service are overwhelmingly developed on Windows PCs with (generally Nvidia) GPUs.

The 3D models used in the games are distinct from the game engines and graphics pipeline optimizations themselves. Claiming you can have an optimized game without running it on the GPU it will be deployed upon isn't close to reality.
 

deconstruct60

macrumors G5
Mar 10, 2009
12,309
3,902
And in doing so, they've made Mac gaming an inferior gameplay experience to console gaming. The only point of playing games on a computer is to get a BETTER graphical experience than you get on a console.

The bulk of Apple ecosystem gaming is on the iPhone/iPad. From that level, the M-series SoCs are a step up.

Your perspective is anchored in Apple trying to create some PC gaming market "killer". They aren't. Far more likely, they are further extending the substantively large market they already have. It's a similar relationship to Xbox -> Windows PC (with AMD), just with a different starting point. Apple doesn't have to do a "monkey see, monkey do", feature-checkbox-to-feature-checkbox copy of Windows gaming to make substantive money.


BUT, let's be clear here: every single version of Apple Silicon / Apple hardware is a different GPU that has to be tested, no differently than testing a different brand of GPU. The sheer number of devices that developers have to keep on hand to cover Apple platforms is ridiculous. For Apple Arcade, they have to test every single model of every platform from the Apple TV HD through to the current top-of-the-range Mac Studio.

And yet at WWDC ...
"....

Scale compute workloads across Apple GPUs

..."
https://developer.apple.com/videos/play/wwdc2022/10159/

By no means a magic wand, but it is about actually looking at what Apple is talking about and putting emphasis on at WWDC, as opposed to a wish list of what one wants Apple's priorities to be. Apple is actively rolling out better development tools to do that testing more cost-effectively. That matters.
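
The gist of that session family is: write one Metal compute path and let the device/pipeline tell you how to size it at runtime, instead of hardcoding per-GPU configurations. Something roughly like this - my own minimal sketch of the general idea, not code from the session:

```swift
import Metal

// One kernel, compiled at runtime for whatever Apple GPU we land on.
let kernelSource = """
#include <metal_stdlib>
using namespace metal;
kernel void doubleValues(device float *data [[buffer(0)]],
                         uint id [[thread_position_in_grid]]) {
    data[id] *= 2.0;
}
"""

let device = MTLCreateSystemDefaultDevice()!
let queue = device.makeCommandQueue()!
let library = try! device.makeLibrary(source: kernelSource, options: nil)
let pipeline = try! device.makeComputePipelineState(
    function: library.makeFunction(name: "doubleValues")!)

var values = [Float](repeating: 1.0, count: 1 << 20)
let buffer = device.makeBuffer(bytes: &values,
                               length: values.count * MemoryLayout<Float>.stride)!

let commands = queue.makeCommandBuffer()!
let encoder = commands.makeComputeCommandEncoder()!
encoder.setComputePipelineState(pipeline)
encoder.setBuffer(buffer, offset: 0, index: 0)

// Ask the pipeline, not a spec sheet, how wide this GPU wants to go.
// The same dispatch code then scales from an A-series iPhone to an Ultra.
let groupWidth = min(pipeline.maxTotalThreadsPerThreadgroup,
                     pipeline.threadExecutionWidth * 4)
encoder.dispatchThreads(MTLSize(width: values.count, height: 1, depth: 1),
                        threadsPerThreadgroup: MTLSize(width: groupWidth,
                                                       height: 1, depth: 1))
encoder.endEncoding()
commands.commit()
commands.waitUntilCompleted()
```

Point being, the testing burden across the M-series line-up is closer to tuning one architecture family than to re-validating against a different vendor's driver stack.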
 

mattspace

macrumors 68040
Jun 5, 2013
3,179
2,879
Australia
Tell that to Intel and their rollout of their brand new discrete Graphics product line. It is a real stumbling block for them that Apple has mostly avoided.

Intel's stumbling block is that they aimed for the knees when they had to aim for the head. The one lesson everyone should have learned from AMD's last few GPU generations is that the market follows & optimises for grunt. AMD drifted into inconsequence with a strategy of "we want to be a slightly cheaper option for the mid range", and it got them nowhere. Likewise in CPUs, AMD was nowhere - glitchy and incompatible - until they made processors that were competitive with, and outpaced, Intel's. Cheaper for the midrange didn't cut it.

No one is going to invest in Intel's GPU if Intel can't show the ability to produce a GPU that's MORE powerful than Nvidia's best, because that's the halo that brings the customers. Everyone optimises for the most powerful solution, and it's trickle-down from there.

Which sounds exactly like the situation Apple is in, having to pay organisations to make software for their platforms - be it Blender, or Apple Arcade itself.

It's funny that Microsoft was mocked for the "desperation" of trying to pay developers to make software for the last Windows Mobile version, but when Apple does it...

The 3D models used in the games are distinct from the game engines and graphics pipeline optimizations themselves. Claiming you can have an optimized game without running it on the GPU it will be deployed upon isn't close to reality.

There are, in practical terms, no Apple products used anywhere in the production pipeline (usually not even in UI graphics) of the games that Apple pays to have made as exclusives for Apple Arcade. From modelling & graphics through to programming and the eventual build from the compile system, nothing Apple offers has any role in that process.

"Test on Device" is barely used during production, both because of costs of maintaining a testing hardware collection, & it's a clunky process from Unreal, which again, is commonly the dev environment for games Apple pays to have made. The great joke of course, is that for all Apple's antagonism towards Epic, they're reliant on them. Day to Day, the Unreal device simulator is the closest most people come to an Apple device.

Your perspective is anchored in Apple trying to create some PC gaming market "killer".

My perspective is anchored in talking to people who make games for Apple Arcade.

And yet at WWDC ...

And yet, in practice... there's just as much testing - version-to-version glitches, accommodations, level-of-detail issues, memory budgets, etc. - as there is when working with different GPU brands.

If anything it's worse, because a GPU glitch has to wait on an OS-level patch for a fix.

No one developing games for Apple Platforms uses Apple tools to develop those games. They use Unreal, Unity etc. Just like no one develops AR using ARKit - they use Unity & Unreal, which treat ARKit (& ARCore) as dumb pipes to the hardware.
 

kvic

macrumors 6502a
Sep 10, 2015
516
459
Intel's stumbling block is that they aimed for the knees when they had to aim for the head. The one lesson everyone should have learned from AMD's last few GPU generations is that the market follows & optimises for grunt. AMD drifted into inconsequence with a strategy of "we want to be a slightly cheaper option for the mid range", and it got them nowhere. Likewise in CPUs, AMD was nowhere - glitchy and incompatible - until they made processors that were competitive with, and outpaced, Intel's. Cheaper for the midrange didn't cut it.

No one is going to invest in Intel's GPU if Intel can't show the ability to produce a GPU that's MORE powerful than Nvidia's best, because that's the halo that brings the customers. Everyone optimises for the most powerful solution, and it's trickle-down from there.

This is a good point. It also applies to Apple GPUs, and hence it justifies their goal of competing head-on against Nvidia/AMD for the best in the coming few years.

I can't foresee Apple outperforming Nvidia/AMD on a single-GPU basis. However, if three Apple GPUs combined outperform two of Nvidia's best combined, Apple does have a chance. Not a problem for a Cheese Grater-like Mac Pro chassis. And their performance per watt outclasses Nvidia/AMD by a comfortable distance. The machine learning & 3D crowds will flock to the new Mac Pro.

In the long run, who knows whether the extra R&D expenditure will justify the effort? Apple has to keep developing new GPUs for iDevices and mobile Macs anyway. So as long as they can glue smaller GPUs together into big ones at the die level, socket level, and board level, the game may last for a while.

While Apple silicon Macs sound exciting today, manufacturing nodes are nearing their limits, and Apple silicon design shows some signs of slow-down (e.g., look at M2 vs. M1, two years apart). On Macs, in the end, Apple could sacrifice performance per watt for higher performance down the road. Don't lose hope on 3rd-party GPUs, but the prospect isn't great either.
 

prefuse07

Suspended
Jan 27, 2020
895
1,066
San Francisco, CA
No one developing games for Apple Platforms uses Apple tools to develop those games. They use Unreal, Unity etc. Just like no one develops AR using ARKit - they use Unity & Unreal, which treat ARKit (& ARCore) as dumb pipes to the hardware.

For the record, Unity started as a Mac OS X game engine, I believe in 2005, which makes Apple's failures (around gaming) even more hilarious.
 
  • Like
Reactions: Basic75

edanuff

macrumors 6502a
Oct 30, 2008
577
258
Super interesting discussion and actually on point. Apple's GPU strategy is heavily based on how they win with games. Apple makes something like $20bn a year from App Store games, which makes it comparable to Microsoft's game division revenues. Going all in on Apple Silicon GPUs, which trickle down through the device line, is their way of putting all the wood behind one arrow - a strategy that ensures that every logic gate and every line of code results in a unified investment from the desktop to the pocket (with AR/VR the next step). Now, just to anticipate the rebuttals: those aren't the types of AAA games that I play either, which is why I also have a Windows gaming PC with a 3080 Ti in it. But I know that Apple's strategy is structurally sound and going to work for them, and they're unlikely to change it even if it means not serving certain markets.
 

mattspace

macrumors 68040
Jun 5, 2013
3,179
2,879
Australia
I know that Apple’s strategy is structurally sound and going to work for them and they’re unlikely to change it even if it means not serving certain markets.

The nanosecond Apple's App Store monopoly is declared invalid - and that is basically inevitable at this stage - all of that games revenue vanishes, except for Apple Arcade revenue. So, what motivates them to put money into games graphics after that?
 

goMac

Contributor
Apr 15, 2004
7,662
1,694
Whether Apple allows 3rd-party GPUs on Apple silicon Macs or not is a political/marketing decision. Trying to look at the problem from a technical perspective, and justifying Apple's current decision with minute tech details, won't work in my opinion.

There is actually a strong technical reason.

Apple GPUs at a hardware level don't work like AMD and Nvidia GPUs. The optimization patterns are different. Metal has an entire subset of the API that is _only_ for Apple GPUs, because it relates to hardware optimizations only present on Apple GPUs. Even basic stuff like texture storage follows a different path on Apple Silicon GPUs that needs to be if'd in the code.

So it's not just a matter of Metal being all the same no matter which GPU is active. Apple Silicon Metal is actually different at an API level from Nvidia or AMD Metal. Code written for an Apple Silicon GPU will not run on an AMD or Nvidia GPU. Period. If you want to support both, you actually have to do a bunch of GPU capability checks and then decide on a render path. It is likely that the games Apple announced at WWDC, if they are Apple Silicon only, are incompatible with AMD and Nvidia GPUs at a deeper level.

That's also why benchmarks are so goofy, and I wouldn't necessarily trust them this early. Metal code written for AMD and Nvidia GPUs will run on Apple Silicon - but suboptimally.

This is also why I think WWDC continues to generally hint that Radeon and GeForce support is dead. All the sessions are about writing Metal code that's mostly incompatible with Radeons. And while there is acknowledgement of Radeons in stuff like MetalFX, there is little to no focus on writing code that is optimal for AMD GPUs. Only optimizations for Apple GPUs are being covered.

This also means that AMD and Nvidia support relies on developers continuing to write Metal code that is compatible with AMD and Nvidia GPUs. As soon as Apple Silicon becomes the only supported platform for Metal developers, they'll stop writing that code and write Metal code that will only run, at an API level, on Apple Silicon. And as soon as they do that, Apple actually can't bring back AMD and Nvidia support.
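
For folks who haven't written Metal, the "if'd in the code" part looks roughly like this - a minimal sketch, where the .apple7 family check and the memoryless G-buffer attachment are my own illustration of the kind of branch involved, not code from any WWDC session:

```swift
import Metal

let device = MTLCreateSystemDefaultDevice()!

// Descriptor for an intermediate render target (e.g. a G-buffer attachment).
let desc = MTLTextureDescriptor.texture2DDescriptor(pixelFormat: .rgba16Float,
                                                    width: 1920, height: 1080,
                                                    mipmapped: false)
desc.usage = .renderTarget

if device.supportsFamily(.apple7) {
    // Apple GPU (tile-based deferred renderer) path: the attachment can be
    // memoryless, living only in on-chip tile memory and never touching DRAM.
    desc.storageMode = .memoryless
} else {
    // AMD/Intel (immediate-mode) path: .memoryless isn't supported here,
    // so the attachment has to be backed by dedicated GPU memory.
    desc.storageMode = .private
}

let attachment = device.makeTexture(descriptor: desc)!
```

Multiply that branch across tile shaders, threadgroup memory, texture formats, etc., and you can see why code written only for the first branch simply has no path to run on a Radeon.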
 

prefuse07

Suspended
Jan 27, 2020
895
1,066
San Francisco, CA
The nanosecond Apple's App Store monopoly is declared invalid - and that is basically inevitable at this stage - all of that games revenue vanishes, except for Apple Arcade revenue. So, what motivates them to put money into games graphics after that?

Aren't they currently in talks to acquire EA?
 

JMacHack

Suspended
Mar 16, 2017
1,965
2,422
Aren't they currently in talks to acquire EA?
Not likely; EA is probably shopping around for buyers and approached Apple, who in all likelihood passed because it'd be a huge waste of time and money.

the nanosecond Apple's app store monopoly is declared invalid, and that is basically inevitable at this stage
Lmao, if anything it's more likely they'll have to allow 3rd-party payment processors. They'd basically have to rewrite all monopoly decisions if the gov decided that having a store monopoly over one's own device is illegal. (Not that reality ever gets in the way of wishful thinking.)

Likewise, Apple is barely over 50% of the smartphone market in the United States (its largest market), and less than that in any other country. That’s a stretch to call a “monopoly”

This is compounded by the fact that their main competitor platform is completely open. Consumers have a choice for an open platform!

all of that games revenue vanishes, except for Apple Arcade revenue. So, what motivates them to put money into games-graphics after that?
Same reason other smartphone manufacturers keep making more powerful gpus. It’s not like games will disappear from the platform even if your (completely disconnected from reality) scenario happens.

Consider this:

Technological progress doesn’t revolve around games.
 

deconstruct60

macrumors G5
Mar 10, 2009
12,309
3,902
Aren't they currently in talks to acquire EA?

talks doesn’t mean serious talks. lots of folks talk to Apple because it looks like they have giant pile of cash they dont have any purpose for. That somewhat overlooks how much they have borrowed to duck paying taxes ( 100s billions ) and the blackhole Car project. Apple is not broke by any means , but that cash stockpile isn’t exact free from leverage.

Nevertheless: SoftBank tried to sell Arm to them, etc., etc. Most of this is Apple talking to folks to keep communication channels open and also to get info. It also lowers the number of suits by investors who say there was a better allocation for the money ("we looked for something to buy, but it didn't make sense").

EA does not have that many folks they can sell themselves to at a greatly inflated price. Apple is not on the list because it makes lots of sense, but more so because EA does not have lots of choice.
 

deconstruct60

macrumors G5
Mar 10, 2009
12,309
3,902
For the record, Unity started as a Mac OS X game engine, I believe in 2005, which makes Apple's failures (around gaming) even more hilarious.

Actually, it's more hilarious for the giant holes it punches in the "go shoot the top-end, highest target in the head" theory as the only path to success.

"..
Unity’s growth is a case study of Clayton Christensen’s theory of disruptive innovation. While other game engines targeted the big AAA game makers at the top of the console and PC markets, Unity went after independent developers with a less robust product that was better suited to their needs and budget.

As it gained popularity, the company captured growth in frontier market segments and also expanded upmarket to meet the needs of higher-performance game makers. Today, it’s making a push to become the top engine for building anything in interactive 3D. ...
... [Unity]'s story began on an OpenGL forum in May 2002, where Francis posted a call for collaborators on an open source shader-compiler (graphics tool) for the niche population of Mac-based game developers like himself
... "
https://techcrunch.com/2019/10/17/how-unity-built-the-worlds-most-popular-game-engine/



So the whole "nobody can get ahead in the gaming world if they focus on OpenGL" ... not. (Make something work, then expand.)
The whole "have to tackle the biggest, 'baddest' product first" ... not. (They skipped the AAA focus initially.)
The "halo effect is *everything*, halo product or bust" ... not. (They made a good tool for their initial target audience and grew.)


Starting off on Macs and OpenGL gave Unity space to grow organically in what would otherwise be a hyper-competitive space. They built some tools that worked and then got better at it.

There is zero way that Unity was going to pull Mac gaming forward by themselves, or that they could have pulled Nvidia through the pissing match they started with Apple. At one point Unity had BlackBerry, Android, and iOS back-end targets (in addition to Windows). They shifted to being multi-platform (and they chased some large growth platforms ... and won). If Apple had tried to make them macOS-exclusive, the outcome would have been different.


Microsoft Word and Excel started off on the Mac too. That wasn't their only end goal. It was a profitable enough starting point to grow on.
 
  • Like
Reactions: ZombiePhysicist

mattspace

macrumors 68040
Jun 5, 2013
3,179
2,879
Australia
Aren't they currently in talks to acquire EA?
The big obstacle to that is antitrust - a lot of regulators worldwide could object, and the funny thing is Apple is just as much at the mercy of European regulators as they are of American ones. Apple is a different order of magnitude now compared to when it acquired / was taken over by Beats - what would have passed then might not pass now.

The other thing is Disney would probably pull their licences from an Apple-owned EA, start up their own game studio, and poach all the key people who worked on Star Wars games. And they'd get them too, because the dirty secret of gamedev is that people don't like working on mobile games. Give a gamedev person a choice between PC / console and mobile (which is what Mac gaming is going to be), and they'll choose console / PC. There's money in mobile, especially if you get on the Apple Arcade gravy train - it's well-paid, predictable work, but it's not particularly creatively satisfying to sit so far behind the cutting edge. If Disney offers a blank cheque to the folks doing Star Wars games, they'll get them, and talent acquisition / retention is one of the big problems in games now - especially given how union-hostile Apple is, and the growing unionisation movement within gamedev.
 
  • Like
Reactions: prefuse07

mattspace

macrumors 68040
Jun 5, 2013
3,179
2,879
Australia
Lmao, if anything it's more likely they'll have to allow 3rd-party payment processors.

3rd-party payment processors, sure, but the way any judgement is likely to come down on that is that Apple will be unable to charge more than single digits for their store service (given Microsoft set an example for the Windows store of 0% if using an external payment processor), so the games revenue is gone all the same. Their pissing about with a 3% discount for Dutch developers who use 3rd-party payment processing is going to blow up in their face. AND, it's looking pretty likely that the EU will mandate side-loading, so again, alt stores are going to become a thing.

They’d basically rewrite all monopoly decisions if the gov decided that having a store monopoly over ones own device is illegal. (Not that reality ever gets in the way of wishful thinking).

A rewriting of monopoly laws is coming, whether American companies and the American legal system like it or not. It's on the way, and it's going to radically rewrite the fundamental underpinnings of Apple's business model. That's one reason why Apple is diversifying.

*edit* To be clear, the future of competition law is likely to be regulation based upon vendor / customer power in isolation, not cross-market monopoly power. John Deere won't be able to say "you can buy a different brand of tractor if you don't like our terms"; they'll be regulated based upon the power they hold over customers *during* the vendor / customer relationship.

Likewise, Apple is barely over 50% of the smartphone market in the United States (its largest market), and less than that in any other country. That’s a stretch to call a “monopoly”

As has been stated a zillion times by competition regulators, the market is not defined as "Smartphones", it is defined by platform. iOS is a market, Android is a market, and Apple and Google are going to be regulated based upon the power they hold within those markets.

This is compounded by the fact that their main competitor platform is completely open. Consumers have a choice for an open platform!

That's not an argument swaying regulators so far. Apple tried it in the EU over the common charging plug directive, and it failed.

Technological progress doesn’t revolve around games.

Actually, in computing, it kind of does. Games tech is at the cutting edge of most things, and Games engines are eating the world when it comes to any industry where 3d content is involved.
 
  • Like
Reactions: iPadified

JMacHack

Suspended
Mar 16, 2017
1,965
2,422
3rd-party payment processors, sure, but the way any judgement is likely to come down on that is that Apple will be unable to charge more than single digits for their store service (given Microsoft set an example for the Windows store of 0% if using an external payment processor), so the games revenue is gone all the same. Their pissing about with a 3% discount for Dutch developers who use 3rd-party payment processing is going to blow up in their face. AND, it's looking pretty likely that the EU will mandate side-loading, so again, alt stores are going to become a thing.
Yeah I wouldn’t hold my breath.
A rewriting of monopoly laws is coming, whether American companies and the American legal system like it or not.
Wishful thinking by people who have zero understanding of how things work in reality.
It's on the way, and it's going to radically rewrite the fundamental underpinnings of Apple's business model. That's one reason why Apple is diversifying.
Have you ever been in a business before? Diversification is something all businesses do. As explained to me by a COO friend, a business has to expand and diversify otherwise it declines.
*edit* To be clear, the future of competition law is likely to be regulation based upon vendor / customer power in isolation, not cross-market monopoly power. John Deere won't be able to say "you can buy a different brand of tractor if you don't like our terms"; they'll be regulated based upon the power they hold over customers *during* the vendor / customer relationship.
You’re conflating “right to repair” with platform gatekeeping. The controversy in “Right to Repair” is access to first party repair information and parts, which are kept away from third parties.

Which has nothing to do with the App Store.
As has been stated a zillion times by competition regulators,
By lawyers on one side, in the case of Epic as I’ve been following.
the market is not defined as "Smartphones", it is defined by platform. iOS is a market, Android is a market, and Apple and Google are going to be regulated based upon the power they hold within those markets.
Here’s where that goes off the rails: monopoly law in the U.S. is typically based on harm to the end consumer.

It is an uphill battle to say the least to prove harm to end consumers based on App Store monopolies on a single platform, which has reasonable competition (which only doesn’t exist if you narrowly and arbitrarily define the market as a “platform”)

In fact, the success of the iPhone against the more open platform can be an argument of consumer preference in itself. (And no, calling iPhone users “brainwashed sheeple” won’t hold up in court)

If Apple users were harmed by using the App Store, wouldn’t it follow that they’d switch to the alternative? Or at the very least refuse to upgrade to the newest iOS in order to jailbreak easier?

And if the harm is done by pricing as claimed, the path of least resistance would be to force the use of third party payments at no charge.

Hence, the most likely outcome being third party payments.

That's not an argument swaying regulators so far. Apple tried it in the EU over the common charging plug directive, and it failed.
The EU was hell-bent on that as an environmental agenda; the reason they constantly brought up was that a common charger would reduce e-waste.

Which is a totally irrelevant argument when it comes to software.

I’m no expert in how the EU functions legally, but I imagine that they’d also have similar problems as the US.
Actually, in computing, it kind of does. Games tech is at the cutting edge of most things, and Games engines are eating the world when it comes to any industry where 3d content is involved.
Most 3D artists I know work in Blender after dropping SFM. If you’re speaking about using Unreal 5 to set up scenery you’re sadly missing the bigger picture in that it’s one tool of many.

And it doesn’t involve playing games on it.
 

mattspace

macrumors 68040
Jun 5, 2013
3,179
2,879
Australia
Yeah I wouldn’t hold my breath.

Well I guess we'll have to see - my bet is within 5-10 years, Apple in much of the world will be prohibited from tying its payment processing business to its app / media store business, and also prohibited from tying its app store business to its device business.

Sideloading is more or less a guaranteed outcome of that, because Apple is using its platform control to achieve anti-competitive outcomes. They are a convicted antitrust violator in America. The clear, overwhelming evidence of the App Store is that every claim Apple makes about the safety of their sole App Store is falsifiable by the sheer number of scam and fraudulent apps on Apple's "curated" store.

Basically, Apple will be functionally prohibited from compelling developers to pay them anything, in order to sell software to consumers. They may begin charging for their development tools, but 3rd party development tools will be available. I would not be surprised to see them lose the sole notarisation authority, or at least the ability to charge on anything other than a FRAND basis.

Contrary to your assertions, it is competition regulators internationally who are framing iOS and Android as separate markets, rather than alternatives within a "Smartphone" market.

You’re conflating “right to repair” with platform gatekeeping. The controversy in “Right to Repair” is access to first party repair information and parts, which are kept away from third parties.

They're both examples of the same argument - "if you don't like our terms, you can buy our competitor's product, which may offer what you want". It's a facile argument that discounts the power imbalance inherent in non-transferable investments in the lock-in aspects of a platform, be it capital investment in tractors, or software and media libraries.

When you have hundreds of dollars in an Apple Books library, an Android phone is not a competitive alternative to an iPhone, because the content library is not transferable. Apple could offer Apple Books on Android and defuse that issue, but they likely won't, because the content is a part of their lock-in strategy.

Lock-in in general is being critically examined, if you listen to what regulators are saying.

Here’s where that goes off the rails: monopoly law in the U.S. is typically based on harm to the end consumer.

American law will not be the determiner of this. It will be the EU, Asia, even Australia, etc., who set the terms for Apple's future.

Consumer harm is not the only issue at hand; 3rd-party developers are Apple's customers as well, and Apple's treatment of them is very much a topic of concern for regulators. To take eBooks as an example: if an author sells on Apple Books, Apple takes 30%. If they decide to sell on a 3rd-party book app, Apple still takes 30%, BEFORE the book app takes its percentage to cover its costs, leaving the author with ~40% of the cover price. So right there, Apple is using its control over the platform to advantage its own book store, to the detriment of competing book stores. Worse still, Apple can simply refuse to allow the competing book store app to exist at all, and maybe do it on a whim, after the app has been developed (which was literally revealed as their strategic plan during the Epic discovery - "iBooks will be the sole bookstore on the iPad").

This is one of the reasons the EU has imposed binding third party adjudication over Apple's App Store decisions.


If Apple users were harmed by using the App Store, wouldn’t it follow that they’d switch to the alternative? Or at the very least refuse to upgrade to the newest iOS in order to jailbreak easier?

The sunk cost of Apple-exclusive content, and its lock-in effect, is a consideration for regulators, which counts against the idea that Android is an equivalent competitive option to iOS. That's the great joke: IF Apple allowed sideloading and alternative stores, Android WOULD likely constitute an alternative within a single smartphone market. Everything Apple has done to create unique lock-in has effectively sabotaged their argument that consumers can just "switch to Android".

Just goes to show that Evil contains the seeds of its own destruction.

I don't know if you've tried to keep an iOS device on an old OS version, but it's next to impossible - if your device is capable of running a newer OS, you don't get security updates, even though devices which max out on the version you're using do.

For example, the last version of iOS 12 an iOS 15-compatible iPhone 6S can run is 12.4.1. The current security-updated version of iOS 12, for devices that max out on iOS 12, is 12.5.5.

Most 3D artists I know work in Blender after dropping SFM.

Maya & ZBrush are still the primary toolchain within professional studios. Blender is what kids coming out of games degrees are learning, but companies are built upon Maya pipelines.

If you’re speaking about using Unreal 5 to set up scenery you’re sadly missing the bigger picture in that it’s one tool of many.

Unreal, for dynamic rear-screen projection & VR tracking of cameras, is now the ascendant shooting and camera-control environment for on-set production. Whether people are playing games in it or not isn't the issue; games engines that leverage GPU power are likewise the ascendant generic 3D environment to host any application that deals with volumetric media or tasks, be it CAD, ArchViz, or games.
 

JayKay514

macrumors regular
Feb 28, 2014
179
159
And in doing so, they've made Mac gaming an inferior gameplay experience to console gaming. The only point of playing games on a computer is to get a BETTER graphical experience than you get on a console.
Uh.... The point to playing games on a computer is to play a game on your computer.

There are billions of people who play games on their computers that are not AAA, push-the-limit-of-the-specs, 200fps 4k team shooters with ray tracing where, as comedian Kumail Nanjiani said, "you can see people's shoelaces bounce... but they couldn't be bothered to Google what language they speak in Pakistan."

It's obvious that Mac gaming has been a niche market for decades (I used to work in computer retail and our Mac gaming section was always pitiful). But to be fair, high-end PC gaming is also a niche. Moderate-power PCs run most games fine.

In fact, certain high-power gaming PCs cannot even be purchased in some places (e.g., under California's energy-efficiency regulations) because of their drain on the electric grid (and the subsequent effect on CO2 emissions from power plants).

It could be partly for this reason that Apple is moving to the "performance per watt" / SoC model and the idea of the computer as a console, of a sort.

...Back to the topic of the AS Mac Pro, it's not going to be a general purpose PC. It's their flagship high-end workstation aimed at animators, designers, video editors, coding, scientific computing, architecture / CAD, etc.

Do those customers care about having the latest and greatest graphics card, or do they care about what the machine can do for them?

While we've already seen that the M1 Ultra can match or beat the current 22-core Xeon W, the AS Mac Pro has to at least match the 7,1 in:

  • graphics performance from the MPX modules (maybe as high as the W6900X)
  • Afterburner video stream codec acceleration
  • system RAM (up to 1.5-2TB)
  • onboard expandable storage
The M1 Ultra's graphics are, reportedly, 80% faster than the W6900X, the $7,500 CAD top-end MPX module. So I'd expect a next-generation chip to be even faster. So, check. ✅

ProRes acceleration is already built in. The M1 Ultra can run massive numbers of 4k & 8k video streams. Check .✅

Massively expandable system RAM might require innovation here. I don't know if they would add some sort of bridge to traditional DIMMs, if it would slow down CPU-memory bandwidth.

Short of massively increasing the size of the main SOC with more RAM chiplets, maybe what they'll have is expansion based on the high-speed die interconnect they use on the M1 Ultra; basically, one or two "RAM SoCs" socketed right next to the CPU with direct lines. Yeah, it'd be custom proprietary RAM from Apple, and it likely won't be cheap. Might even require its own heatsink?

I would hope they would offer four standard PCIe NVMe M.2 slots on the motherboard for storage; then again they might go for those custom SSDs as in the current Mac Pro (hopefully not, but it's Apple.).

More of a question is: would they include standard PCIe slots? I'm guessing maybe few to none. Certainly not the 8 slots as on the 7,1.

One of the reasons I was considering building a PC was because I was interested in using Universal Audio's DSP cards to run their plugins for audio production. That said, the DSP chips UA still uses are really, really old. Modern CPUs can run native plugins just as fast, and new architectures like the M series can run them even faster.

If you recall when the 2019 Mac Pro was first demoed, they showed Logic running with 1000 tracks. There are videos showing the M1 Mini running almost that many tracks (with relatively CPU hungry plugins, too.). So likely no need for slots for DSP cards.

If the aim is to be less power-hungry and the onboard graphics are fast enough, then there's no need for MPX modules or traditional slots, plus a beefy power supply with 6-pin extra PCIe power cables for graphics cards.

I can see a use for PCIe cards to add additional SSDs, USB and Thunderbolt connections, and some niche interface cards for external equipment like broadcast gear, specialty storage devices or high-speed networking. But nothing that requires massive amounts of electricity. Maybe a couple of 16x slots and a couple of 4x slots, because at PCIe 5 speeds even 4x is pretty fast.
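
(For a sense of scale - assuming PCIe 5.0's 32 GT/s per lane and its 128b/130b encoding - a x4 slot works out to:

$$ 4 \times 32\,\mathrm{GT/s} \times \tfrac{128}{130} \times \tfrac{1\,\mathrm{byte}}{8\,\mathrm{bits}} \approx 15.8\,\mathrm{GB/s\ per\ direction} $$

which is roughly the full bandwidth of a PCIe 3.0 x16 slot.)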

So I'm guessing the 8,1 will look like a downsized 7,1.
 
  • Like
Reactions: Nugget and JMacHack

mattspace

macrumors 68040
Jun 5, 2013
3,179
2,879
Australia
While we've already seen that the M1 Ultra can match or beat the current 22-core Xeon W, the AS Mac Pro has to at least match the 7,1 in:

In certain circumstances. We've also seen how it gets absolutely trashed in sustained heavy loads when compared to Xeon systems.

  • graphics performance from the MPX modules (maybe as high as the W6900X)

No, it has to match the performance of whatever has superseded the 6900 and the 6800 Duo, and preferably whatever Nvidia has on the market by that stage.

  • Afterburner video stream codec acceleration
  • system RAM (up to 1.5-2TB)
  • onboard expandable storage

No M1 has achieved these things so far, and it would require, what, 16 M1 Ultras to get to 2TB, so say goodnight to power consumption or price... a 128GB M1 Ultra is what, $6k? So there's ~$96k for 2TB's worth of 128GB M1 Ultras.

The M1 Ultra's graphics are, reportedly, 80% faster than the W6900X, the $7,500 CAD top-end MPX module. So I'd expect a next-generation chip to be even faster. So, check. ✅

That's absolutely not true. Go look around here at the Metal results people are posting for cMP systems - the M1 Ultra delivers maybe 2/3 the power of a 6800 XT. The M1's supposed graphics performance is on very, very specific tasks for which it has specific hardware acceleration, e.g. Apple ProRes.


ProRes acceleration is already built in. The M1 Ultra can run massive numbers of 4k & 8k video streams. Check .✅

But it can't output to an 8K display.


So I'm guessing the 8,1 will look like a downsized 7,1.

It'll be priced like an up-sized 7,1 - that, I think, is likely the only thing one can easily bet upon.
 

JMacHack

Suspended
Mar 16, 2017
1,965
2,422
Uh.... The point to playing games on a computer is to play a game on your computer.

[snip]
You’re missing the main argument here, which is “MUH GAMES”.

If the computer doesn’t play “MUH GAMES” better than every other computer then it is ****.

And if the best way to play “MUH GAMES” is to buy an NVidia 6090 for $3000 then by god Apple better provide it or they’ll be doomed.

Because “MUH GAMES” drives any and all technology, is the biggest and most powerful market in the world by far, and anything that doesn’t do it is clearly inferior.
 

JayKay514

macrumors regular
Feb 28, 2014
179
159
In certain circumstances. We've also seen how it gets absolutely trashed in sustained heavy loads when compared to Xeon systems.

No, it has to match the performance of whatever has superseded the 6900 and the 6800 Duo, and preferably whatever Nvidia has on the market by that stage.

No M1 has achieved these things so far, and it would require, what, 16 M1 Ultras to get to 2TB, so say goodnight to power consumption or price... a 128GB M1 Ultra is what, $6k? So there's ~$96k for 2TB's worth of 128GB M1 Ultras.

That's absolutely not true. Go look around here at the Metal results people are posting for cMP systems - the M1 Ultra delivers maybe 2/3 the power of a 6800 XT. The M1's supposed graphics performance is on very, very specific tasks for which it has specific hardware acceleration, e.g. Apple ProRes.

But it can't output to an 8K display.

It'll be priced like an up-sized 7,1 - that, I think, is likely the only thing one can easily bet upon.
Hmm. All good points.

That said, seeing how well the M1 Ultra performs leads me to believe that they *could* meet those targets if they wanted to. The fact that their SoC is 2/3 of a 6800 is still pretty impressive, so it's not impossible for them to scale that up. Could also be that this is a constraint of the Mac Studio's form factor (power supply & cooling). To get that kind of power in a small form factor *is* pretty neat.

It may well be that the 8,1 will use a hybrid SoC CPU with standard PCIe lanes and memory controllers, for expandable ECC RAM and PCIe graphics.

Maybe a 2-tier approach with simultaneous onboard graphics + RAM, and outboard RAM + PCIe graphics. There are probably use cases where people need multithreaded CPU power more than big displays, so an onboard-graphics-only base model makes sense.
 

mattspace

macrumors 68040
Jun 5, 2013
3,179
2,879
Australia
That said, seeing how well the M1 Ultra performs leads me to believe that they *could* meet those targets if they wanted to. The fact that their SoC is 2/3 of a 6800 is still pretty impressive, so it's not impossible for them to scale that up. Could also be that this is a constraint of the Mac Studio's form factor (power supply & cooling). To get that kind of power in a small form factor *is* pretty neat.

I guess the real question is, "what is this getting for consumers?"

It's not getting us cheaper computers, it's not getting us more performant computers, and it's arguable whether, in the next couple of years, the energy consumption difference will be measurable, or even important, given the relative insignificance of a domestic computer's draw compared to most other household appliances, and the absolute irrelevance of a commercial / industrial computer's power draw. The lifetime energy savings of a new Mac Studio over an existing Mac Pro are probably nullified by the energy cost of making the Mac Studio in the first place.

It may well be that the 8,1 will use a hybrid SoC CPU with standard PCIe lanes and memory controllers, for expandable ECC RAM and PCIe graphics.

Maybe a 2-tier approach with simultaneous onboard graphics + RAM, and outboard RAM + PCIe graphics. There are probably use cases where people need multithreaded CPU power more than big displays, so an onboard-graphics-only base model makes sense.

Sure, but given card-based graphics outperforms on-die graphics and costs less, why not just use a lower-end card-based GPU and a big processor for a unit that doesn't need big graphics, and perhaps vice-versa?

To look at the evidence, it's pretty clear Apple wants all Macs to be like the iPhone and iPad - what you buy initially is what you get, and it's all you ever have until you buy a new one. And the next stop: hardware as a subscription service.
 
  • Like
Reactions: OS6-OSX