
spaz8

macrumors 6502
Mar 3, 2007
492
91
The unified memory on the AS Macs is interesting, but I'm not sure how many people training even 50+ GB ML models don't have a company or institution bankrolling them to buy NV GPUs that might train them in 1/10th the time of an M1 Ultra. Again, that's why I think the GPU performance of the AS MP is the most interesting unknown; more so than even whether the AS MP is 2x+ faster than the Mac Studio with the Ultra, which might be almost literally two Mac Studios bolted together.
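For scale, a back-of-envelope sketch (hypothetical model sizes and a rule-of-thumb bytes-per-parameter figure; none of these numbers come from the thread):

```python
# Rough training-memory estimate: weights + gradients + optimizer state.
# ~16 bytes/parameter assumes fp32 weights with Adam-style state
# (4 B weights + 4 B gradients + 8 B optimizer moments).

def training_footprint_gb(params_billions: float,
                          bytes_per_param: float = 16.0) -> float:
    """Approximate GB of memory needed to train a dense model."""
    return params_billions * bytes_per_param

# A hypothetical 7B-parameter model needs roughly 112 GB of state:
# past any single consumer GPU's VRAM, but inside a 128 GB unified pool.
print(training_footprint_gb(7))  # 112.0
```

Inference-only footprints (weights alone) are roughly 4x smaller, which is why the "who actually trains locally" question matters.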
 

mattspace

macrumors 68040
Jun 5, 2013
3,179
2,879
Australia
Big models that will tank on GPUs without sufficient VRAM.

these workloads exist

Do you think that's a larger addressable market, than people who simply need a GPU that is competitive on overall performance, and can be upgraded during the working life of the computer?

If that workload exists, it's surely already being addressed with standard GPUs.

Workloads expand to fill resources, so it's not like Apple can field a machine with 500GB of VRAM, and say "we've solved the problem", because the next day, someone will load up a model that fills that space... and then what?
 
  • Like
Reactions: AAPLGeek

throAU

macrumors G3
Feb 13, 2012
8,944
7,103
Perth, Western Australia
Do you think that's a larger addressable market, than people who simply need a GPU that is competitive on overall performance, and can be upgraded during the working life of the computer?

If that workload exists, it's surely already being addressed with standard GPUs.

Workloads expand to fill resources, so it's not like Apple can field a machine with 500GB of VRAM, and say "we've solved the problem", because the next day, someone will load up a model that fills that space... and then what?

Not sure Apple cares. They're building the Mac Pro for the workloads people actually use it for, not PC workloads and gaming.

And no, the workload types that exist suited to this unified architecture are not being addressed by discrete GPUs due to the limited vram size.

That’s the entire point and why Apple are building machines this way.

So all the ram can be accessed by all the different processor types in the device, not limited to smaller discrete islands of memory with limited bandwidth between them.

A discrete CPU/GPU would not be able to leverage the ML (or other) cores on the same data without shuffling it over a bus, wasting processing time and power.

And if you think power efficiency doesn’t matter and is only for laptops - guess again. 1000 watts doing work more efficiently is still a thing. There are power and cooling limits even on workstations.
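The bus-shuffling cost is easy to put rough numbers on. A minimal sketch, using nominal peak bandwidths (illustrative assumptions, not measurements from any machine):

```python
# Seconds for one full pass over a working set: moved across a PCIe 4.0
# x16 link versus read in place from a unified-memory pool.

PCIE4_X16_GB_S = 32.0   # ~32 GB/s nominal peak, PCIe 4.0 x16, one direction
UNIFIED_GB_S = 800.0    # M1 Ultra-class unified memory, order of magnitude

def pass_seconds(working_set_gb: float, bandwidth_gb_s: float) -> float:
    """Time for one full traversal of the working set at a given bandwidth."""
    return working_set_gb / bandwidth_gb_s

ws = 96.0  # GB working set
print(pass_seconds(ws, PCIE4_X16_GB_S))  # 3.0 s per copy over the bus
print(pass_seconds(ws, UNIFIED_GB_S))    # 0.12 s touched in place
```

Every extra copy over the bus also costs energy, which is the power-efficiency point being made here.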
 

Matty_TypeR

macrumors 6502a
Oct 1, 2016
638
548
UK
No matter how you dress Apple up as a saviour of the workplace or its workloads, the M-chip ecosystem is still a closed system: Apple hardware is the best and all you will be able to use, like it or not. Yet their own API, Metal, is thrashed by other GPUs, even though Apple promotes the new Metal as groundbreaking for games, which they also wish to control on Apple devices.

VM machines are not as reliable as truly booting into an OS, simple as that. Having run Parallels Windows, I know the difference when talking directly to platesetter RIPs for fine-art printing at 220 dpi with high-res files. VM is OK, but you can't beat the real thing.

We shall see how well Apple does when compared to dual 7900 XT cards with 24 GB VRAM each, and much faster, or whether Apple even allows the 7900 series to be supported, as I can't see them allowing dual 7900 cards x2 with 96 GB of VRAM; they would destroy the M2 series. And then Apple would have to release AMD drivers that are on par with Windows performance, which they are far from with the current 6000 series.

Don't be fooled by Apple's supposed "we are the best"; they even slowed down their own older iPhones so people would buy new iPhones, and were found out. So Apple is not honest about everything, and if you want to buy into Apple M-chip tech, be aware: no upgrades, and after 3 years with no AppleCare, any failure of any part means it's bin time. No matter how power-efficient it is, the cost of repair will be close to new, if Apple even has parts available to fix it.

And the biggest point of all: NO upgrade path at all. If you wish to upgrade, a new machine will be your only option. So hardly a green stance, apart from power efficiency.

It doesn't bother me that my GPU might be 30 seconds slower in some tasks, but at least after 18 months I could upgrade it to the latest tech without upgrading the whole machine.

The M chips are good in laptops, Mac minis, even the big Mac mini (the Studio), or the iMac, where on-chip GPUs are suited, but not in a pro machine. Upgrade paths are essential.
 

deconstruct60

macrumors G5
Mar 10, 2009
12,309
3,902
Any new Mac Pro with any M chip is not a true Mac Pro.

Mac Pro stands, and always has, for upgradability.

CPU socket: upgradable.
GPU via PCIe: upgradable.

Does Apple encourage or endorse CPU upgrades in the Mac Pro 2019 (or 2013)? No. Not particularly on the 2009-2012 models either. It is there more as an artifact of Intel's design decisions than as a high-priority Apple objective. To claim that this is an Apple "Mac"-driven property is likely overreach. It is a side effect of using what Apple had available to buy off the shelf.

( e.g., The Mini 2018 uses BGA Core i5 , i7 packages and Apple was 100% fine with that. )

Intel/AMD have to sell some desktop/workstation CPU packages to multiple system vendors who use multiple motherboard developers. There are also multiple DIY motherboard vendors. In the x86_64 laptop space there are not multiple DIY motherboard vendors and the laptop product is mainly custom/semi-custom boards with BGA packages.

Apple is going to make socketed, Apple-only SoCs for whom? What other boards are going to take these SoCs? None. So why a commodity part? "Socketed because it's gotta be socketed" is primarily form over function. Intel/AMD sell into a commodity market where the socket serves a function for that market. There is no commodity market here in Mac Pro space, so it is 'form' in search of a function.



The Mac Pro 2009-12 models had a CPU tray to accommodate single and dual CPU package variants of the Mac Pro with a single base chassis. The MP 2013 also used a proprietary connector to put subsystems on secondary cards. There is a path where Apple puts the main SoC and the required close-proximity chips all on one card. It would pragmatically be a proprietary socket for a non-commodity component. Pretty good chance this also herds the solution into one-slot-wonder status, as running more than one (or two) x16 PCI-e standard connections off the board will get expensive (there is also USB4 and Thunderbolt 4 to run off it if there are 6-8 ports on the system, plus all the power input it will likely need through the socket).




Sound card via Pcie Upgradable.

Chuckle. Even the Intel 'Server' chipset in the Mac Pro 2019:

" ... Intel® HD Audio Technology Yes ... "
https://ark.intel.com/content/www/us/en/ark/products/97338/intel-c621-chipset.html


The idea that basic sound isn't going to be standard integrated functionality in 2022 and beyond is kind of odd. Intel has standardized on integrating sound into the basic chipset for more than several years now. AMD may be lagging behind the trend curve, but railing against integrated sound is pretty much a luddite move. That change vector started well before Apple released the M-series.

[ AMD Ryzen 7000 desktop packages are getting a basic iGPU and sound now that their I/O die transistor budget is much higher. This could easily trickle into the Epyc line-up over time; perhaps not this generation, but within the next 2-3 iterations. ]

You can supplement the sound of the Mac Pro 2019, but you can't directly upgrade the standard sound that ships soldered into the device.



Storage options via PCIe: upgradable.

Even the one slot wonder prototype could have taken one of these.




Which is suggestive that Apple could have been thinking about a one-slot-wonder box: a "Mc/Mac Fiver" in an x8 PCI-e v3 slot (possibly provisioned from an x4 PCI-e v4 lane bundle).




Memory to user's needs: upgradable.
More than one operating system can be booted (Windows): upgradable.

M chip Mac Pro:

Fixed CPU: any failure means a new board.
Fixed memory: again, any failure means a new board.
Soldered storage (external available via Thunderbolt): internal storage fails, new board.


Every high end Mac from 2017 onward ( last 5 years) has not had soldered NAND storage. None.
iMac Pro 2017 . No. Mac Pro 2019. No. Mac Studio. No.

Lots of hand-waving and FUD about how the next Mac Pro is going to have unfixable NAND-storage failure issues. Apple has demonstrated an extremely viable solution for that for half a decade. What is going to make them abandon the working solution they already have?


Soldering in the LPDDR5 RAM in very close proximity to the CPU lowers the power consumption substantially.
From Nvidia material about their new Grace CPU (that also uses soldered on LPDDR5 )


[Image: NVIDIA Grace memory comparison: HBM2e / DDR5 / LPDDR5X bandwidth and power (Hot Chips 34)]



The notion that Apple's design decision here isn't 'buying anything' is weak. Yes, it is soldered, but it also uses 1/8 the power; power which they can apply to the CPU, NPU, and GPU cores instead of to I/O overhead. It is pragmatically a "less expensive HBM" memory solution: most of the HBM benefits at a more affordable price point.
If you want to go slower and take a "lay-away, pay over time" approach, then DDR5 is better. If super-high capacity is critical, then DDR5 is better. However, if very high performance is a leading factor, those arguments aren't as well motivated.


PCI-e v4 has a similar issue versus NVLink or UltraFusion, with around an 8x power increase. The Mac Pro should be large enough that spending that much extra power shouldn't be a major blocker. But we're also somewhat unlikely to see more than 40-60 lanes provisioned; likely a trade-off of more lanes than the laptop SoCs get, but no objective of entering a "lane count" war with competitors.


CPU/RAM failures, absent user socket-insertion issues, overvolting or out-of-spec operating settings, and radiation (bit-flip) events, are how common? Show real documented numbers from a solid experimental methodology, and outside an initial warranty period.



Fixed GPU: any failure there, and again a new board.

Similar solid-state failure rates (i.e., not high).


No external Egpu support.

This is not in any way, shape, or form a "Mac Pro" issue. It is a software-driver issue tied to the Mac ecosystem, Apple's priorities, the future boot environment and security direction, and primary objectives. There is no 'hardware' issue here at all.





No other operating system support

Virtual machines are being supported.

Possibly one 16x PCIe slot, but what hardware will it support? A sound card, possibly, if driver-supported.

> 50 cards (and counting; the number goes up every quarter or so) already work.



All this requires you to have AppleCare, because any failure will be costly: a new board to get back to the original config. After 3 years with no AppleCare it's risky, as replacing it out of your own pocket to the original config is expensive. Do you really want or need to buy a new Mac Pro every 3 years just to have a warranty and an upgrade path?

Mostly misdirection. You don't get a longer warranty whether Apple uses highly modular parts or not. The Mac Pro's price point isn't going to change Apple's warranty policies.

Solid state parts tend not to fail.



That is with NAND devices in SSDs, which do wear over time. The number of SSDs that go 'belly up' because the SSD controller (not the data or metadata on the drive) failed is dramatically smaller still.

Anyone buying used, cryptomining-abused GPUs isn't getting a better deal because they're modular. Don't abuse the solid-state components, and the failure rates are quite low.
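The wear argument is worth putting numbers to (a sketch with a hypothetical endurance rating and write rate, not any specific drive):

```python
# NAND does wear out, but rated endurance is large relative to most
# real workloads. The TBW rating and daily writes below are made-up examples.

def years_to_exhaust_tbw(tbw_rating_tb: float,
                         gb_written_per_day: float) -> float:
    """Years until the drive's rated terabytes-written are used up."""
    return tbw_rating_tb * 1000.0 / gb_written_per_day / 365.0

# e.g. a drive rated for 1200 TBW under a heavy 100 GB/day of writes:
print(round(years_to_exhaust_tbw(1200, 100), 1))  # 32.9 years
```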



What the new Mac Pro should be:

AMD Threadripper CPU, upgradable.

Extremely unlikely to happen, as macOS doesn't support the number of threads that the latest-generation Threadripper provisions. ( > 32 SMT cores is a problem and a mismatch for macOS. )




DDR5 memory, upgradable.

See chart above.

AMD 7000 series upgradable to the 8000 series within 3 years.

There's a large ecosystem software hurdle to get over here. Not really a Mac Pro issue.

Storage upgradable via a TB5 PCIe card offering 80 Gb/s transfer speeds.

A generic PCI-e v4 x16 slot (or two) would be better than that.

[ Some of the additional cost and complexity of the Mac Pro 2019 is driven by non-integrated TBv3. Non-integrated TB is going to disappear over time; it will be like non-integrated nominal USB in the standard chipset. Not even Intel/AMD server I/O chipsets skip integrating USB. Thunderbolt is on the same path long-term. ]




PCIe slots for sound and storage cards.

Yeah, a rumored one-slot system would be weak. But Apple isn't far off track here; going from one slot to more than one shouldn't be a hard adjustment in the shift from the M1 to the M2 generation SoC. (That M1 version is highly likely never going to see the light of day, since Apple said the M1 line-up was "done" back in March.)



Different operating-system options for those who need them. Mac OS X isn't best at everything.

Buying a Mac primarily not to be a Mac is goofy. It doesn't make any fiscal sense to buy a system that isn't good at the majority of the stuff you want to use it for. That is a pretty good indication that you are buying the wrong thing.

Modern M-series Macs can run VMs well enough to cover a substantial number of the edge cases where macOS doesn't work as well.
 

Matty_TypeR

macrumors 6502a
Oct 1, 2016
638
548
UK
So if you have a socket that can take an upgraded CPU, you shouldn't use it because of Apple? How many 4,1, 5,1, 6,1, and even 2019 machines have had CPU upgrades? Frowned upon by Apple or not, it's an easy upgrade path.

Same with the memory upgrade path: the 5,1, 6,1, and 7,1 can all have memory upgrades, as all previous Mac Pros could. Exactly the same with GPU upgrades in previous Mac Pros.

My whole point is that the new Mac Pro should also be upgradable, and as explained before, VM Windows is not the same as booting Windows; it can't use all the system RAM or CPUs in VM-booted Windows. And goofy or not, I do boot Windows, as not all OS X software is best for everything, simple as that; for a start, it won't talk to our RIPs.

Main point: any new Mac Pro has to be upgradable, at least.
 

deconstruct60

macrumors G5
Mar 10, 2009
12,309
3,902
VM machines are not as reliable as truly booting into an OS, simple as that,

There is over $1B per day of financial transactions that says this is completely full of male cow droppings.
You can't buy a z-Mainframe that doesn't have VM support. Real big iron that has to meet very high availability service standards comes with VMs by default.

VMs are reliable. What they don't work so well for are applications that do extremely quirky things with the hardware at low levels. That has nothing to do with the reliability of the VMs themselves and a lot more to do with the questionable quirks present in some applications.


AWS, Azure, Google Cloud, etc. run buckets of VMs that keep hundreds of businesses up and operational every day. Not sure where MacRumors is hosted, but this whole forum could be provisioned off VMs.




having run Parallels Windows, I know the difference when talking directly to platesetter RIPs for fine-art printing at 220 dpi with high-res files. VM is OK, but you can't beat the real thing.

As if Parallels were the ultimate player in the virtual-machine market. *cough*. Parallels is about a generation (or two or three, depending on how broad a look you take at the VM solution space) behind what can be done with an IOMMU and virtual I/O passthrough.

If these are "plate setter RIP" drivers that are 32-bit Windows drivers, then this is clearly not about reliability, but about antiquated (likely zombie-support) drivers. In 2025, when the 32-bit Windows versions get dropped, it isn't going to be a 'reliability' problem.

Even with 64-bit x86-64 drivers that are comatose, it is still a moot point for either macOS or Windows on ARM. Again, an "old, effectively abandoned" driver problem more than a reliability issue.


We shall see how well Apple does when compared to dual 7900 XT cards with 24 GB VRAM each, and much faster, or whether Apple even allows the 7900 series to be supported, as I can't see them allowing dual 7900 cards x2 with 96 GB of VRAM; they would destroy the M2 series.

Apple isn't going to spend 600W to provision a solution. They didn't for the Mac Pro 2019 and very probably won't for the next Mac Pro either.

Covering a single 7700-7800 (and the old 6800-6900) will cover a significant share of the addressable market the MP 2019 covers now.


And getting dual 7900XT into a Mac Pro 2019 isn't necessarily going to be supported under macOS either.


And then Apple would have to release AMD drivers that are on par with Windows performance, which they are far from with the current 6000 series.

Apple released new AMD drivers primarily to cover new Macs. If there are no new Intel Macs, then it is likely no new AMD drivers are coming. Folks have been booting into Windows to continue to track newer, bleeding-edge Nvidia GPU cards; AMD can be tracked going forward with the same workaround.

But if AMD isn't getting a major new contract for next-generation, pragmatically embedded GPU dies, then they are not likely to write drivers for them. (Some large payment from Apple would need to flow in.)


It doesn't bother me that my GPU might be 30 seconds slower in some tasks, but at least after 18 months I could upgrade it to the latest tech without upgrading the whole machine.

The CPU socket in an MP 2010 or 2019 doesn't get you the latest-technology CPU solution. Modular sockets do not, in and of themselves, get you "latest tech" compatibility over the long term. Likewise, DDR4 sockets don't get you DDR5 memory, and PCI-e v3 slots won't get you CXL 2.0-capable sockets over the next 2-3 years.

What you're hoping for is new tech that has been lassoed with a limiter dragging it back to the past instead of forward.
 

Lihp8270

macrumors 65816
Dec 31, 2016
1,119
1,588
If there’s no Mac Pro announced this year it will be telling of where their performance level is in comparison to desktop workstation class machines.

Apple won’t announce a Mac Pro that can’t compete for raw performance with Intel/AMD/nVidia.

I’d say the only reason for them not to announce would be that they’re not confident they can win on pure performance.
 

deconstruct60

macrumors G5
Mar 10, 2009
12,309
3,902
No matter how you dress Apple up as a saviour of the workplace or its workloads, the M-chip ecosystem is still a closed system: Apple hardware is the best and all you will be able to use, like it or not. Yet their own API, Metal, is thrashed by other GPUs even

This is nonsense: comparing Metal to other GPUs is almost a textbook "apples to oranges" comparison. One is a software API and the other is hardware. How are they even remotely the same thing or classification?

The handwaving you are perhaps looking for is some set of OpenGL or MoltenVK (Vulkan) apps that consistently thrash Metal on macOS. You'll be looking long and hard for those, since those APIs get mapped down onto Metal underneath at this point on macOS on Intel (let alone macOS on Apple Silicon).

If trying to compare games ported to Metal versus the DirectX 12-family API, then we're still relatively in "apples to oranges" land, because that also blends in a substantial set of OS and API mismatches.


The double-edged sword with Metal is that it pushes much of the responsibility for GPU-specific optimization off into the applications themselves. There is a large upside when app developers actually put in tons of work to do the correct and optimal thing. There is a large downside when app developers put in minimal effort and/or have code highly tuned to old and/or different approaches to memory layout and allocation, as well as cache usage.
[ See the problems Intel's "assume an iGPU" drivers ran into when applied to mid-range discrete GPUs with different memory latencies and caching impacts. The more the system made the dGPU look like a shared-memory iGPU, the better the performance got (i.e., ReBAR on, less copying, etc.). ]

A substantive subset of native iPadOS and iOS apps running on new Macs is a value-add feature for lots of folks, and a neutral feature for even more. (Windows is spending substantive effort getting Android apps running. The notion that 'nobody' wants this is deeply flawed, as more folks have iOS/Android at this point than have Mac/Windows.)


though Apple promotes the new Metal as groundbreaking for games, which they also wish to control on Apple devices.

Even more hopeless handwaving. Apps which follow the Apple GPU scaling improvements outlined at WWDC 2022 and leverage the tools announced there actually do see substantive performance improvements. Apple GPU drivers are not entirely mature yet across the M-series implementations, the M1 Ultra being the least mature. Once that education has been deployed across the apps for 6-12 months, then we can talk about whether it "doesn't/didn't work". But at this point there is no clear evidence that it does not; in fact, all the limited evidence so far points to it working.

Does Metal dig a moat around the Apple ecosystem? Yes. CUDA digs a moat around Nvidia. DirectX 12 digs a moat around Windows. "Pot calling the kettle black", pretty much all around. Oddly, though, Microsoft hasn't deprecated OpenGL/OpenCL; they don't particularly support Vulkan, but they do little to hinder it either.
 

deconstruct60

macrumors G5
Mar 10, 2009
12,309
3,902
If there’s no Mac Pro announced this year it will be telling of where their performance level is in comparison to desktop workstation class machines.

More dependent on what fab technology they put it on. If they used TSMC N3, it will slide into 2023, primarily because N3 slid into 2023. There is no other competitive desktop solution on N3. N3 won't necessarily bring an 'amazing' performance bump, but likely also not one as deeply muted as A15 -> A16.

If on N5P and not in 2022, then it is more likely because they are swamped with more work than they have resources for. It isn't really performance; they are spread too thin. (Long term, that would put tons of pressure on cutting down the number of Mac SoCs they do, which would be problematic for the Mac Pro as an outlier edge case. It is likely going to share an "Ultra-class" SoC with the Studio, plus some other non-overlapping SoC that has only indirect economies of scale, i.e. volume to amortize over, supporting it.)


If on N4 and not in 2022 then whether it is competitive or not could be more of an issue.


Apple won’t announce a Mac Pro that can’t compete for raw performance with Intel/AMD/nVidia.

Over the total product spectrum of those respective companies, where they try to sell almost everything to almost everybody? No.

Over the middle of those companies' respective product offerings? Yes, Apple will extremely probably compete.

The open question is how far to the upper fringe do they go. The more substantive question is how far do they need to go to have a viable Mac Pro product. Those are not necessarily the same limits.


I’d say the only reason for them not to announce would be that they’re not confident they can win on pure performance.

Apple's comparisons in the Mac space are almost entirely against previous Macs. Their approach is more "is this the best Mac of this type we have ever done?". So primarily it is going to get couched against the predecessor.

Apple has already done comparisons of the M1 Ultra against the most commonly purchased configuration of the Mac Pro 2019 (16 cores and a W5700). If there is an M2 Ultra in the new Mac Pro as an entry configuration, it is almost guaranteed to beat the old entry point for the Mac Pro 2019.

If there is an M2 "Extreme" with 48 cores ( 32-P + 8-E ) and > 128 GPU cores, there is a pretty good chance it will beat an MP 2019 with 28 cores and a W6800 on a wide variety of benchmarks. Against an MP 2019 with 28 cores, a W6800, and an Afterburner card... even more so on ProRes RAW 8K-heavy workloads.


Do all potential users/buyers compare only with previous Macs? No. But is Apple going to cower in a corner, 'hiding', until they have some 'King Kong' that can beat any Linux workstation someone wants to dream up? No. They don't primarily run their product announcements on what all the other system vendors do.

Is Apple going to cower until they can beat every possible MP 2019 BTO configuration (throw everything and the kitchen sink into the configuration)? Highly probably not. They didn't for the Mini (they are still selling the Intel Mini almost two years later because there is a big hole in the line-up due to substantive configuration backsliding). They didn't for the iMac (backsliding on max RAM and external monitor support; not even a 27" model at all). So they did not for the other two desktops; why would the Mac Pro labor under such a limitation? Apple hasn't given themselves that constraint. Laptops have generally been better no matter the BTO option, but desktops have a substantively different track record.


If Apple can top over half of the common configurations of the Mac Pro 2019, then they will likely check off the Mac Pro as "complete" as far as the transition is concerned. Even more so if they can cover over three quarters of them.



P.S. I suspect there is a small chance the M1 Quad/Extreme didn't work quite right and they are going to skip the problems it would have presented. It slid too long, and it just didn't make sense to produce it in large numbers with aging tech on a post-pandemic-impacted timeline.

But more so, I think they were shooting for late Q4 2022, and some late-'22 technology just slid out from under them (e.g., the MP 2013 waiting on Thunderbolt 2, which pragmatically contributed to a logistical logjam).
 

Boil

macrumors 68040
Oct 23, 2018
3,283
2,899
Stargate Command
Any new Mac Pro with any M chip is not a true Mac Pro.

Mac Pro stands, and always has, for upgradability.

CPU socket: upgradable.
GPU via PCIe: upgradable.
Sound card via PCIe: upgradable.
Storage options via PCIe: upgradable.
Memory to user's needs: upgradable.
More than one operating system can be booted (Windows): upgradable.

CPU - to upgrade means setting aside a perfectly good CPU...
GPU - same, unless you mean adding more GPU, which would be expanding on what one has, not upgrading it...
Sound card - onboard sound is usually better than the Sound Blaster cards of yesteryear; anyone "upgrading" their sound card is most likely actually putting in audio I/O cards for Logic Pro / Pro Tools or the like...
Storage via PCIe - Mac Pros have had replaceable storage for a while...
Memory - TBD; we need to see what Apple actually presents before we can judge it acceptable or not...
OS - back when the Mac Pro was called the Power Mac, you needed an add-in card (which was basically a PC) to run Windows. I would bet you'd say that doesn't count because it was not an Intel Mac Pro; well, guess what: Intel CPUs were only ONE of FOUR chip architectures placed in Mac computers (68xxx / PPC / x86 / ASi), and not a lot of people go into buying a Mac thinking about how it might run Windows...

You want an Intel-powered box that runs Windows? Buy an Intel-powered box that runs Windows, not an ASi-powered Mac that runs macOS...

M chip Mac Pro:

Fixed CPU: any failure means a new board.
Fixed memory: again, any failure means a new board.
Soldered storage (external available via Thunderbolt): internal storage fails, new board.
Fixed GPU: any failure there, and again a new board.
No external eGPU support.
No other operating system support.
Possibly one 16x PCIe slot, but what hardware will it support? A sound card, possibly, if driver-supported.

All this requires you to have AppleCare, because any failure will be costly: a new board to get back to the original config. After 3 years with no AppleCare it's risky, as replacing it out of your own pocket to the original config is expensive. Do you really want or need to buy a new Mac Pro every 3 years just to have a warranty and an upgrade path?

You really need to stop with the whole "totally useless after the three-year warranty is up" thing, because Apple also offers AppleCare+ as an ongoing yearly charge...

[Image: Mac Pro AppleCare+ pricing screenshot]


If any of the above M-chip specs end up being true, with no upgrade path for CPU, GPU, or memory, it will not be a true Mac Pro.

What the new Mac Pro should be:

AMD Threadripper CPU, upgradable.
DDR5 memory, upgradable.
AMD 7000 series upgradable to the 8000 series within 3 years.
Storage upgradable via a TB5 PCIe card offering 80 Gb/s transfer speeds.
PCIe slots for sound and storage cards.
Different operating-system options for those who need them. Mac OS X isn't best at everything.

ASi Mac Pro could be a backplane chassis, with one (or more) slots for the actual ASi SoC/RAM/SSDs, and three or four slots for assorted add-in cards...

New Mac Pro comes out, you order up a new ASi blade, not an entire new Mac Pro...

Slot the "old" ASi blades into a render/server farm chassis...

If there is an M2 "Extreme" with 48 cores ( 32-P + 8-E ) and > 128 GPU cores, there is a pretty good chance it will beat an MP 2019 with 28 cores and a W6800 on a wide variety of benchmarks. Against an MP 2019 with 28 cores, a W6800, and an Afterburner card... even more so on ProRes RAW 8K-heavy workloads.

I think you mean 32P/16E for the CPU core count; as for the GPU, extrapolating from the M2 SoC, we could see a 160-core GPU in an M2 Extreme SoC...?

But if Apple goes the "mixed chiplet" path (two "standard" Mn Max SoCs coupled with two Mn GPU-centric SoCs) you speak of in other posts, we might see something like...

24-core CPU (16P/8E)
200-core GPU (40+40+60+60)
1TB LPDDR5X SDRAM

Then imagine Apple comes up with a way to have ASi GPU/GPGPU add-in cards work with the whole UMA thing, maybe dual- or quad-SoC models, meaning 120-core & 240-core models...

So a hypothetical four slot chassis (assuming the actual 'computer' part of the box is a blade that uses one of the PCIe slots) could have...

24-core CPU (16P/8E)
920-core GPU
4TB LPDDR5X SDRAM

But more so, I think they were shooting for late Q4 2022, and some late-'22 technology just slid out from under them (e.g., the MP 2013 waiting on Thunderbolt 2, which pragmatically contributed to a logistical logjam).

Maybe Apple is trying to stockpile LPDDR5X SDRAM chips of the 64GB variety...? ;^p
 

throAU

macrumors G3
Feb 13, 2012
8,944
7,103
Perth, Western Australia
dual 7900 cards x2 with 96 GB of VRAM, as they would destroy the M2 series.
Again it depends on the working set.

If the working set is 500 GB, then 96 GB will not work effectively. It doesn't matter how fast the GPU is if it is starved for memory, or can't get to it quickly because it's on the other side of one or more PCIe slots.

The Mac Pro will likely have 1-2 TB/sec of bandwidth or more, if the current models are anything to judge by, and that will be used effectively, without dumb copies from pool to pool.
 

deconstruct60

macrumors G5
Mar 10, 2009
12,309
3,902
OS - back when the Mac Pro was called the Power Mac you needed an add-in card (which was basically a PC) to run Windows,

Technically.

https://en.wikipedia.org/wiki/SoftPC

It wasn't a speed demon, but it ran Windows. If you wanted to run at near-native speeds, then yes, there was a relatively very expensive add-in card that didn't sell in high volume. Relatively recently, Intel had a Windows-on-Atom USB-fob product; probably just as fast (if not faster) than that old card. :)

You could put a system in a Thunderbolt 4 box that runs a TB port as a peer-to-peer 10GbE connection with Remote Desktop. You don't really need a card for someone who is desperate for some random Windows apps.

ASi Mac Pro could be a backplane chassis, with one (or more) slots for the actual ASi SoC/RAM/SSDs, and three or four slots for assorted add-in cards...

If the backplane is frozen then it isn't going to track the PCIe changes that are coming. If you lock the backplane at PCIe v3 then you are throwing away substantial capabilities at v5 and v6 (since that is pragmatically also going to mean CXL coverage in the high-end world).

There was a time 10-15 years ago when PCIe was moving at a glacial pace. That probably isn't a good match for the next 3 years.


New Mac Pro comes out, you order up a new ASi blade, not an entire new Mac Pro...

But does that really save much if the bulk of the high-cost items are on the add-in card?

The CPU tray of the 2009-2012 only provisioned 36 lanes (two x16 and one x4 that was switched/shared). There are going to be slot bandwidth provisioning limits if you try to push the SoC off the main logic board. Even less if you have high-end ports to provision (front-socket TBv4, 1-2 10GbE ports, the "SSD modules" kicked off the card, etc.).

If you put most of the TB sockets and the storage interface on the card, it will run high in price.


... 48 cores ( 32-P + 8-E ) ...

I think you mean 32P/16E for the CPU core count; as for GPU, extrapolating from the M2 SoC we could see a 160-core GPU in an M2 Extreme SoC...?

Yes, 4 * 8 and 4 * 4 (I missed an edit there), where the M2 generation does only a minor bump of the E cores to a full 4-core complex (instead of a half 'chop' to two cores to save some minor space), presuming they go to N4/N3 and the extra E cores are 'free'. If Apple is stuck on N5P then the dies are likely "too big" and maybe the bigger dies don't get any core count increases.

There are 8 GPU cores in a basic M1-generation cluster: two clusters in a Pro (16), four clusters in a Max (32), eight in an Ultra (64). So if Apple bumps the cluster by 2 cores (enabled by a node shrink) then we get 10, 20, 40, 80, 160. If this is a somewhat lame N5P implementation I kind of doubt we're going to see the 2-core bump. M2 also got a big bump in bandwidth by shifting to LPDDR5, but the Pro/Max/Ultra are already at LPDDR5. If there is no bandwidth increase or density increase, I somewhat doubt we're going to see the core count increase.

If there is a density increase (N4/N3) but not much of a bandwidth increase then maybe, but that would be shooting for corner-case performance improvements; more theoretical FLOPs than generally useful ones.
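The cluster arithmetic above can be sketched in a few lines. The "Extreme" tier and the +2-cores-per-cluster bump are assumptions from this discussion, not announced parts:

```python
# Hypothetical GPU core counts per tier. M1-generation clusters hold
# 8 cores; a node shrink *might* allow 10 per cluster (assumed).
CORES_PER_CLUSTER = {"M1 gen": 8, "hypothetical shrink": 10}

# Clusters per tier; "Extreme" (16 clusters) is speculative.
CLUSTERS = {"base": 1, "Pro": 2, "Max": 4, "Ultra": 8, "Extreme": 16}

for label, per_cluster in CORES_PER_CLUSTER.items():
    counts = {tier: n * per_cluster for tier, n in CLUSTERS.items()}
    print(label, counts)
```

With the assumed bump this reproduces the 10 / 20 / 40 / 80 / 160 progression; at the M1-era 8 cores per cluster, the same 16-cluster part would land at 128.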


But if Apple goes the "mixed chiplet" path (two "standard" Mn Max SoCs coupled with two Mn GPU-centric SoCs) you speak of in other posts, we might see something like...

24-core CPU (16P/8E)
200-core GPU (40+40+60+60)
1TB LPDDR5X SDRAM

The same reason the M1 didn't start off with LPDDR5 is likely the same reason the M2 doesn't start with LPDDR5X. Apple is quite likely going to want to use the same memory packages that the M2 Pro and Max use on any Ultra/Extreme SoC, to get economies of scale on semi-custom memory package prices.

Past 512GB, no ECC is pragmatically wrong. It is possible to do, but it has flawed priorities; Apple would likely lose as many customers as they hoped to gain with a stunt like that.

And 5-6 chiplets without shrinking down to TSMC N3... seems doubtful. It also likely has NUMA communication problems as a GPU.

Then imagine Apple comes up with a way to have ASi GPU/GPGPU add-in cards work with the whole UMA thing, maybe dual or quad SoC models, meaning 120-core & 240-core models...

The huge problem is that the more extremely exotic this SoC gets, the fewer they sell. The fewer they sell, the harder it is to justify the R&D that goes into these niche solutions. Apple is likely going to be doing demand suppression by keeping the mandatory RAM prices high.

At some point it would just be easier to let GPGPU compute accelerators back in via a software path instead of trying to build every SoC for everybody. Just leverage the general market, which has far more units to amortize over, even for the niche workloads (0.5% of 90% is still likely much bigger than the whole Mac Pro share of the PC/workstation market, let alone a narrow fraction of the Mac Pro market). "Embarrassingly parallel" computations are not as sensitive to UMA or not, because usually the data can be separated or replicated, or both, with little aggregate impact. (There is nothing in the TOP500 supercomputer line-up that is a strict UMA architecture. Simulated UMA, but not physical.)
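A toy sketch of why embarrassingly parallel work is insensitive to UMA: the input is partitioned up front so each worker touches only its own slice; on real hardware each slice could live in a separate accelerator's local memory. (The function names and chunking scheme here are invented for illustration, and threads stand in for separate devices.)

```python
from concurrent.futures import ThreadPoolExecutor

def work(chunk):
    # Each worker reads only its own slice -- no cross-worker traffic,
    # which is why such jobs don't care whether memory is UMA or NUMA.
    return sum(x * x for x in chunk)

def parallel_sum_squares(data, workers=4):
    # The "separate or replicate" step: partition the input up front.
    size = (len(data) + workers - 1) // workers
    chunks = [data[i:i + size] for i in range(0, len(data), size)]
    with ThreadPoolExecutor(max_workers=workers) as pool:
        # Only a cheap reduction crosses worker boundaries at the end.
        return sum(pool.map(work, chunks))
```

The only communication is the final reduction, so the aggregate result matches a serial pass regardless of where each chunk physically lives.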

Apple just isn't going to have some 7900/4090 'killer' GPU any time soon.


Because of the dance that the display controllers and GPU cores have to do to keep up with the real-time requirements of delivering data to large-resolution displays, UMA is more so UA-UMA: Uniform Access - Unified Memory Architecture. Just having everything in the same 'unified' address space isn't going to help if you can't deliver data on time. Two GPU tiles are hard. AMD isn't doing it for the 7000 line-up (the memory controllers get split off, but the GPU cores and main infrastructure do not, and the mid-range models are all monolithic). Intel hasn't made it work. Nvidia isn't even trying.

You are trying to put this off onto a whole separate card with supposedly imperceptible NUMA impacts? Probably not going to happen. Apple could introduce a NUMA Apple GPU driver model, but once again it would have to wait for applications to port to it, and that code wouldn't be highly portable to the rest of the Apple GPU ecosystem (so very limited impact, or return on investment, for the app developers).
 
  • Like
Reactions: Boil

Boil

macrumors 68040
Oct 23, 2018
3,283
2,899
Stargate Command
I kinda see two things the GPU does; display & render...

And when looking at benchmarks, it is usually focused on render times...

So if Apple can have a solid display GPU (the actual in the SoC GPU cores) for a fluid workflow, but also offer add-in GPU/GPGPUs for the actual render work...?
 

throAU

macrumors G3
Feb 13, 2012
8,944
7,103
Perth, Western Australia
My whole point is the new Mac Pro should also be upgradable, and as explained before, VM Windows is not the same as booting Windows; it can't use all the system RAM or CPUs in VM-booted Windows. And goofy or not, I do boot Windows, as not all macOS software is best for everything, simple as that; for a start it won't talk to our RIPs.

That's what you want. Most new Mac Pro owners want something optimised for the workload they run on it, which these days is predominantly high-end video work.

Not sure if you've run VMs on Apple Silicon or not, but I have (and have been running VMware Workstation on PCs since v1.0, run several vSphere clusters, etc.) and the experience is virtually the same as native.

Even some windows laptop people are saying that running ARM windows in a Parallels VM on Apple silicon is in some ways the best windows portable experience due to performance/battery life and driver reasons.

Yes, Intel Macs running Windows in a VM sucked. This is no longer the case on M-series; performance is stellar.
 
  • Like
Reactions: AlphaCentauri

mattspace

macrumors 68040
Jun 5, 2013
3,179
2,879
Australia
So if Apple can have a solid display GPU (the actual in the SoC GPU cores) for a fluid workflow, but also offer add-in GPU/GPGPUs for the actual render work...?

Nope. If I have to buy a whole new computer to run more displays, or a larger display, or there's an absolute ceiling in the single digits on the number of displays it can drive, then the machine doesn't get considered.
 

mode11

macrumors 65816
Jul 14, 2015
1,318
984
London
similar solid-state failure rates (not being high)
[From post #32, referencing GPU failures]. Mac products have been littered with GPU failures - including the 2013 MP, but also numerous MBPs and iMacs.

Extremely unlikely going to happen as macOS doesn't support the number of threads that the latest generation Threadripper provision.
macOS will obviously never support Threadripper, but is this thread limit in macOS some sort of hard limit? Why couldn't the macOS kernel just be developed to support more threads in future (if necessary)?

VMs are reliable. What they don't work so well for are applications that do extremely quirky things with the hardware at low levels
Historically, they've been terrible for leveraging GPU performance, typically only supporting a basic subset of features. This has made them unsuitable for stuff like 3D modelling and gaming. Pass-through techniques do seem to be changing this though.

The notion that 'nobody' wants this is deeply flawed as more folks have iOS/Android at this point than have mac/Windows.
Not sure that's particularly relevant. Sure, smartphones are very popular, but the ability to run apps designed for a small touchscreen on a desktop (or laptop) monitor doesn't seem like something that's generating lots of buzz.
 

fiatlux

macrumors 6502
Dec 5, 2007
351
139
Regarding upgradability, I have used a fair number of workstations in my professional life, and I don't remember any having been upgraded significantly. The CPU, never, I don't think we ever changed a GPU either. Memory, sometimes, storage more often but usually in external enclosures, big storage being pushed to the SAN.

What I mean is that we usually dimensioned the machines with a fair bit of margin and then used those to the max. Maintainability was more important than upgradability but we had a fairly expensive support contract with next-day on-site intervention so we did not really care if we got a part or a whole machine replaced, as long as we could keep working.

Don't get me wrong, I loved tinkering with my own MP 4,1 flashed to 5,1 and it is amazing how much life I got out of it, with CPU and GPU upgrades. I just don't think that's the way most pros manage their IT. I believe it is more important for Apple to release Mac Pro machines that truly deliver for their target market (pro audio, video and photo?) than release upgradable machines that may or may not be competitive for other use cases.
 

deconstruct60

macrumors G5
Mar 10, 2009
12,309
3,902
[From post #32, referencing GPU failures]. Mac products have been littered with GPU failures - including the 2013 MP, but also numerous MBPs and iMacs.

My comments mentioned data about components failing inside systems designed around the nominal parameters for the chips. People can always stuff components into systems the parts were not designed for; saying "well, that fails" is a misdirection. Of course a chip will fail if you hit it with a lightning strike while it sits on an exposed power circuit (chips sitting behind a surge protector do not have anywhere near the same failure rate).

The Mac Pro 2013 needed a far more advanced, higher-level thermal control system for that level of thermal coupling. Apple didn't do that, so they got failures; as much from degraded components on the board as from the chips themselves (similar to the failed series of Nvidia GPUs years earlier, where Nvidia got the specs wrong and lots of chip packages had problematic logic-board connections, which resulted in failures).

The vast majority of x86 Macs sold came with Intel iGPUs. There were no huge failure rates there at all. The later-era (500-series (Polaris) onward) AMD dGPUs... basically the same thing.




MacOS will obviously never support Threadripper, but is this thread limit in macOS some sort of hard limit? Why couldn't the macOS kernel just be developed to support more threads in future (if necessary)?

The core of the macOS kernel is shared with iOS, iPadOS, tvOS, etc. What on earth do those systems need with a >64-thread operating system? Nothing.

Windows has a semi-forked kernel for Windows Server (and high-core-count Windows Workstation). Apple just killed macOS Server (which wasn't even a separate kernel anyway). Windows Server and Linux have a substantively larger server market to service; the super-high-core-count workstation coverage is a relatively 'free' side effect of that.

Besides that, Apple largely doesn't need it going forward. Dispatching to non-general-purpose cores can get lots of parallel computation done. The CPU core complexes in modern Apple Silicon have an AMX co-processor that does matrix multiplies extremely efficiently, so one CPU core (thread) dispatches to that co-processor and it does the work of several general cores. There is zero pressing need there for the OS to track/schedule more than one thread to get that work done. Similarly, the GPU has 10-100x as many cores as there are CPU cores, and the OS doesn't schedule/track those cores the same way as CPU cores. NPU cores: same issue (4 * 16 NPU cores would be 64 cores all by themselves; impact on basic kernel thread scheduling... about zero).
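The one-thread-dispatches-wide pattern looks like this in practice. The example assumes NumPy is available; the co-processor routing is platform behavior (a NumPy linked against Apple's Accelerate framework hands the multiply to its BLAS backend), not something visible in the code itself:

```python
import numpy as np

# One thread, one call: the multiply below is handed to the BLAS
# backend NumPy was built against. On Apple Silicon with Accelerate,
# that path runs on the AMX units -- the kernel schedules only this
# single calling thread, yet wide co-processor hardware does the work.
a = np.arange(6, dtype=np.float64).reshape(2, 3)
b = np.arange(6, dtype=np.float64).reshape(3, 2)
c = a @ b  # no thread pool in sight from the OS scheduler's viewpoint
print(c)
```

The same logic applies to GPU and NPU dispatch: the scheduler tracks one submitting thread, not the hundreds of execution units behind the API call.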

Apple largely sells single-user systems. Most of the multi-core workload is single apps dispatching multiple threads to do 'divide and conquer' computational jobs. If you match the specialized co-processors up with those kinds of computational loads, you take away much of the motivation to expand thread scheduling past 64.

There is a narrow subset of apps that are CPU-only in computation and benefit from >64 cores/threads. But is that really the mainstream of Mac or even Mac Pro users? Probably not. There are some narrow edge-case users Apple would likely lose, but a fair number of them would have already left for Threadripper even after the Mac Pro 2019 was announced. That really isn't new, nor something Apple thought was extremely critical (e.g., the MP still without a dual CPU package through 2017-2020).

For truly multi-user loads (64 users running 64 different application states), there is far less traction for keeping the kernel thread scheduling below 64 threads. But the major 'server' business Macs are in is renting Macs in the cloud to individuals (entities or persons); that is how macOS is licensed. There are no legal, large-scale multi-tenant workloads to match a >64-core server processor to (unlike Linux and other server OS options).

Is it a huge technical problem? No. Do you have to somewhat fork the kernel? Yes. You get data-structure growth (bloat, from the perspective of an iOS device) going past a 64-bit vector to track the active threads for scheduling, and that change propagates through the kernel. Scheduling also gets trickier to do in real time (with time constraints) because there is more to schedule.
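A toy sketch of the data-structure point: if the scheduler tracks runnable entries in a single 64-bit word, membership updates are one machine-word operation; past 64, it becomes an array of words, and every kernel path that touches the mask changes with it. (This is an illustrative simplification with invented names, not XNU's actual scheduler code.)

```python
WORD_BITS = 64

class RunnableMask:
    """Toy runnable-thread bitmap. Up to 64 entries fit in one machine
    word; past that, every operation must walk an array of words.
    Purely illustrative -- not how XNU actually tracks threads."""

    def __init__(self, nthreads):
        nwords = (nthreads + WORD_BITS - 1) // WORD_BITS
        self.words = [0] * nwords  # 1 word is cheap; >1 grows every user

    def set_runnable(self, tid):
        # Single-word case compiles to one OR; multi-word needs indexing.
        self.words[tid // WORD_BITS] |= 1 << (tid % WORD_BITS)

    def runnable_count(self):
        return sum(bin(w).count("1") for w in self.words)

m64 = RunnableMask(64)    # the whole set fits in one 64-bit word
m128 = RunnableMask(128)  # now two words -- the structure has "grown"
```

Crossing the one-word boundary is exactly the propagation cost described above: nothing conceptually hard, but every consumer of the bitmap has to change.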

It is technically possible for Apple to write a CUDA clone and release it for macOS. Are they likely to do it? No. Same for a DirectX 12 clone. >64 threads is in basically the same boat: there is nothing in Apple's overall strategic interest to justify that extra software development workload. I wouldn't hold my breath waiting on that 'technically possible' software to show up either.



Historically, they've been terrible for leveraging GPU performance, typically only supporting a basic subset of features. This has made them unsuitable for stuff like 3D modelling and gaming. Pass-through techniques do seem to be changing this though.

Historically macOS has pointed developers at multiple GPUs from multiple GPU vendors, so the optimization time budget is split over multiple GPU types. That works better in the Windows market because there is a larger number of possible software sales (unit volumes) to amortize bigger optimization expenditures over. The Mac Pro is like 1-2% of Apple's sub-10% share of that overall market (a very small niche of an already small subset).

This is going to take transition time though. Apple needs developers to write new code. Metal being a 'thin' API pushes even more responsibility over to the developers' side, which means even more work for them to do most of the adjustments. However, similar to the game console market, you can get very good "bang for the buck" after all the narrow-range, hardware-specific optimizations are done. It is a long-term bet that may or may not pay off over a couple of years (no short-term quick fixes here).



Not sure that's particularly relevant. Sure, smartphones are very popular, but the ability to run apps designed for a small touchscreen on a desktop (or laptop) monitor doesn't seem like something that's generating lots of buzz.

Way back in the day, the classic Mac OS had "desk accessory" apps. They didn't have to be the same as a mainstream, full-screen-consuming app to be useful. Where the GUI is a huge stumbling-block mismatch, those apps can be marked not to deploy to the Mac App Store. Some will; more of the iPadOS-optimized ones will work.


What software people have already licensed ('bought' access to) is very relevant. Inertia makes a substantive, real difference in the overall market; it deeply impacts which apps people are going to pick: stuff they already use every day, or something new. Given the choice, most folks will want to use what they are already well down the learning curve on.

Same baseline reason why Apple put a ton of effort into Rosetta 2. Most folks already had x86_64 apps on macOS; giving those users easy access to the apps they already use only promotes adoption. If Apple had said "throw out your Mac apps and get new ones", the macOS-on-Apple-Silicon adoption rate would have been substantially slower.

Back in the 2003-06 era the most common "personal computer" OS was Windows. Pragmatically now, though, iOS/Android is the primary "personal computing device" OS for most folks worldwide. Viewing it from the Mac Pro-user-only viewpoint will miss the big picture. During most quarterly calls, Apple will comment on how many new Mac users they drew in to support growth; not all of that growth is coming from Windows at this point.

Apple doesn't try to make everything for everybody. There is always a subset of users migrating out of the ecosystem.

Are "Handoff" and "Universal Control" super critical for all niche Mac Pro users? No. Do they strengthen the overall ecosystem? Yes. Are there cases where it would be handy to pull all that onto one screen? Yes (it is why Messages can fold data onto a single main screen).
 
  • Like
Reactions: Boil

throAU

macrumors G3
Feb 13, 2012
8,944
7,103
Perth, Western Australia
Regarding upgradability, I have used a fair number of workstations in my professional life, and I don't remember any having been upgraded significantly. The CPU, never, I don't think we ever changed a GPU either. Memory, sometimes, storage more often but usually in external enclosures, big storage being pushed to the SAN.

What I mean is that we usually dimensioned the machines with a fair bit of margin and then used those to the max. Maintainability was more important than upgradability but we had a fairly expensive support contract with next-day on-site intervention so we did not really care if we got a part or a whole machine replaced, as long as we could keep working.

Don't get me wrong, I loved tinkering with my own MP 4,1 flashed to 5,1 and it is amazing how much life I got out of it, with CPU and GPU upgrades. I just don't think that's the way most pros manage their IT. I believe it is more important for Apple to release Mac Pro machines that truly deliver for their target market (pro audio, video and photo?) than release upgradable machines that may or may not be competitive for other use cases.

This is typically how big workstations are purchased and used.

By the time you might want to upgrade stuff in them, there is typically a new, better CPU, motherboard, memory standard, etc., and the entire box is replaced.

Otherwise you bought the wrong spec in the first place.

And yes storage is generally either added externally or accessed over the network via a SAN.
 
  • Like
Reactions: AlphaCentauri

Boil

macrumors 68040
Oct 23, 2018
3,283
2,899
Stargate Command
This is going to take transition time though. Apple needs developers to write new code. Metal being a 'thin' API pushes even more responsibility over to the developers' side, which means even more work for them to do most of the adjustments. However, similar to the game console market, you can get very good "bang for the buck" after all the narrow-range, hardware-specific optimizations are done. It is a long-term bet that may or may not pay off over a couple of years (no short-term quick fixes here).

Maybe that is the reason Apple is working so hard to get a "Full Metal" variant of Blender; a showcase for how the software can perform when properly optimized for ASi/Metal...?

Rumor is that there are some on the Blender forums who feel Apple may bring real-time ray-tracing to the Blender viewport, so that seems pretty positive...?

Come on Apple...! Just give us a preview of the ASi Mac Pro already...! ;^p
 