
CWallace

macrumors G5
Aug 17, 2007
12,159
10,925
Seattle, WA
In the past when I read reviews of the Mac Studio, it was my impression the performance improvement of an Ultra over a Max version of a chip was very modest... If two Ultras get put together to make an Extreme version of a chip, is there any basis to believe the Extreme's 'step up' in performance over an Ultra will be much greater than an Ultra's over a Max?

Apple could design the Ultra as a true "workstation class" SoC like the Intel Xeon or AMD Epyc so that it would scale more linearly in a dual-SoC configuration than the current Ultra, which is two "consumer" SoCs lashed together.
 
  • Like
Reactions: drrich2

citysnaps

macrumors G5
Oct 10, 2011
12,023
26,060
Hardly a failure, just an expensive niche.

There's a diminishing curve of users to consider here.

The number of users who need 500GB of RAM is really small.

The number of those users who need 1000GB is a fraction of those people.

Even that fraction is huge compared to the puny number of users who need 1.5TB of RAM.

Over 500GB, we quickly hit such a small market that Apple might be right not to care about them.

Apple didn't really need to support 1.5 TB of RAM, and I suspect they regret offering it now. It was just free from Intel, and allowed them to get a few whales to spend a ton of money. Apple never cared about users who needed 1.5 TB of memory.

Spot on. The Mac Pro will continue to do well for those who need PCIe slots and a robust power supply in a variety of scientific, industrial, defense, professional music creation/production, etc. disciplines. Especially in its rack-mount configuration.
 

theluggage

macrumors 604
Jul 29, 2011
7,589
7,689
It’s incredible how much Apple was able to lower the price of the M series Mac Pro from what a maxed out version of the Intel Mac Pro used to cost:

Not a good comparison. That $52k price for a 2019 Mac Pro maxed out everything, including options simply not available on the 2023 model: 1.5TB of RAM costing $25k (the 2023 tops out at 192GB) and $9,600 worth of GPUs. I'm not mentioning Afterburner cards since the M2 Ultra's media engines make those obsolete. The M2 Ultra probably does thrash the Xeon W-3275M on CPU power, but that "M" suffix doubled the price of the processor (ISTR the processor alone was about $7k) purely for the sake of supporting more than 1TB of RAM.
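Back-of-the-envelope with the figures above (rough, and ignoring the CPU premium):

$$ \$52\,000 \;-\; \underbrace{\$25\,000}_{\text{1.5 TB RAM}} \;-\; \underbrace{\$9\,600}_{\text{GPUs}} \;\approx\; \$17\,400 $$

So most of the headline gap is options the 2023 machine simply doesn't offer, before you even count the "M" processor premium.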

The $7000 2023 Mac Pro is "just" a $4000 Studio Ultra in a big, grotesquely over-engineered box with some (empty) PCIe slots - that can't take GPUs - driven by the surplus SSD controller on the second M2 Ultra die. If you need fast internal PCIe SSDs, specialist networking or AV production cards, then those slots offer far more bandwidth than an external Thunderbolt enclosure hung off a Studio.
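For rough scale, assuming a gen 4 x16 slot in the 2023 Mac Pro and Thunderbolt 4's ~32 Gb/s PCIe tunnel (both figures approximate):

$$ \underbrace{16 \times 1.97\ \text{GB/s}}_{\text{PCIe 4.0 x16}} \approx 31.5\ \text{GB/s} \qquad \text{vs.} \qquad \underbrace{32\ \text{Gb/s} \div 8}_{\text{TB4 PCIe tunnel}} = 4\ \text{GB/s} $$

Roughly an order of magnitude in favour of the slot, for a card that can actually use the lanes.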

If you don't use non-GPU PCIe cards, you don't need the 2023 Mac Pro - it's partly expensive because it's niche (apart from the $400 wheels, which are just Apple having a laugh).
 

nathansz

macrumors 65816
Jul 24, 2017
1,279
1,457
It’s incredible how much Apple was able to lower the price of the M series Mac Pro from what a maxed out version of the Intel Mac Pro used to cost:

That’s a terrible comparison

A maxed-out Intel Mac Pro was a significantly more powerful, useful, and upgradable computer than any other Mac available at the time

The Apple Silicon Mac Pro is just a Studio with PCIe slots in a big box, for more money
 
Last edited:

theluggage

macrumors 604
Jul 29, 2011
7,589
7,689
Mac Pro should:
  • Support GPU cards
  • More than 2 SSD slots, at least 4 but ideally even more
So far, the Apple Silicon concept has hinged on having the GPU and specialist "engines" on chip sharing unified RAM. It does best in Apple-Silicon-optimised applications that benefit more from the GPU having direct, high-bandwidth access to tens of gigabytes of fast RAM than a faster GPU or more non-Unified RAM. U-turning on GPUs would throw away the Unified RAM advantage - and would end up with much the same performance as PCs using the same AMD GPUs (and many of the people asking for GPUs really want NVIDIA which probably ain't gonna happen for political reasons).
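To make the unified-RAM point concrete, here's a minimal Metal sketch (the "scale" kernel and the buffer size are placeholders, not anyone's real workflow): a single .storageModeShared buffer is directly visible to both the CPU and the GPU, so even a very large working set never gets staged across a PCIe bus the way it would with a discrete card.

```swift
import Metal

// Minimal sketch of what unified memory buys you on Apple Silicon: one
// MTLBuffer with .storageModeShared is visible to both CPU and GPU, so a
// large working set is never copied across a PCIe bus.
// The "scale" kernel and the element count are placeholders.
let kernelSource = """
#include <metal_stdlib>
using namespace metal;
kernel void scale(device float *data [[buffer(0)]],
                  uint id [[thread_position_in_grid]]) {
    data[id] *= 2.0;
}
"""

let device = MTLCreateSystemDefaultDevice()!
let queue = device.makeCommandQueue()!
let library = try! device.makeLibrary(source: kernelSource, options: nil)
let pipeline = try! device.makeComputePipelineState(function: library.makeFunction(name: "scale")!)

let count = 1_000_000
// Shared storage: the CPU pointer below and the GPU address the same pages.
let buffer = device.makeBuffer(length: count * MemoryLayout<Float>.stride,
                               options: .storageModeShared)!

// CPU writes directly into the buffer...
let values = buffer.contents().bindMemory(to: Float.self, capacity: count)
for i in 0..<count { values[i] = Float(i) }

// ...and the GPU reads/writes the very same memory, with no blit or copy.
let commandBuffer = queue.makeCommandBuffer()!
let encoder = commandBuffer.makeComputeCommandEncoder()!
encoder.setComputePipelineState(pipeline)
encoder.setBuffer(buffer, offset: 0, index: 0)
encoder.dispatchThreads(MTLSize(width: count, height: 1, depth: 1),
                        threadsPerThreadgroup: MTLSize(width: 256, height: 1, depth: 1))
encoder.endEncoding()
commandBuffer.commit()
commandBuffer.waitUntilCompleted()

print(values[10])  // 20.0: the CPU sees the GPU's result with no copy-back
```

With a discrete GPU the same job needs an explicit upload of the buffer to VRAM and a copy back afterwards, which is exactly the traffic unified memory avoids.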

If you want a big box'o'slots filled with AMD/NVIDIA GPUs for standard CUDA/OpenCL/OpenGL/whatever workflows, then frankly the M-series isn't the tool for the job compared to a Xeon or Threadripper box - and I don't think the powerful personal workstation is a growth area that would justify Apple developing a more conventional processor. It's worth looking at what NVIDIA is doing for high-end AI applications - which has taken a somewhat Apple Silicon-like direction - and there are rumors that Apple are looking at a high-end AI-focussed chip.

As for SSDs - if you have as many PCIe slots as the 2023 Mac Pro, and no GPUs to fill them with, you can add PCIe-to-M.2 SSD cards until the cows come home.
 

v3rlon

macrumors 6502a
Sep 19, 2014
897
713
Earth (usually)
What exactly can you put in a Mac Pro's PCI slot anyway? What can it do that a Studio Ultra can't?
There are a LOT of industrial applications.
Special I/O cards for controlling manufacturing equipment or reading data from same.
When you look at things like scanning electron microscopes, you see big equipment. When you open them up, there are several computers inside, often with multiple components that control different parts of the tool.

Very rarely do you see any of this controlled by USB except the keyboard and mouse. When you need 3nm accuracy, that USB-C doesn't look nearly as sexy as PCIe connections. When you need ultra-precise control of timing, again PCIe wins. If you need to control more than 255 parts, PCIe to the rescue. This may seem like a niche to you, but I guarantee you the money is good.

Show up at a vendor with $9 million to ask for something 'pretty good' (not the latest and greatest), and, because you are a loyal customer of a major international company, they will deliver it in 'only' 4 years - and that's if you put it under contract for 3 years at additional seven-figure costs annually - and you walk away feeling like you got a steal compared to the industry standard.

People who need THAT need PCIe.

I have tools at work with racks of 18 Mac Pros in them. And we weren't the only ones that bought those.
 

DavidSchaub

macrumors 6502
Jun 16, 2016
441
493
Mac Pro should:
  • Support GPU cards
  • More than 2 SSD slots, at least 4 but ideally even more
GPU cards are always possible, and the hardware probably supports them, but I doubt we'll get OS support any time soon.

As for SSD slots... If you're buying an $8000 computer, springing $30 for [linked product] seems pretty trivial.

I agree that Apple could have put some M.2 slots in there, but it is super easy to mitigate.
 
Last edited:
  • Like
Reactions: krakenrelease

PsykX

macrumors 68020
Sep 16, 2006
2,449
3,268
It’s incredible how much Apple was able to lower the price of the M series Mac Pro from what a maxed out version of the Intel Mac Pro used to cost:


Hopefully the price can continue to come down with the second-generation M-series Mac Pro, given all of the efficiencies that Apple will continue to improve upon in their manufacturing capabilities
It probably won't come down more than this; don't get your hopes up.
If anything, they should develop a better chip than the Ultra and jack the price up a little more.

But it's crazy that even a maxed-out Mac Pro 4x less expensive than ever before still makes absolutely no sense. The Mac Studio is $3K less if you max it out, which makes much more sense. Oh, and you can even save an additional $1K by buying a Satechi hub and sticking an 8 TB SSD in it for half of Apple's price, if you really need it.
 
Last edited:
  • Like
Reactions: Justin Cymbal

citysnaps

macrumors G5
Oct 10, 2011
12,023
26,060
There are a LOT of industrial applications.
Special I/O cards for controlling manufacturing equipment or reading data from same.
When you look at things like scanning electron microscopes, you see big equipment. When you open them up, there are several computers inside, often with multiple components that control different parts of the tool.

Very rarely do you see any of this controlled by USB except the keyboard and mouse. When you need 3nm accuracy, that USB-C doesn't look nearly as sexy as PCIe connections. When you need ultra-precise control of timing, again PCIe wins. If you need to control more than 255 parts, PCIe to the rescue. This may seem like a niche to you, but I guarantee you the money is good.

Show up at a vendor with $9 million to ask for something 'pretty good' (not the latest and greatest), and, because you are a loyal customer of a major international company, they will deliver it in 'only' 4 years - and that's if you put it under contract for 3 years at additional seven-figure costs annually - and you walk away feeling like you got a steal compared to the industry standard.

People who need THAT need PCIe.

I have tools at work with racks of 18 Mac Pros in them. And we weren't the only ones that bought those.

Yes to all of the above. And with the Mac Pro rack-mount version.
 

Boil

macrumors 68040
Oct 23, 2018
3,294
2,916
Stargate Command
What I would like to see in the next Mac Pro:
  • Monolithic M4 Ultra - Up to 480GB LPDDR5X (inline ECC)
  • Two-die M4 Extreme - Up to 960GB LPDDR5X (inline ECC)
  • Four NAND blades (end-user configurable w/third-party options)
  • Thunderbolt 5
  • ASi Compute cards (target-able compute/render nodes)
And for those who do not need PCIe slots:
  • M4 Extreme Mac Pro Cube
 

profdraper

macrumors 6502
Jan 14, 2017
379
284
Brisbane, Australia
Didn't follow the Mac Pro too closely, but I think the two big issues I saw mentioned were limited RAM compared to the Intel versions, and no GPU support. Sounds like they're addressing the RAM. Hopefully skipping M3 is a sign that it was too late to fix GPU support in M3, but that they're addressing it for M4. Not sure whether that would be limited to just the highest end, or whether solving the issues involved opens up eGPU capabilities for the whole lineup.
I don't see the M-series Mac Pros ever supporting 3rd-party GPUs
 
  • Like
Reactions: citysnaps

maxoakland

macrumors 6502a
Oct 6, 2021
763
1,096
It’s weird that Apple struggles so much to deliver such an obvious product. It’s like they just refuse to give people what they actually want, and always have to do something weird or different that compromises the whole thing

In the case of the recent one, it’s not properly expandable. People were saying that it didn’t provide enough RAM for some advanced research and stuff like that
 

Nermal

Moderator
Staff member
Dec 7, 2002
20,682
4,115
New Zealand
If you want a big box'o'slots filled with AMD/NVIDIA GPUs for standard CUDA/OpenCL/OpenGL/whatever workflows, then frankly the M-series isn't the tool for the job compared to a Xeon or Threadripper box
Indeed, which is why a number of people were asking Apple to just release a new Intel-based Mac Pro. The OS already runs on Intel, the apps run on Intel, but Apple's bent on trying to shoehorn the "consumer" M chips in there instead.
 

maxoakland

macrumors 6502a
Oct 6, 2021
763
1,096
So far, the Apple Silicon concept has hinged on having the GPU and specialist "engines" on chip sharing unified RAM. It does best in Apple-Silicon-optimised applications that benefit more from the GPU having direct, high-bandwidth access to tens of gigabytes of fast RAM than a faster GPU or more non-Unified RAM. U-turning on GPUs would throw away the Unified RAM advantage - and would end up with much the same performance as PCs using the same AMD GPUs (and many of the people asking for GPUs really want NVIDIA which probably ain't gonna happen for political reasons).
Yeah, but the problem with that is Apple hasn’t been able to deliver the GPU power or the high amounts of RAM that some people need

Apple Silicon is great for most applications, but they could be permanently damaging their relationship with this market the same way they did with Final Cut Pro X, and that’s not easy to fix
 

krakenrelease

macrumors regular
Dec 3, 2020
121
118
There are a LOT of industrial applications.
Special I/O cards for controlling manufacturing equipment or reading data from same.
When you look at things like scanning electron microscopes, you see big equipment. When you open them up, there are several computers inside, often with multiple components that control different parts of the tool.

Very rarely do you see any of this controlled by USB except the keyboard and mouse. When you need 3nm accuracy, that USB-C doesn't look nearly as sexy as PCIe connections. When you need ultra-precise control of timing, again PCIe wins. If you need to control more than 255 parts, PCIe to the rescue. This may seem like a niche to you, but I guarantee you the money is good.

Show up at a vendor with $9 million to ask for something 'pretty good' (not the latest and greatest), and, because you are a loyal customer of a major international company, they will deliver it in 'only' 4 years - and that's if you put it under contract for 3 years at additional seven-figure costs annually - and you walk away feeling like you got a steal compared to the industry standard.

People who need THAT need PCIe.

I have tools at work with racks of 18 Mac Pros in them. And we weren't the only ones that bought those.
What industry are you in?
 

HobeSoundDarryl

macrumors G5
So far, the Apple Silicon concept has hinged on having the GPU and specialist "engines" on chip sharing unified RAM. It does best in Apple-Silicon-optimised applications that benefit more from the GPU having direct, high-bandwidth access to tens of gigabytes of fast RAM than a faster GPU or more non-Unified RAM. U-turning on GPUs would throw away the Unified RAM advantage - and would end up with much the same performance as PCs using the same AMD GPUs (and many of the people asking for GPUs really want NVIDIA which probably ain't gonna happen for political reasons).

Ummm, there is ANOTHER option and that is BOTH. No U-turn necessary. Leave Apple Silicon graphics as-is and also support optional graphics cards in slots. Those who believe there's something magical about Apple graphics don't lose a thing... including the ability to keep using Apple Silicon graphics exactly as they would now. Those who believe they need some third-party graphics could add those too.

Intel MacBook Pros used to have both options: Intel AND AMD graphics, and a user could even select which they wanted to use for a given task. This could be the same.

Owners of slot-based computers will sometimes fill those slots with multiple graphics cards, just like they may fill them with multiple audio cards, multiple memory cards, multiple storage cards, etc. A card is just a card. If it can be slotted, don't rule it out to protect some kind of "ours and only ours" turf. Otherwise, customers of pro towers SHOULD opt for PCs, whose makers are not deciding such things for their customers.

The change Apple would need to roll with is giving graphics card makers a way to ship Apple Silicon drivers for their cards, and/or building some such support themselves. But that's no great leap - just software.
 
Last edited:

bigglow

macrumors newbie
Apr 25, 2024
3
1
I wonder how many commenting here will actually buy any Mac Pro.

I like the fastest of the fast but pro desktops are such a niche these days.
 
  • Like
Reactions: videosoul

ZZ9pluralZalpha

macrumors 6502
May 28, 2014
262
400
A question for the “in the know” users on this thread—for applications demanding $X0,000 in specialty hardware, is it still standard practice to run all of that on a standard off-the-shelf operating system? I know a depressing number of scientific instruments just glue a bunch of third-party controllers together with some amateurish Windows app, thus creating insultingly high prices for janky user experiences, but surely the more software-savvy fields such as scientific computing research would be customizing Arch Linux before they’d even bought their third CUDA card?

…actually, if this isn’t the case, I’m not sure I want to know :oops:
 

DavidSchaub

macrumors 6502
Jun 16, 2016
441
493
It’s weird that Apple struggles so much to deliver such an obvious product. It’s like they just refuse to give people what they actually want, and always have to do something weird or different that compromises the whole thing

In the case of the recent one, it’s not properly expandable. People were saying that it didn’t provide enough RAM for some advanced research and stuff like that
There are investment and architectural tradeoffs that Apple is making.

Intel, AMD, and Nvidia can spend a lot of resources on high-performance computing, because they all have huge markets to sell into. Apple's market is so small that it must be quite an internal challenge to argue for burning a lot of money with limited ROI.

Had Apple gotten the M1 or M2 Quadra (4x M* Max) working, maybe most of this wouldn't have been as much of a problem.
 