
sirio76

macrumors 6502a
Mar 28, 2013
571
405
I agree that dual systems are not optimal for a typical workflow, but there are still a very few (very few!!!) users who may need that for some specific tasks. Also, while not ideal for a workstation, a dual Xeon option would be nice to have for the rackmount version.
 

Pro7913

Cancelled
Sep 28, 2019
345
102
I agree that dual systems are not optimal for a typical workflow, but there are still a very few (very few!!!) users who may need that for some specific tasks. Also, while not ideal for a workstation, a dual Xeon option would be nice to have for the rackmount version.

Both AMD and Intel don't support dual CPUs for workstations. They do have dual and multi-CPU configurations for Xeon SP and EPYC, but those are for servers, not workstations.
 

sirio76

macrumors 6502a
Mar 28, 2013
571
405
??? Here on planet Earth there are many workstations that support dual Xeons; just go to HP's or Dell's (or any other WS vendor's) website and you will find plenty of them. The fact that a specific CPU is conceived mainly for server use doesn't mean it can't be used successfully for other tasks.
 

Pro7913

Cancelled
Sep 28, 2019
345
102
??? Here on planet Earth there are many workstations that support dual Xeons; just go to HP's or Dell's (or any other WS vendor's) website and you will find plenty of them. The fact that a specific CPU is conceived mainly for server use doesn't mean it can't be used successfully for other tasks.

Because those use Xeon SP, NOT Xeon W. Since Apple cares only about their own software, they definitely won't use the Xeon SP series.
 

sirio76

macrumors 6502a
Mar 28, 2013
571
405
??? Nobody said that a dual Xeon W setup ever existed, so I don't get your point... it was obvious that people here were talking about the SP version. This topic is to discuss whether there's a chance Apple will build a dual CPU system in the future, and I can't see a reason why they couldn't do it like anybody else, if they wanted to.
The Xeon W (3200 series) has exactly the same socket, and you can literally put a (single) Xeon SP inside a 2019 MP and be sure Apple software will run on it. If for whatever reason Apple feels there's a market for a dual CPU WS like in the past, or for a dual CPU rackmount unit (used as a server, render node or whatever), then they will just release a slightly modified version to accommodate dual sockets. BTW, Apple still supports the old dual Xeon Mac Pros in their applications, as does other software, and quite a few workloads (including rendering, large simulations, etc.) benefit from a larger core count than the W series offers, so there's definitely a market for that, albeit a very, very small one.
 

Pro7913

Cancelled
Sep 28, 2019
345
102
Then why do HP/Lenovo/Dell/SuperMicro/… sell dual socket workstations?

They are in a different market. Apple cares only about their own professional software, like Final Cut Pro. Are you going to use a Mac Pro for 3D or rendering? Oh well, the Mac Pro doesn't even support Nvidia GPUs for that.
??? Nobody said that a dual Xeon W setup ever existed, so I don't get your point... it was obvious that people here were talking about the SP version. This topic is to discuss whether there's a chance Apple will build a dual CPU system in the future, and I can't see a reason why they couldn't do it like anybody else, if they wanted to.
The Xeon W (3200 series) has exactly the same socket, and you can literally put a (single) Xeon SP inside a 2019 MP and be sure Apple software will run on it. If for whatever reason Apple feels there's a market for a dual CPU WS like in the past, or for a dual CPU rackmount unit (used as a server, render node or whatever), then they will just release a slightly modified version to accommodate dual sockets. BTW, Apple still supports the old dual Xeon Mac Pros in their applications, as does other software, and quite a few workloads (including rendering, large simulations, etc.) benefit from a larger core count than the W series offers, so there's definitely a market for that, albeit a very, very small one.

I don't see any reason for a dual CPU Mac Pro, since a single CPU is powerful enough to work with. A 64-core EPYC is way cheaper and yet much more powerful than dual Xeon SPs.
 

AidenShaw

macrumors P6
Feb 8, 2003
18,667
4,676
The Peninsula
They are in a different market.
You said that AMD and Intel don't support dual CPUs for workstations - they do. It has nothing to do with the "market".

It's valid to say that Apple doesn't see a market for dual CPU Apple workstations, but it is a falsehood to say that AMD and Intel don't support dual socket workstations.
 
  • Like
Reactions: StellarVixen

Pro7913

Cancelled
Sep 28, 2019
345
102
You said that AMD and Intel don't support dual CPUs for workstations - they do. It has nothing to do with the "market".

It's valid to say that Apple doesn't see a market for dual CPU Apple workstations, but it is a falsehood to say that AMD and Intel don't support dual socket workstations.

AMD supports dual CPUs ONLY for server use. Tell me, does Threadripper support dual CPUs? NO.
 

StuAff

macrumors 6502
Aug 6, 2007
385
256
Portsmouth, UK
There are lots of dual EPYC workstations. Just because AMD calls it a server processor doesn't change that. Threadripper is a workstation processor according to AMD.com. Which must be news to all the gamers using TR...
 

danwells

macrumors 6502a
Apr 4, 2015
783
616
Xeon W in the Mac Pro context is socket-compatible with Xeon SP - there might be a minor firmware change needed... Confusingly, the name Xeon W is also used for the iMac Pro processor, which uses a smaller socket.

As far as the big socket goes, the Xeon W / Xeon SP divide is artificially introduced by Intel to charge (a lot) extra for the multi-socket server processors.
 
  • Like
Reactions: Zdigital2015

Flint Ironstag

macrumors 65816
Dec 1, 2013
1,330
743
Houston, TX USA
I don't see any reason for a dual CPU Mac Pro, since a single CPU is powerful enough to work with. A 64-core EPYC is way cheaper and yet much more powerful than dual Xeon SPs.
There are many reasons. You're choosing not to hear what people are telling you. There are plenty of people for whom a single CPU is not "powerful enough to work with". Every other workstation vendor on the market has figured that out. A simplified example:

64 cores @ 2GHz x 1 socket
32 cores @ 3GHz x 2 sockets (64 cores total)

Assuming your task utilizes all cores, which one will finish first?
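A back-of-the-envelope sketch of that comparison, assuming a perfectly parallel workload and ignoring NUMA and memory-bandwidth overhead (real jobs won't scale this cleanly); the workload size is an arbitrary placeholder:

[CODE]
# Toy model: time to finish a fixed, perfectly parallel workload.
# The figures are illustrative placeholders, not benchmarks.

def finish_time(work_ghz_seconds, sockets, cores, ghz):
    """Seconds to complete the work, spread evenly over all cores."""
    aggregate_ghz = sockets * cores * ghz
    return work_ghz_seconds / aggregate_ghz

WORK = 1_000_000  # arbitrary fixed workload, in GHz-seconds

single = finish_time(WORK, sockets=1, cores=64, ghz=2.0)  # 128 GHz aggregate
dual = finish_time(WORK, sockets=2, cores=32, ghz=3.0)    # 192 GHz aggregate

print(f"1 x 64c @ 2 GHz: {single:.0f} s")  # ~7813 s
print(f"2 x 32c @ 3 GHz: {dual:.0f} s")    # ~5208 s, about 1.5x faster
[/CODE]

Under that (generous) assumption the dual-socket box wins simply because 2 x 32 x 3 = 192 GHz of aggregate clock beats 64 x 2 = 128 GHz.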

This is not some edge case in tiny numbers. Go search eBay or Newegg (just for starters) for dual socket HP Z8xx and Z6xx machines, and whatever the equivalent Dell product is. There are tons of them.

I think it's a misstep to go "Pro" starting at $6k and not offer a dual socket option at the high end. They would sell and lease in acceptable numbers, I bet. Seems like the accountants won that battle.
 

shaunp

Cancelled
Nov 5, 2010
1,811
1,395
NFW.

With 28-core single-socket systems, dual sockets are now exclusively server territory. Workstation applications seldom scale to very many cores. The ones that are embarrassingly parallel (like rendering) are better suited to "scale out" farms than to "scale up" multi-socket workstations or servers.

Apple has shown how incompetent it is at making servers and enterprise storage; I can't see them making another go of it.

Not true. HP, IBM and Dell all make dual socket workstations. You are right about Apple being incompetent when it comes to building hardware though.
Whether Apple builds a dual-socket workstation is really up to the professional market and whether it buys enough Mac Pros to warrant a dual-socket model. The architecture could support it (probably with fewer add-in cards), and the case, PSU, etc. could support a motherboard variation with two CPU sockets.
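On the quoted claim above that workstation applications seldom scale to very many cores: Amdahl's law puts rough numbers on it. A minimal sketch, assuming an illustrative 90% parallel fraction (not a measured figure for any real application):

[CODE]
# Amdahl's law: speedup(n) = 1 / ((1 - p) + p / n), where p is the
# fraction of the program that parallelizes and n is the core count.

def speedup(p, n):
    return 1.0 / ((1.0 - p) + p / n)

P = 0.90  # assumed parallel fraction -- illustrative only
for cores in (28, 56):
    print(f"{cores} cores: {speedup(P, cores):.1f}x")
# 28 cores: 7.6x; 56 cores: 8.6x. Doubling the cores (one socket to two)
# buys only ~14% more speed unless the app is almost perfectly parallel.
[/CODE]

That is why the second socket mostly pays off for embarrassingly parallel jobs, and those are exactly the ones that can go to a farm instead.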
 

sirio76

macrumors 6502a
Mar 28, 2013
571
405
Are you going to use a Mac Pro for 3D or rendering? Oh well, the Mac Pro doesn't even support Nvidia GPUs for that.
This shows exactly how little you understand about what you are talking about. GPU rendering serves a relatively small market compared to CPU rendering (despite Nvidia's desperate marketing attempts telling you GPU renderers are 1000 times faster than CPUs). Name a single Hollywood blockbuster movie that used GPU for the final render. There is a reason why you won't see that anytime soon...
Like many other users, I'm going to render on my next machine, using mainly the CPU like I've done for the last 20 years (here is one of my scenes: https://forums.macrumors.com/threads/highway-robbery.2199405/post-27800842 ; there is no way you can render that on a GPU, since it lacks sufficient RAM and features). Even if GPU renderers suddenly became viable for every kind of scene, both Octane and Redshift (the biggest players among GPU engines) will support AMD cards.
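To put rough numbers on the RAM point, here is a minimal fits-in-memory sketch; every figure in it is an assumed placeholder, not a measurement of the linked scene:

[CODE]
# Back-of-the-envelope check of whether a heavy scene fits in GPU VRAM.
# All sizes are assumed placeholders, not data from a real scene.

GIB = 1024 ** 3

geometry = 200_000_000 * 36  # ~200M triangles x ~36 B (position/normal/UV)
textures = 80 * GIB          # a large uncompressed texture set
scene = geometry + textures  # ~87 GiB total

vram = 24 * GIB              # a high-end 24 GiB GPU
host_ram = 768 * GIB         # plausible workstation RAM (the MP goes to 1.5 TB)

print(f"scene ~ {scene / GIB:.0f} GiB")
print(f"fits in 24 GiB VRAM? {scene <= vram}")      # False
print(f"fits in 768 GiB RAM? {scene <= host_ram}")  # True
[/CODE]

Out-of-core GPU renderers can stream past VRAM, but the paging cost tends to eat the speed advantage, which is the point being made.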
So yes, we will use the MP for rendering, and the (eventual) lack of an Nvidia option doesn't matter. Also, Mac hardware is not only for running Apple software; that idea is just ridiculous. Apple actively works with other developers and supports other software, including Maxon, Adobe, OTOY, SideFX, The Foundry, Blackmagic, Autodesk... and, well, most other creative software producers.
 
  • Like
Reactions: thisisnotmyname

skippermonkey

macrumors 6502a
Jun 23, 2003
624
1,536
Bath, UK
To my knowledge, the only film VFX done on GPU was Josh Trank's awful Fantastic Four, back in 2015, which was clearly a marketing exercise for OTOY. Most VFX studios do use Nvidia cards, but only for fast or near-realtime previews of work. I've tried GPU rendering and it was a complete faff – flaky, a pain to set up, buggy, crashy and unreliable. Instead I just throw cores and time at it, plus a bit of denoising in C4D R21. This is my last bit of fun:


BTW, anyone buying a New Mac Pro gets a copy of Octane X: "The final and full commercial release of Octane X Enterprise Edition will be offered as a free license to customers purchasing the new Mac Pro." https://home.otoy.com/octane-x-wwdc2019/
 

Pro7913

Cancelled
Sep 28, 2019
345
102
This shows exactly how little you understand about what you are talking about. GPU rendering serves a relatively small market compared to CPU rendering (despite Nvidia's desperate marketing attempts telling you GPU renderers are 1000 times faster than CPUs). Name a single Hollywood blockbuster movie that used GPU for the final render. There is a reason why you won't see that anytime soon...
Like many other users, I'm going to render on my next machine, using mainly the CPU like I've done for the last 20 years (here is one of my scenes: https://forums.macrumors.com/threads/highway-robbery.2199405/post-27800842 ; there is no way you can render that on a GPU, since it lacks sufficient RAM and features). Even if GPU renderers suddenly became viable for every kind of scene, both Octane and Redshift (the biggest players among GPU engines) will support AMD cards.
So yes, we will use the MP for rendering, and the (eventual) lack of an Nvidia option doesn't matter. Also, Mac hardware is not only for running Apple software; that idea is just ridiculous. Apple actively works with other developers and supports other software, including Maxon, Adobe, OTOY, SideFX, The Foundry, Blackmagic, Autodesk... and, well, most other creative software producers.

That's purely wrong. How come a lot of Mac users left the Mac Pro? Because it doesn't support Nvidia GPUs. The Mac Pro is powerful ONLY for Apple's own software.
 

AidenShaw

macrumors P6
Feb 8, 2003
18,667
4,676
The Peninsula
Not true. HP, IBM and Dell all make dual socket workstations. You are right about Apple being incompetent when it comes to building hardware though.
You're right. I'm not sure what I was trying to say - obviously there are dual-socket workstations.

Since there is no real distinguishing line between workstations and servers, maybe I was referring to "workstation applications" (apps with a GUI) and "server applications" (mostly headless apps).
 

sirio76

macrumors 6502a
Mar 28, 2013
571
405
That's purely wrong. How come a lot of Mac users left the Mac Pro? Because it doesn't support Nvidia GPUs. The Mac Pro is powerful ONLY for Apple's own software.
That's the only argument you have? Why don't you try to answer my question? Anyway, it's clear that you never get tired of being wrong... so I'll stop arguing with you, since your lack of knowledge is embarrassing and I have better things to do with my time than trying to explain to you how things work in the real world.
 
  • Like
Reactions: thisisnotmyname

Pro7913

Cancelled
Sep 28, 2019
345
102
That's the only argument you have? Why don't you try to answer my question? Anyway, it's clear that you never get tired of being wrong... so I'll stop arguing with you, since your lack of knowledge is embarrassing and I have better things to do with my time than trying to explain to you how things work in the real world.

Then I guess you are not using a PC at all.
 

sirio76

macrumors 6502a
Mar 28, 2013
571
405
I have more PCs than Macs here; funnily, none of them has an Nvidia GPU, since I have no use for CUDA (I use Cinema4D, VRay, Adobe CC, Affinity, Nuke, 3D Coat, Photoscan, AutoCAD, FormZ, Marvelous Designer, ZBrush and many others). Again, it's clear that you have no real argument to support your claim, so let's stop this discussion here ;)
 

Flint Ironstag

macrumors 65816
Dec 1, 2013
1,330
743
Houston, TX USA
Apple has shown how incompetent it is at making servers and enterprise storage; I can't see them making another go of it.
Not sure what you're getting at here. The Xserve was 1U and didn't stretch to quad-socket, 5U configurations, but it handled its target market fine. I've deployed more than I can count over its product cycle and never had any issues. Redundant power supplies, dual-socket Xeons, ECC, RAID, PCIe, lights-out management, etc. Easy to work on, and parts were readily available. It was a proper 1U server that was a joy to look at and work with.

I'm sure they could have had their partners upgrade the Xserve RAID design, but they ceded that to Promise, who are still going strong in Mac enterprise storage.

I'm not seeing incompetence in those products.
 

Pro7913

Cancelled
Sep 28, 2019
345
102
I have more PCs than Macs here; funnily, none of them has an Nvidia GPU, since I have no use for CUDA (I use Cinema4D, VRay, Adobe CC, Affinity, Nuke, 3D Coat, Photoscan, AutoCAD, FormZ, Marvelous Designer, ZBrush and many others). Again, it's clear that you have no real argument to support your claim, so let's stop this discussion here ;)

You just proved your own lack of knowledge, since you don't even use an Nvidia GPU. Are you even aware that Nvidia took more than 70% of the market share while AMD has less than 20%?
 

thisisnotmyname

macrumors 68020
Oct 22, 2014
2,438
5,251
known but velocity indeterminate
You just proved your own lack of knowledge, since you don't even use an Nvidia GPU. Are you even aware that Nvidia took more than 70% of the market share while AMD has less than 20%?

I don't know Sirio, but I've seen his work and it's gorgeous. He's a professional; this is his livelihood. Quote market share stats all you like (without a citation, and with no indication that they are specific to the industry in question, we can only assume that's across all buyers of discrete graphics cards; at that point we may as well throw in the number of computers sold with onboard graphics and crown Intel the king of GPUs), but he's providing real-world data from the industry you were trying to hold up as your example. Pretty tough to take you seriously on this.
 