
ixxx69

macrumors 65816
Jul 31, 2009
1,295
878
United States
No. Metal (and Vulkan and the other latest 'next gen') graphics stacks are much more 'low level' than OpenGL is. There is now much more of a presumption that most applications are going to use another 'portable' stack on top of the lowest-level one (Apple's Foundation libraries, Qt, a gaming 'engine', a company's proprietary porting library, etc.).
Using Metal as a substitute for OpenGL requires very substantive rewrite work (or a port to a 3rd-party layer).

Similarly, OpenGL (and DirectX) already had shader languages attached to them. Metal having a shader language doesn't really replace/merge OpenCL. Again, folks can mutate their OpenCL solution to fit Metal's abilities, but it is a substantive change. (Khronos, which manages the development of OpenGL/Vulkan/OpenCL, does coordinate Vulkan and OpenCL so that compute aspects tend to get funneled through OpenCL, but it is a somewhat different approach.)
I don't even pretend to understand the intricacies of this, but I think your response to this is from a very technical standpoint... for the poster you replied to, the simple answer is "Metal is kind of like OpenCL & OpenGL merged". That's the way Wikipedia puts it...
Wikipedia said:
Metal is a low-level, low-overhead hardware-accelerated 3D graphic and compute shader application programming interface (API) developed by Apple Inc., and which debuted in iOS 8. Metal combines functions similar to OpenGL and OpenCL under one API. It is intended to bring to iOS, macOS, and tvOS apps some of the performance benefits of similar APIs on other platforms, such as Vulkan (which debuted in mid-February 2016) and DirectX 12.
Now that may be a very simplified way of putting it, but unless you're actually a programmer using these APIs, it's apt. And I don't think understanding the subtle technical differences lends further understanding to why the 2013 MP failed (though I'd be interested to hear otherwise).
It really wasn't a "bet the whole farm" in terms of actual capital investments.
Of course not capital... they bet the farm that the dual GPUs would propel the success of the 2013 Mac Pro. Was it the whole farm? Who cares, it's just a phrase. The point is that the dual GPUs were supposed to be the killer feature, and it didn't work out that way for a host of reasons.
There is also a decent chance that the "dual GPU" design had a Scrooge McDuck motivation behind it. In order to hit a higher volume of custom GPUs made, the limited number of folks who were going to buy one got two. Part of Apple didn't want to be in the video-card-making business, so cranking up the numbers settled that debate. You need software to leverage it too (which is also an investment; the Scrooge McDucks will moan about that too).
This is interesting... could be... but it would be kind of a weird approach to design a computer that needs custom GPUs, but because they know they can't sell enough Mac Pros with just a single GPU to make it worthwhile, they design it so each computer comes with two. Could be... but might be a stretch.
But they did do the iMac and iMac Pro. The iMac Pro is probably a bit overinflated, but that is as much air cover for the iMacs they are selling in the upper BTO range.
I can see my phrasing could be misconstrued, but what I was saying is that if they had taken the innards of the iMac and placed them in the form factor of the 2013 Mac Pro, it would have sold like hotcakes.
Doubtful that it is an xMac. The mini is far more a 'bone' thrown at many of the xMac folks. Not that it will keep most happy, but it is extremely doubtful Apple is going to spend all this time just to crank up the fratricide in the core iMac price zone.

Apple has been shifting to higher average selling prices, so they aren't going to replace the Mac Pro with a range of systems selling at a far lower average selling price. All of Apple's moves of late point in the opposite direction.
Of course doubtful, almost zero chance... hence the "super long odds"... but in the "real world" of computer usage, an argument can be made that it would make far more sense.

Apple is never going to sell high-volume Mac Pro workstation-class computers again... there's just not enough market for macOS running on workstation-class hardware. Probably 90% of the people who buy Mac Pros don't need it - they may mistakenly think they need it, or they may buy it because it's the best option (i.e. don't want an iMac). Sure, the profit margin is huge on Mac Pros... but if you get 1/4 the profit on an "xMac", but sell 10 or even 20 times as many units, you're making more profit.

And it can't possibly be about the "profit" at this point... again, MP will never be more than a rounding error on their bottom line. At this point, the 2019 MP is more of a strategic product for the health of the Mac ecosystem than making money off direct sales (of course it has to still be profitable... Apple doesn't do loss leaders).

Again, it's just not going to happen - Apple's design approach to every computer is to create a "solution" they don't think anyone else can offer like they can (some would say they make it different for the sake of making it different, to the detriment of the product... and that's a fair point a lot of the time).
 

namethisfile

macrumors 65816
Jan 17, 2008
1,186
168
I realize that this is a quote and not your words - but the MP6,1 thermal core is anything but balanced.



There are literally twice as many fins cooling the CPU as there are fins per GPU. Hmm, and the GPUs often burn out. ;)

It shouldn't matter if it looks like the CPU has more fins pointing towards it since all 3 components share the same triangular thermal core.

The part where the quote mentions 'balance' is probably related to total heat output, since the 'Thermal Core' probably has a total heat dissipation rating. So, let's say it is rated at 450 total watts. That means 450 divided by 3, which equals 150. That means each of the three components can only output 150 watts max to stay within the heat dissipation rating of the Thermal Core.

But, I don't exactly know what the total heat output/dissipation rating is for the 2013 Mac Pro's Thermal Core. One would assume it is adequate. Or, perhaps, given the GPU failures, it isn't?

If it wasn't/isn't, why has Apple neglected to redesign and/or address it?
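Just to put that budget arithmetic in one place, here is a toy Swift sketch; the 450 W rating and the 2:1:1 fin split are assumptions from this thread, not Apple numbers:

```swift
// Hypothetical numbers only -- Apple has never published a dissipation
// rating for the Thermal Core, so 450 W and the fin shares are assumptions.
let coreRatingWatts = 450.0
let componentCount = 3.0          // one CPU + two GPUs share the core

// Naive "balanced" split: every component gets an equal share.
let equalBudget = coreRatingWatts / componentCount   // 150 W each

// Split weighted by the fin area each component faces (the quote above
// claims the CPU side has roughly twice the fins per GPU side).
let finShares = ["CPU": 2.0, "GPU1": 1.0, "GPU2": 1.0]
let totalShares = finShares.values.reduce(0, +)
for (part, share) in finShares {
    let budget = coreRatingWatts * share / totalShares
    print("\(part): \(budget) W")   // CPU: 225 W, each GPU: 112.5 W
}
print("Equal split: \(equalBudget) W each")
```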

My own fail-safe method would be to add another fan at the bottom to create a push-pull config. It would make sure air is pushed up and out and prevent heat buildup from components that are outside the Thermal Core--the components facing out when opening the 2013 Mac Pro get minimal airflow and could potentially risk heat buildup.

[attached image: Image001.jpg]


I don't know. I am not an engineer, though.

I never understood why the fan was at the top. It would make sense, if there is only one fan, for it to be at the bottom to make sure any heat buildup from those outer-lying components on the opposite side of the Thermal Core is pushed out instead of pulled out...

Again, not a thermal engineer.

Also, perhaps there are dead zones in the design, like the center of the core, which may not get airflow at all, since fans like the one used have a dead zone at the center where the hub of the fan is.

Again, not a...

you got it!
I don't think there is one single cause for all of the failures. A sizable class, though, seems to be sustained workloads which light up one or both of the GPUs for relatively long periods of time along with moderate to high utilization of the CPU.

Too much coupling of heat sources, combined with not quite enough airflow and no overall 'balancing' global fallback mechanism, probably was/is a major contributing factor to those. It seems doubtful that Apple extensively torture-tested the MP 2013 model as a high-performance computational node as opposed to a single-user interactive "viewer" workstation.

Well, as I pointed out in my prior post (#102), I think the 2013 MP has a couple of obvious head-scratching design elements from a thermal perspective, even if one is not a thermal engineer.

There were software/firmware glitches too. Lots of stuff piled up, but the 'differentness' of the Mac Pro was an easy thing to point a finger at in the blame game.

It did look different. But, more than that, it seems like an aeronautical military NASA piece of kit. But, of course, just because it looks "advanced" doesn't mean it's "perfect."

Perhaps the biggest Achilles' heel of the 2013 Mac Pro is looking "perfect." Maybe, if anything can be learned from the 2013 Mac Pro, it is that nothing is "perfect." And, of course, Apple could have tweaked the design to make it really "perfect," but they didn't. So, in that sense, either it really was the "perfect" design, or Apple didn't feel like tweaking it because what people say on here might actually be true: Apple might have second- or third-guessed whether or not to continue the Mac Pro.

No. Pragmatically, Apple switched to Metal because there wasn't an "open standard" that was evolving quickly enough (and with the priorities Apple wanted emphasized). OpenGL had some previous development stages where it turned into more of an exercise in herding cats than a diligent, focused committee solving problems. So Apple did their own and didn't ask for consensus input on direction. Nvidia was primarily pushing CUDA (and slow-rolling OpenCL to allow CUDA to develop more traction/inertia). Microsoft as usual was not particularly interested in either OpenGL or OpenCL (they have alternatives to both to push, which muddies the water further). Google/Android was off hemming and hawing. Imagination Tech, Intel, and AMD played along with Apple's requests, but they apparently were also open to Metal (to keep the Apple checks coming).

It makes sense for Apple to create their own API. It streamlines and paves a path down the road they want to travel.

And, even though travelers need to get used to new routes, or a new destination, it is sometimes better to start sort of from scratch (but not really), and do it again, but hopefully do it better.


Similar to Thunderbolt, where Apple pitched the notion to Intel and they sprinted off getting something done without trying to gather a large group into some sort of standards body. Standards are funny things. It isn't just technical issues; the timing has to be right for them to work. Too many cooks in the kitchen in the extremely early stages can defeat consensus building. Too late in the process and others have already cooked up their own alternatives to participate well.

I don’t know. I feel like they need TB4 to be done, too. For that “modular” Mac Pro.

Apple probably felt they didn't have 'time' for all of that courtship stuff. iOS needed to move to a "next generation" graphics stack sooner rather than later, so Apple largely did their own, based on some of the factors being discussed as to what to do "next" for OpenGL, some common baseline approaches from the video gaming systems (PlayStation and Xbox/DirectX), and AMD's Mantle. (What was 'wrong' with OpenGL from the 3D library developer perspective was relatively well researched and discussed at that point.) In the 2012-2014 timeframe none of that was lining up for a solid, well-formed consensus, so Apple opted to just do it themselves (i.e., similar to the Microsoft model: leverage your market inertia to get other folks to row to your cadence/design dictations). Vulkan and OpenCL 2+ eventually arrived in 2015-2016, but that was probably at least two years too late for Apple.

My understanding is that OpenGL is slow. And, perhaps, bulky.

Isn’t Metal supposed to also streamline coding and make use of computational hardware resources that OpenGL can’t? Like, next-generation heterogeneous compute stuff?

Metal incrementally picked up some compute/computational aspects, but it isn't trying to be as "general usage" as OpenCL (it is a shader/shading language that has taken on a few more general compute aspects). That's an aspect where Apple is probably a bit behind the alternatives.

That might not be the point of Metal today. It could be a two-birds, one-stone thing: Metal was to make macOS more responsive, thus making Mojave compatible only with Metal GPUs, and it frees them from redoing OpenGL code just to bring it up to modernity….

No. Metal (and Vulkan and the other latest 'next gen') graphics stacks are much more 'low level' than OpenGL is. There is now much more of a presumption that most applications are going to use another 'portable' stack on top of the lowest-level one (Apple's Foundation libraries, Qt, a gaming 'engine', a company's proprietary porting library, etc.).

Using Metal as a substitute for OpenGL requires very substantive rewrite work (or a port to a 3rd-party layer).

I know. Metal is akin to DirectX 12 or Vulkan.
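For a concrete sense of what 'low level' means here, below is a rough Swift sketch of the host-side plumbing even a trivial Metal compute dispatch needs (the "scale" kernel name is made up, and error handling is collapsed); OpenGL's driver used to do most of this bookkeeping implicitly:

```swift
import Metal

// A rough sketch of the host-side plumbing Metal makes explicit.
let device = MTLCreateSystemDefaultDevice()!          // pick a GPU
let queue = device.makeCommandQueue()!                // command submission
let library = device.makeDefaultLibrary()!            // compiled .metal code
let function = library.makeFunction(name: "scale")!   // hypothetical kernel
let pipeline = try! device.makeComputePipelineState(function: function)

var input: [Float] = [1, 2, 3, 4]
let buffer = device.makeBuffer(bytes: &input,
                               length: input.count * MemoryLayout<Float>.stride,
                               options: .storageModeShared)!

let commands = queue.makeCommandBuffer()!             // record work...
let encoder = commands.makeComputeCommandEncoder()!
encoder.setComputePipelineState(pipeline)
encoder.setBuffer(buffer, offset: 0, index: 0)
encoder.dispatchThreads(MTLSize(width: input.count, height: 1, depth: 1),
                        threadsPerThreadgroup: MTLSize(width: 4, height: 1, depth: 1))
encoder.endEncoding()
commands.commit()                                     // ...then submit explicitly
commands.waitUntilCompleted()
```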


Similarly, OpenGL (and DirectX) already had shader languages attached to them. Metal having a shader language doesn't really replace/merge OpenCL. Again, folks can mutate their OpenCL solution to fit Metal's abilities, but it is a substantive change. (Khronos, which manages the development of OpenGL/Vulkan/OpenCL, does coordinate Vulkan and OpenCL so that compute aspects tend to get funneled through OpenCL, but it is a somewhat different approach.)

Yeah, so, Metal doesn’t replace OpenCL, but it sorta wants to get rid of OpenGL (I think). So, since Metal and OpenCL share similarities, something can be done in either one and it would be fine. Whereas, I think, OpenGL was too old and clunky, right?
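To illustrate the "mutate the kernels, rewrite the plumbing" point, here is a sketch of the same toy kernel in OpenCL C and in the Metal Shading Language (the "scale" kernel is invented). The kernel bodies translate almost mechanically; the host code around them, as in the earlier sketch, is what really changes:

```swift
// Side-by-side sketch: the same toy kernel in OpenCL C and in the Metal
// Shading Language, held here as Swift string constants for comparison.
let openCLKernel = """
__kernel void scale(__global float *data, const float factor) {
    size_t i = get_global_id(0);
    data[i] = data[i] * factor;
}
"""

let metalKernel = """
#include <metal_stdlib>
using namespace metal;

kernel void scale(device float *data [[buffer(0)]],
                  constant float &factor [[buffer(1)]],
                  uint i [[thread_position_in_grid]]) {
    data[i] = data[i] * factor;
}
"""
```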

GPU compute and classic GPU work all have to get along and 'share' the physical GPU, so the management has to be merged at some point (both can't do whatever they want whenever they want).

OpenGL/OpenCL versus Metal is pretty similar to putting a round peg in a square hole. For some apps the round peg is going to be smaller than the square hole, so it will fit with some small shim/scaffolding adjustments. For some apps the round peg is bigger than the square hole, and just hammering on the peg isn't going to make it fit. (When Apple just waves their hands and says "it is a simple port"... it is mostly that... hand waving.)

No, I am saying, OpenCL and Metal are both round and OpenGL is square…. wait… whatever… something like that….

I don't even pretend to understand the intricacies of this, but I think your response to this is from a very technical standpoint... for the poster you replied to, the simple answer is "Metal is kind of like OpenCL & OpenGL merged". That's the way Wikipedia puts it…

I understood it just fine.

Now that may be a very simplified way of putting it, but unless you're actually a programmer using these APIs, it's apt. And I don't think understanding the subtle technical differences lends further understanding to why the 2013 MP failed (though I'd be interested to hear otherwise).

Well, it could be that an OpenGL workload is not as extensive as, say, an OpenCL workload; thus, OpenCL-intensive tasks are cranking and lighting up all the compute cores in those GPUs, creating heat.

Since the 2013 Mac Pro with two GPUs is marketed as a compute unit, it might be used in scenarios where the CPU and both GPUs are lit up for OpenCL-intensive compute tasks for hours on end, or perhaps days. So, this is perhaps where he is coming from in relating graphics APIs to GPU failures.

Of course not capital... they bet the farm that the dual GPUs would propel the success of the 2013 Mac Pro. Was it the whole farm? Who cares, it's just a phrase. The point is that the dual GPUs were supposed to be the killer feature, and it didn't work out that way for a host of reasons.

Or, you can look at it as it worked too well, if we take the anecdotal evidence that the GPUs in them burned out.


This is interesting... could be... but it would be kind of a weird approach to design a computer that needs custom GPUs, but because they know they can't sell enough Mac Pros with just a single GPU to make it worthwhile, they design it so each computer comes with two. Could be... but might be a stretch.


I think the dual GPU thing is what made the 2013 Mac Pro the 2013 Mac Pro, right?


I can see my phrasing could be misconstrued, but what I was saying is that if they had taken the innards of the iMac and placed them in the form factor of the 2013 Mac Pro, it would have sold like hotcakes.


You mean, if they updated the 2013 Mac Pro?


Or, just put a 7700K CPU and a Radeon Pro 580X in a 2013 Mac Pro chassis?
 
Last edited:

flowrider

macrumors 604
Nov 23, 2012
7,244
2,967
IMHO, you folks are missing the whole point of why the nMP failed and the 5,1 cMP is such a GREAT machine. It's not only expandable, but it's easily upgradable. Who'd a-thunk back in 2009 that we'd have NVMe SSDs, such advances in GPUs, USB 3.1, etc.? And these things can be easily added to the cMP? And they all can be PUT inside, no desk full of peripherals. And, again, IMHO eGPUs make absolutely no sense. The cMP's design allows us to upgrade the machine and make it relevant longer. AND, Apple doesn't like that. The nMP is a DEAD END! And those silly dual GPUs, what genius thought up that configuration? And that silly closed design. In my estimation the nMP is a true piece of CRAP.

Lou
 

deconstruct60

macrumors G5
Mar 10, 2009
12,311
3,902
I realize that this is a quote and not your words - but the MP6,1 thermal core is anything but balanced.



There are literally twice as many fins cooling the CPU as there are fins per GPU. Hmm, and the GPUs often burn out. ;)

The core issue there is not the number of fins. It is surface area that counts. The problem lies more in the short fins (from this perspective) on the extreme left and right sides. Those have far less surface area (which really can't be seen in this straight-down view) than the ones in the middle. Part of the issue here is more Apple's OCD symmetry disorder. That triangle is placed so it fits in the 'bottom' half (relative to how the picture is oriented) of the circle.

A triangular core where the GPU sides are at least as large as the CPU side would have resulted in perhaps some more fins, and larger-area ones that were not at the 'outer edge' anymore. That would have helped. It also would have helped to just make the core taller. (Again, this can't be seen from the picture, but the core height is basically the board height. That's a mistake. They could have added a thermal sink buffer by making it just an inch taller all around.) With a bigger base circle to work with, they probably could have found room to put the SSD there, which would have decoupled a heat source off the back of the one GPU (and perhaps they could have put two SSD slots in). However, that probably would have left a decent chunk of 'empty space' on the non-triangle half of the circle and made a bigger overall device. (They could have spaced out the power supply a bit more, though.)

The way it is put together, it is in a sense balanced in how much bleed GPU1 (or GPU2) can leak over to the CPU side. If GPU1 is going full blast and GPU2 is mostly idle, the CPU has an 'escape' to the other set of fins (relative to the CPU side). Likewise, if GPU2 is at full blast and GPU1 is mostly idle, the CPU still has an escape. The issue is not how much is coming off the CPU side. It is how much surface area each GPU has access to (you can get more area by going higher as well as by adding more fins). As long as you stick with the triangle, you're going to get bleed all the way across to some extent.

A bigger circle could have led to a bigger fan, which could have moved more air. Again, if you move enough air over the given surface area, you'll get cooling. Part of the issue was that there was less blower flow over those larger-surface-area fins, since they were using the middle of the fan largely to broadcast Wi-Fi. A fan shield dome would have been better: either Wi-Fi at the top of the "dome", or Wi-Fi built into the dome pillars themselves, and then more air pulled straight up through the center of the core.

Instead of trying to shoot for a smaller-than-Mac-Mini footprint (7x7 inches), if they had gone 7-8" wide and 10-11" high (to keep the same proportions) they would have had more slack in max thermal capacity without really being all that much bigger. (It would have been substantively heavier.)

It would have still diverged from the GPU hardware and software track over the next couple of years (given the same limited, AMD-only solution options), but it would have had a better ability to deal with overages from some workloads unanticipated by Apple.
 

namethisfile

macrumors 65816
Jan 17, 2008
1,186
168
The core issue there is not the number of fins. It is surface area that counts. The problem lies more in the short fins (from this perspective) on the extreme left and right sides. Those have far less surface area (which really can't be seen in this straight-down view) than the ones in the middle. Part of the issue here is more Apple's OCD symmetry disorder. That triangle is placed so it fits in the 'bottom' half (relative to how the picture is oriented) of the circle.

A triangular core where the GPU sides are at least as large as the CPU side would have resulted in perhaps some more fins, and larger-area ones that were not at the 'outer edge' anymore. That would have helped. It also would have helped to just make the core taller. (Again, this can't be seen from the picture, but the core height is basically the board height. That's a mistake. They could have added a thermal sink buffer by making it just an inch taller all around.) With a bigger base circle to work with, they probably could have found room to put the SSD there, which would have decoupled a heat source off the back of the one GPU (and perhaps they could have put two SSD slots in). However, that probably would have left a decent chunk of 'empty space' on the non-triangle half of the circle and made a bigger overall device. (They could have spaced out the power supply a bit more, though.)

The way it is put together, it is in a sense balanced in how much bleed GPU1 (or GPU2) can leak over to the CPU side. If GPU1 is going full blast and GPU2 is mostly idle, the CPU has an 'escape' to the other set of fins (relative to the CPU side). Likewise, if GPU2 is at full blast and GPU1 is mostly idle, the CPU still has an escape. The issue is not how much is coming off the CPU side. It is how much surface area each GPU has access to (you can get more area by going higher as well as by adding more fins). As long as you stick with the triangle, you're going to get bleed all the way across to some extent.

A bigger circle could have led to a bigger fan, which could have moved more air. Again, if you move enough air over the given surface area, you'll get cooling. Part of the issue was that there was less blower flow over those larger-surface-area fins, since they were using the middle of the fan largely to broadcast Wi-Fi. A fan shield dome would have been better: either Wi-Fi at the top of the "dome", or Wi-Fi built into the dome pillars themselves, and then more air pulled straight up through the center of the core.

Instead of trying to shoot for a smaller-than-Mac-Mini footprint (7x7 inches), if they had gone 7-8" wide and 10-11" high (to keep the same proportions) they would have had more slack in max thermal capacity without really being all that much bigger. (It would have been substantively heavier.)

It would have still diverged from the GPU hardware and software track over the next couple of years (given the same limited, AMD-only solution options), but it would have had a better ability to deal with overages from some workloads unanticipated by Apple.

I think a number of beneficial things could have occurred if the size of the 2013 Mac Pro grew even by only a few centimeters on both sides (length and width).

The most obvious benefit is that the thermal capacity would increase. But, perhaps, from an engineering perspective, the size of the 2013 Mac Pro was probably calculated with cost, logistics, rigidity, executability, etc. in mind. Like, Apple came up with its own golden rule for the size of the 2013 Mac Pro. And, that it had to be that size for it to have existed.

But, we can't help but speculate, since we're not engineers at Apple. And, so, yeah, if the 2013 Mac Pro grew, let's say, an inch on both sides, it probably could have increased its thermal load/limit. That is just physics. But, it also couldn't have, as you mentioned, diverted its GPU vendor's fate. It would have still been stuck with the D-series lineup of GPUs. Nor would it have saved the life of the GPUs allegedly failing, since we don't know how they failed, and, if they failed, whether the fault was thermal core size, thermal flow dynamics, etc....
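To make the "surface area, not fin count" point concrete, here is a toy Swift model; every dimension is invented, since the real core geometry isn't published:

```swift
// Toy fin model: all numbers here are made up for illustration. Each fin
// is treated as a flat plate, so its two-sided area is ~2 * height * depth.
struct FinBank {
    var fins: Int
    var heightMM: Double   // how tall the core is
    var depthMM: Double    // how far each fin reaches toward the center
}

func areaSquareMM(_ bank: FinBank) -> Double {
    2.0 * bank.heightMM * bank.depthMM * Double(bank.fins)
}

let asBuilt = FinBank(fins: 20, heightMM: 180, depthMM: 40)
let inchTaller = FinBank(fins: 20, heightMM: 205, depthMM: 40)  // +25 mm
print(areaSquareMM(asBuilt))    // 288,000 mm^2
print(areaSquareMM(inchTaller)) // 328,000 mm^2 -- ~14% more, no extra fins
```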

Anyway, I was bored and drew a quick and dirty sketch of how (in my non-thermal-engineering mind) I think it would perform if fan configurations changed.

I also gather that this can be tested in real life as well quite easily.

[attached image: macpro2018_05_NEW_4_WEB3.jpg]
 
Last edited:

deconstruct60

macrumors G5
Mar 10, 2009
12,311
3,902
I don't even pretend to understand the intricacies of this, but I think your response to this is from a very technical standpoint... for the poster you replied to, the simple answer is "Metal is kind of like OpenCL & OpenGL merged". That's the way Wikipedia puts it...

Wikipedia said:
Metal is a low-level, low-overhead hardware-accelerated 3D graphic and compute shader application programming interface (API) developed by Apple Inc., and which debuted in iOS 8. Metal combines functions similar to OpenGL and OpenCL under one API. It is intended to bring to iOS, macOS, and tvOS apps some of the performance benefits of similar APIs on other platforms, such as Vulkan (which debuted in mid-February 2016) and DirectX 12.

Now that may be a very simplified way of putting it, but unless you're actually a programmer using these APIs, it's apt. And I don't think understanding the subtle technical differences lends further understanding to why the 2013 MP failed (though I'd be interested to hear otherwise).

It is slightly oversimplified, but you highlight the wrong part for this discussion ("reasons why the Mac Pro failed"). I've underlined the part of that quote that is most salient to the discussion. It is only some coverage. It is not a fully equivalent substitute; what they are offering (especially in the computation dimension) is a smaller subset of what the alternatives offer.

It is more like an analogy where Apple is offering black-and-white film for a market where folks are using color film. You can still take a photo with B&W film, but it isn't color. If Apple went to the pro photography market and offered a digital camera that took spectacular B&W images, but no color, then they would have a problem.

Apple nuking OpenCL is an even worse path: they had a marginally different color digital camera (OpenCL vs. CUDA), but now they are going to black-and-white only (Metal). Honestly, Apple is basically handing CUDA (Nvidia) the portable computation business. CUDA was and is dominant (the Mac Pro 2013 was behind the curve), but they are basically blowing a large crater in the ground. They'll pragmatically be close to required to put an empty slot into the new system if they even want to remotely stay in that dimension of the target space.

Tossing OpenGL and OpenCL under the bus in the run-up to a new Mac Pro is probably going to do some damage to this next system also. One of the reasons for slow adoption of OpenCL by some folks was that computation kernels on x86 were just going to be more portable than trying to get onto a graphics GPU foundation. Apps had access to x86-family cores and large main RAM. Several of those folks didn't come out of being slow adopters when OpenCL on the Mac (and Mac Pro) wasn't a leader in OpenCL performance and stability. It failed at being a viable, open, growing, stable, portable platform. More of those folks jumped on the CUDA bandwagon than on OpenCL.

OpenCL shouldn't have to fail for Metal to move forward. (Frankly, I suspect that Apple expected the same from Nvidia: OpenCL/Metal shouldn't have to fail for CUDA to move forward, and that is a contributing reason why they are in the long-term dog house. But that myopic attitude wears on the more holistic system builders, like Apple, who buy foundational components for those higher-level systems.) Apple's approach is going to help iOS more than macOS. For the folks who have been building their own portability layer, it won't be much of a problem to create something that layers on both iOS and macOS. For more technical apps that don't have a large base to spread costs over, that probably won't work very well. (Apple will again see slower-than-"expected" adoption rates; what they are doing is more an impediment to cost-effective software evolution than an easing of it.)


Of course not capital... they bet the farm that the dual GPUs would propel the success of the 2013 Mac Pro. Was it the whole farm? Who cares, it's just a phrase. The point is that the dual GPUs were supposed to be the killer feature, and it didn't work out that way for a host of reasons.

But one of those reasons is lack of capital investment by Apple. Another contributing factor to the Mac Pro's problems is that, even if you look at the iMac Pro as a "next iteration" partially for the target market, it took Apple 4 years to do anything. Apple was snoring away there for at least 2 years doing a whole lot of nothing productive. Even more mind-boggling if you couch it as a 2014-2019 span.

If Apple hadn't decided to walk away from OpenCL, they could have pressed to get an iteration of the Mac Pro onto OpenCL 2.0. Apple never made it past OpenCL 1.2:

https://support.apple.com/en-us/HT202823

It would be one thing if they were walking away at 2.x, but they never even made it to 2.0 (Nvidia was more than partially a boat anchor on that, along with the inertia of still-active early Intel iGPUs); Apple could have done something to move forward. Stopping at 1.2 makes Metal "look better" with less of a gap, but that's cheesy.

If Apple isn't investing, why should their partner software vendors invest? That was a contributing factor to the slower-than-expected expansion of software solutions. Some of it is structural (e.g., software vendors' presumptions built into their code). Some of it is inherent to the application. [Apple's highly trailing-edge OpenGL implementation hasn't really driven high app adoption growth either.]


This is interesting... could be... but it would be kind of a weird approach to design a computer that needs custom GPUs, but because they know they can't sell enough Mac Pros with just a single GPU to make it worthwhile, they design it so each computer comes with two. Could be... but might be a stretch.

Thunderbolt works best with an embedded GPU. Apple has embedded GPUs in the rest of their lineup. What the rest of the lineup has is substantially more volume (and/or an "out of the box" solution for embedded graphics that doesn't cost much: the iGPU in the CPU package). Not being able to sell "enough" Mac Pros was an issue before Thunderbolt even appeared (Thunderbolt in and of itself didn't create the volume problem). The system prices are generally higher, which is a volume depressor all by itself.

Part of the issue is "markup" and margin also. Throw on top that there isn't just one GPU but three (which is another outlier from the rest of the lineup).



I can see my phrasing could be misconstrued, but what I was saying is that if they had taken the innards of the iMac and placed them in the form factor of the 2013 Mac Pro, it would have sold like hotcakes.

I think the problem with that specific approach was that they couldn't. Vega doesn't present a dramatically better thermal envelope. They might have been able to go with the iMac Pro internals and put two fans and just one GPU in there (with a second blower in the slot where the "compute GPU" goes, to pull more heat out of the core from that side faster). Besides, the annoying capped RAM options of the iMac Pro would be even more glaring if a just-as-kneecapped (power envelope) machine with the same components were sitting in the lineup next to it (likely far more fratricidal than 'hotcakes'). That's also not taking into account the stumble and fall Apple did on docking displays (they screwed up the product management on those too along the way).


I think a Mac Pro that is different on a wider set of dimensions could still share overlap in major components, because it is a different target audience.


Apple is never going to sell high-volume Mac Pro workstation-class computers again... there's just not enough market for macOS running on workstation-class hardware.

Again? Even in the 2009-2010 era they didn't have a relatively (to the rest of the Mac market) high-volume Mac Pro. You'd probably have to go back to the mid-course Power Mac timeline to find something that was relatively high volume.
 

rrl

macrumors 6502a
Jul 27, 2009
512
57
Hey, d60, a new year's resolution suggestion: Parsimony.

Less is more. Don't make us beg.
 

namethisfile

macrumors 65816
Jan 17, 2008
1,186
168
It is slightly oversimplified, but you highlight the wrong part for this discussion ("reasons why the Mac Pro failed"). I've underlined the part of that quote that is most salient to the discussion. It is only some coverage. It is not a fully equivalent substitute; what they are offering (especially in the computation dimension) is a smaller subset of what the alternatives offer.

It is more like an analogy where Apple is offering black-and-white film for a market where folks are using color film. You can still take a photo with B&W film, but it isn't color. If Apple went to the pro photography market and offered a digital camera that took spectacular B&W images, but no color, then they would have a problem.

Well, comparing the market/economic demography of Apple’s Mac Pro lineup to that of B&W photography vs. color photography is too black and white (pun intended), IMO. It is much too narrow-minded to simply say that the Mac Pro is an aging or dying or niche product. First of all, there are two kinds of B&W photography today—digital and film. The latter is a niche within a niche within, probably, another niche.... B&W film has persevered and resisted obsolescence by the very fact of its niche-ness.

To say that today the world is in color and Apple is stuck in black and white is too easy, and can be taken as a simple way of saying that they are niche. We get it.

But, all you need is a niche, just like B&W film, as we know.

Apple nuking OpenCL is an even worse path: they had a marginally different color digital camera (OpenCL vs. CUDA), but now they are going to black-and-white only (Metal). Honestly, Apple is basically handing CUDA (Nvidia) the portable computation business. CUDA was and is dominant (the Mac Pro 2013 was behind the curve), but they are basically blowing a large crater in the ground. They'll pragmatically be close to required to put an empty slot into the new system if they even want to remotely stay in that dimension of the target space.

I am not sure if Apple is nuking OpenCL in favor of Metal, just like photographers didn’t nuke black-and-white film when color film was invented. Now, I know that 99.9% of the world probably sees B&W film as dead. But, that doesn’t mean that .1% is meaningless or irrelevant. In fact, that .1% is probably the reason why people in art colleges are still learning to shoot, see, and develop B&W film, because a lot of our past was shot in B&W photos.

So, I don’t think that OpenCL is this and Metal is that. To me, they’re MetaLCL.

Meta Link Computer Language….

Tossing OpenGL and OpenCL under the bus in the run-up to a new Mac Pro is probably going to do some damage to this next system also. One of the reasons for slow adoption of OpenCL by some folks was that computation kernels on x86 were just going to be more portable than trying to get onto a graphics GPU foundation. Apps had access to x86-family cores and large main RAM. Several of those folks didn't come out of being slow adopters when OpenCL on the Mac (and Mac Pro) wasn't a leader in OpenCL performance and stability. It failed at being a viable, open, growing, stable, portable platform. More of those folks jumped on the CUDA bandwagon than on OpenCL.

Well, OpenGL can probably be thrown under the bus, TBH. Can’t we agree on that? I mean, why would Mac users want OpenGL? For ported OpenGL games that run like crap on Macs? I don’t see why it can’t be thrown under a bus at this point.

But, OpenCL… OpenCL is a computing language that’s open. So, one can’t just throw it under the bus, unless you throw out the driver, because the driver is probably partly made of OpenCL. Just kidding. I don’t know what I am talking about.

Ummm…. Isn’t that the reason why Apple wanted to make their own compute language? OpenCL was probably slow to be adopted because of its harder learning curve, and from the tiny amount of googling that I did, I read an account that mentioned OpenCL being more embraced by programmers than by data scientists (who preferred to use CUDA)…. So, with that anecdotal account that I read in some forum in mind: is Metal akin to CUDA in its simplicity? It seems that could be what Apple is aiming and hoping for.

But, since Metal is new, OpenCL and probably CUDA still eclipse it in performance. So, this is why MetaLCL is probably the proper way to see this world.

Apple being Apple just dropped the CL part and called it Metal. An Apple marketing-department tweak, so to speak.

OpenCL shouldn't have to fail for Metal to move forward. (Frankly, I suspect that Apple expected the same from Nvidia: OpenCL/Metal shouldn't have to fail for CUDA to move forward, and that is a contributing reason why they are in the long-term dog house. But that myopic attitude wears on the more holistic system builders, like Apple, who buy foundational components for those higher-level systems.) Apple's approach is going to help iOS more than macOS. For the folks who have been building their own portability layer, it won't be much of a problem to create something that layers on both iOS and macOS. For more technical apps that don't have a large base to spread costs over, that probably won't work very well. (Apple will again see slower-than-"expected" adoption rates; what they are doing is more an impediment to cost-effective software evolution than an easing of it.)

No, it doesn’t, as I have harped on above.

Metal will probably help macOS, too, in that Apple is at the helm, the master of its own sea.
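For what it's worth, the "own portability layer" idea from the quote above usually looks something like this sketch; all the names here are invented, with Metal filling in the backend on Apple platforms (and Vulkan or DirectX filling it in elsewhere):

```swift
import Metal

// A sketch of a thin portability layer: apps code against a small
// abstraction, and each platform supplies a backend. Names are invented.
protocol ComputeBackend {
    func makeBuffer(count: Int) -> ComputeBuffer?
    func run(kernel: String, on buffer: ComputeBuffer)
}

protocol ComputeBuffer {}

struct MetalBuffer: ComputeBuffer {
    let raw: MTLBuffer
}

struct MetalBackend: ComputeBackend {
    let device = MTLCreateSystemDefaultDevice()!

    func makeBuffer(count: Int) -> ComputeBuffer? {
        device.makeBuffer(length: count * MemoryLayout<Float>.stride,
                          options: .storageModeShared).map(MetalBuffer.init)
    }

    func run(kernel: String, on buffer: ComputeBuffer) {
        // Real code would look up `kernel` in an MTLLibrary and encode a
        // dispatch, as in the earlier sketch; elided here.
    }
}
```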

But one of those reasons is lack of capital investment by Apple. Another contributing factor to the Mac Pro's problems is that, even if you look at the iMac Pro as a "next iteration" partially for the target market, it took Apple 4 years to do anything. Apple was snoring away there for at least 2 years doing a whole lot of nothing productive. Even more mind-boggling if you couch it as a 2014-2019 span.

If Apple hadn't decided to walk away from OpenCL, they could have pressed to get an iteration of the Mac Pro onto OpenCL 2.0. Apple never made it past OpenCL 1.2:

https://support.apple.com/en-us/HT202823

It would be one thing if they were walking away at 2.x, but they never even made it to 2.0 (Nvidia was more than partially a boat anchor on that, along with the inertia of still-active early Intel iGPUs); Apple could have done something to move forward. Stopping at 1.2 makes Metal "look better" with less of a gap, but that's cheesy.

If Apple isn't investing, why should their partner software vendors invest? That was a contributing factor to the slower-than-expected expansion of software solutions. Some of it is structural (e.g., software vendors' presumptions built into their code). Some of it is inherent to the application. [Apple's highly trailing-edge OpenGL implementation hasn't really driven high app adoption growth either.]

Well, Xcode is free.

So, say I am a software developer; I don’t expect Apple to give me money or help me develop a killer app for macOS or iOS.

I think developing Metal is an investment in and of itself, and its future is as certain as CUDA's and OpenCL's. Nothing more and nothing less, I think.

Thunderbolt works best with an embedded GPU. Apple has embedded GPUs in the rest of their lineup. What the rest of the lineup has is substantially more volume (and/or an "out of the box" solution for embedded graphics that doesn't cost much: the iGPU in the CPU package). Not being able to sell "enough" Mac Pros was an issue before Thunderbolt even appeared (Thunderbolt in and of itself didn't create the volume problem). The system prices are generally higher, which is a volume depressor all by itself.

Part of the issue is "markup" and margin also. Throw on top that there isn't just one GPU but three (which is another outlier from the rest of the lineup).


I am not exactly sure of the point here. Yeah, Thunderbolt is more necessary if one has an iGPU, because an iGPU is not that powerful.


And, the state of TB2 eGPU compatibility on the 2013 Mac Pro is nonexistent? Or did Apple artificially cripple the 2013 Mac Pro out of the eGPU support that current TB3 Macs have?


I think the problem with that specific approach was that they couldn't. Vega doesn't present a dramatically better thermal envelope. They might have been able to go with the iMac Pro internals and put two fans and just one GPU in there (with a second blower in the slot where the "compute GPU" goes, to pull more heat out of the core from that side faster). Besides, the annoying capped RAM options of the iMac Pro would be even more glaring if a just-as-kneecapped (power envelope) machine with the same components were sitting in the lineup next to it (likely far more fratricidal than 'hotcakes'). That's also not taking into account the stumble and fall Apple did on docking displays (they screwed up the product management on those too along the way).


If what you are asking is why Apple didn’t release an iMac Pro in a 2013 Mac Pro chassis, then I have already answered this, scattered throughout MR.


Basically, Apple didn’t make a 2017 Mac Pro (same as the iMac Pro) for $4000 or $4500 because Apple wants to start from scratch with a different vendor. And, the stars seem to be somewhat aligned, in that the other GPU vendor has already released their newest, shiniest lineup and recently added a new, more workstation-focused one named after one of Saturn’s moons, which is even dressed to match one of Apple’s color choices, Rose something.

I think a Mac Pro that is different on a wider set of dimensions could still share overlap in major components, because it is a different target audience.

Again? Even in the 2009-2010 era they didn't have a relatively (to the rest of the Mac market) high-volume Mac Pro. You'd probably have to go back to the mid-course Power Mac timeline to find something that was relatively high volume.

Well, that is because in the time of the Power Macs, a PowerBook and a Power Mac essentially used the same CPU architecture. It didn’t have the same market segmentation that Intel CPUs have, where you have a mainstream lineup and a lineup called Xeons.
 
Last edited:

orph

macrumors 68000
Dec 12, 2005
1,884
393
UK
lol

I'd love to know the history behind it one day; maybe I'll look at some books from ex-Apple workers some time (anyone have any recommendations?)

What I really want to see is the Mac Pro shape with Mac Mini-level hardware and price :D
Still think it looks cool.

And really, money is why it failed; if they had sold like hotcakes, Apple would have done what was needed to make it work.

The big mystery is why it is still sold today?
Has Apple got large support contracts?
Is it actually selling better than we think?
Did Intel/AMD have piles of CPUs/GPUs that they're giving to Apple like candy?
(I have an image in my head of Intel/AMD with piles of old stock just shoveling them into the bite in the Apple logo :apple:)
 

namethisfile

macrumors 65816
Jan 17, 2008
1,186
168
From what I have gathered, it seems that there is a prevailing thread as to why (for the most part) people think that the 2013 Mac Pro failed. And, perhaps, one other person here expanded the so-called failure to include a kind of Rip Van Winkle effect, which is fair from a telescopic perspective looking out onto the landscape below that is Apple land. But, not everything in a telescope is viewable. We might be able to see far things closer, but not wider. And, if we look at the view with just our eyes, to get the natural wider peripheral view of human vision, then we lack the tool necessary to find a subject. Everything is just one big object, so to speak.

So, the question must ask about both subject and object, and ask if it is the subject that is a failure, or the object that is a failure, or both….

It would be telescopic to say that it is a failure because it lacks SATA ports, PCIe slots, drive bays, etc.

In contrast, it is too wide a view to say that if the next Mac Pro returned to 2012 form that it would be a success.

Another way to say it is that it is too subjective to say it is failure because it doesn’t have PCIe slots.

Thus, it is also too objective to say that if it does have PCIe slots, it will succeed.

The same dilemma can be seen on the Windows platform even if it has machines in every shape and size. And, in every price category.

The competition might be fiercer because of the number of manufacturers in the Windows arena. But, that is up to the manufacturers themselves to worry about. And, it is an arena that totally ignores that there is such a thing as a Mac. Whereas Apple cannot ignore that Windows exists.

With that said, Apple doesn’t have to cater to all. Its success is that it can create the 2013 Mac Pro.

Even, if 95% of the world thinks it is a failure.
What I really want to see is the Mac Pro shape with Mac Mini-level hardware and price :D
Still think it looks cool.

Hopefully, in 2-3 years we can buy a used one at Mac Mini price!

And really, money is why it failed; if they had sold like hotcakes, Apple would have done what was needed to make it work.

The big mystery is why it is still sold today?
Has Apple got large support contracts?
Is it actually selling better than we think?
Did Intel/AMD have piles of CPUs/GPUs that they're giving to Apple like candy?
(I have an image in my head of Intel/AMD with piles of old stock just shoveling them into the bite in the Apple logo :apple:)

I think having iPhones as hot-ticket items helps the Mac Pro here. Sort of like, if one is an artist who makes no money, it is helpful to have a wife who does make money.

But, once in a while this artist gets a show and makes a lot of money in a few days. That's probably how a ton of Mac Pros might be moved for a production here, a movie there, and then none for the most part, except for a couple of individuals here and there... I don't really know!

It's being sold and moved, I think, in quite a number of productions and offices and institutions, and its size probably keeps it off the secondhand market, because they're easy to move around and find a second, third, or even fourth home and usage.

I don't think the 2013 Mac Pro still being on sale now has anything to do with a backlog of stock or anything like that. Least likely of the scenarios, I think.
 
Last edited:

rockyromero

macrumors 6502
Jul 11, 2015
468
147
With that said, Apple doesn’t have to cater to all. Its success is that it can create the 2013 Mac Pro.

Apple will create the 2019 Mac Pro, and it will be expensive. Crazy expensive when deliveries start in 2020.

I have two 2013 Mac Pros and will buy another one in 2019, ramped up to 128GB memory and an OWC 4TB SSD. That may tide me over until 2021, when prices start to lower for a used 2019 Mac Pro.

OWC will provide aftermarket memory and storage to wild heights. It’s possible that a fully maxed 2019 Mac Pro would extend to 512GB memory and the newly announced 100TB SSDs.

Or maybe a Hackintosh instead.

 

jscipione

macrumors 6502
Mar 27, 2017
427
242
The simple answer is that the Mac Pro was a failure because it sold poorly. Apple's ongoing quest for miniaturization over customization was applied to the Mac Pro to disastrous results. After the machine had proven a market failure, there was little reason to update it, so it languished for years, and here we are (5 years later).
 

deconstruct60

macrumors G5
Mar 10, 2009
12,311
3,902
The simple answer is that the Mac Pro was a failure because it sold poorly. Apple's ongoing quest for miniaturization over customization was applied to the Mac Pro to disastrous results. After the machine had proven a market failure, there was little reason to update it, so it languished for years, and here we are (5 years later).

The languishing had at least as much to do with the market the system was being sold into as with the system itself. If there were 10M people per year wanting to buy a Mac Pro, there would be little good reason to let it languish for years. If the basic market were around three orders of magnitude smaller (e.g., 10K), then languishing (given the minimal volume threshold that Apple has on products) would be a reasonable response.

There have been two large gap periods: 2010-2013 and 2014-2019. That isn't indicative of a rapidly growing and expanding market that Apple is passing up on purpose. Sales probably weren't particularly 'hot' in the first place. Folks like to point to "chicken and egg" (Apple is slow, so I'm slow), but indications from the overall market are that buyers, high and low, are buying slower. It isn't a uniform rate at all scales, but system usage cycles are getting longer.

To a large extent, low growth gets low effort from Apple in general.

The long wait lists for the 2013 in the first 4-6 months of its rollout don't back up "sold poorly". It sold well enough for Apple to pursue an iMac Pro at some point, and well enough that they didn't kill off the whole product entry (e.g., as with the Xserve). Rightsizing Apple's expectations and efforts is far more the bigger issue. As Apple sold more and more Macs (5M/year -> 10M/year -> 20M/year) while the Mac Pro space probably stayed constant (or declined a bit relatively), how is Apple going to justify staying in the game when the standard practice is to "invest in growth"?
 
  • Like
Reactions: ixxx69

AidenShaw

macrumors P6
Feb 8, 2003
18,667
4,676
The Peninsula
The long wait lists for the 2013 in the first 4-6 months of its rollout don't back up "sold poorly".
This really doesn't have anything to do with selling "well" or "poorly".

It simply means that there was a "pent up demand" for the product, and the initial supplies were short of that "pent up" demand.

This makes perfect sense to bean counters. Apple basically has three options in that scenario:
  • Start making the systems at a "sustainable rate" and stockpile a bunch so that the burst of initial purchases will be met from the stockpile. (Not good for bean counters, since the stockpile is "inventory" and inventory is bad.)
  • Overprovision the assembly line for initial demand, then cut back to a sustainable level when sales fall back to "normal". (Not good for bean counters, since the overprovisioning is expensive and there's bad press when you lay off the extra staff.)
  • Make systems at the expected sustainable demand, and make sure that the initial long waits are publicly blamed on "incredible demand" for the systems. (Good for marketing and bean counters. Especially good if you make sure that the systems that are initially available are well optioned and expensive. "Sorry, we don't have the 12 GiB quad core in stock, but we do have a 32 GiB hex core....")
Apple typically does the third option.
 
Last edited:
  • Like
Reactions: Nugget

jscipione

macrumors 6502
Mar 27, 2017
427
242
There have been two large gap periods: 2010-2013 and 2014-2019.

Part of the reason for the gaps is the market, and part of it has to do with how Tim Cook runs Apple. The earlier gap period had to do with Apple focusing on the iPhone at the expense of the Mac and dropping server-related products, while the second gap has to do with poor sales of the cylinder Mac Pro. Yes, initially demand for the machine was high, as the market was desperate for pro Mac hardware, but after the initial sales wore off, the demand did too. When the product reviews came out showing real-world Mac Pro setups, it was clear that the Mac Pro did not produce good value compared to its predecessor. After a few years, pro customers sold their Mac Pros and bought either an iMac or a PC. This is the period in which Premiere really started to overtake FCPX as video editors largely switched to PC.

Tim Cook, being an operations guy, prioritizes the biggest money-making products first. This means that Apple comes out with a new iPhone every year, then prioritizes the Mac laptops, then the iPad, then the iMac, and then somewhere after that the Mac Pro and Mac Mini fell off the radar and were simply ignored. Finally, in mid-2017, after years of neglect, Apple decided that it needed to make one last Mini and one last Mac Pro. It seems like Apple must have redirected the iMac team this time, since there was no new iMac in 2018.
 

namethisfile

macrumors 65816
Jan 17, 2008
1,186
168
The languishing had at least as much to do with the market the system was being sold into as with the system itself. If there were 10M people per year wanting to buy a Mac Pro, there would be little good reason to let it languish for years. If the basic market were around three orders of magnitude smaller (e.g., 10K), then languishing (given the minimal volume threshold that Apple has on products) would be a reasonable response.

I would like to add that the core market, if it exists (and it does), doesn’t rely on annual cycles and annual performance bumps that are marginal at best, so it is better to let it languish (to use your wording) than to let it anguish (my own wording).

I think it would have been more of an anguish had Apple updated the Mac Pro in 2016, when suitable Broadwell Xeons were released. It would be anguishing that the CPU would get a bump and not the GPUs.

So, let it languish. Not for the lack of caring, per se, or focus. But, to avoid the anguish.

There have been two large gap periods: 2010-2013 and 2014-2019. That isn't indicative of a rapidly growing and expanding market that Apple is passing up on purpose. Sales probably weren't particularly 'hot' in the first place. Folks like to point to "chicken and egg" (Apple is slow, so I'm slow), but indications from the overall market are that buyers, high and low, are buying slower. It isn't a uniform rate at all scales, but system usage cycles are getting longer.

Well, I am sure that data analytics for this have become so precise over the years that corporations can look at them and predict sales. And, Apple probably has these analytics, saw them, and said, “We’re good. We’re not anguishing, but languishing.”

Also, gaps aren’t gaps if there are products to fill them. Just because Apple doesn’t have a cMP 5,1 with iMac Pro innards doesn’t necessarily point to a gap, or a desire unfulfilled, or 5 guaranteed sales not ordered; nor does it mean that it is even Apple’s obligation to fill it.

To a large extent, low growth gets low effort from Apple in general.

That makes sense. Slow growth means, “languish over anguish.”

The long wait lists for the 2013 in the first 4-6 months of its rollout don't back up "sold poorly". It sold well enough for Apple to pursue an iMac Pro at some point, and well enough that they didn't kill off the whole product entry (e.g., as with the Xserve). Rightsizing Apple's expectations and efforts is far more the bigger issue. As Apple sold more and more Macs (5M/year -> 10M/year -> 20M/year) while the Mac Pro space probably stayed constant (or declined a bit relatively), how is Apple going to justify staying in the game when the standard practice is to "invest in growth"?

These are the analytics Apple has their hands on. And, being a bigger and bigger corporation with bigger and bigger ambitions means they have to move slower than even the speed of the growth, which we can agree is slowing.

This really doesn't have anything to do with selling "well" or "poorly".

It simply means that there was a "pent up demand" for the product, and the initial supplies were short of that "pent up" demand.

Demand and pent-upness are actually more subjective than selling well or selling poorly, because those are numbers that can be measured, whereas there is no way of measuring pent-up demand.

Part of the reason for the gaps is the market and part of it has to do with how Tim Cook runs Apple. The earlier gap period had to do with Apple focusing on the iPhone at the expense of the Mac and dropping server related products while the second gap has to do with poor sales of the cylinder Mac Pro. Yes, initially demand for the machine was high as the market was desperate for a Pro Mac hardware but after the initial sales wore off the demand did too. When the product reviews came out showing real world Mac Pro setups it was clear that the Mac Pro did not produce good value compared to its predecessor. After a few years pro customers sold their Mac Pros and bought either an iMac or a PC. This is the period in which Premiere really started to overtake FCPX as video editors largely switched to PC.

Tim Cook, being an operations guy, prioritizes the biggest money-making products first. This means Apple comes out with a new iPhone every year, then prioritizes the Mac laptops, then the iPad, then the iMac; somewhere after that, the Mac Pro and Mac mini fell off the radar and were simply ignored. Finally, in mid-2017, after years of neglect, Apple decided it needed to make one last Mini and one last Mac Pro. It seems like Apple must have redirected the iMac team this time, since there was no new iMac in 2018.

Like I mentioned above, gaps aren’t meant to be filled if they are natural.

To me, the gap is not 2013-present. It is more 2016-present.

2016 was really when the gap became unnatural. At that point, I think Apple made a decision to hurdle the gap instead of filling it, or as I put it, “languish over anguish.”

Tim Cook prioritizing iPhones over Macs is a meme at this point. One can’t really know unless one is in the know.
 

AidenShaw

macrumors P6
Feb 8, 2003
18,667
4,676
The Peninsula
Demand and pent-upness are actually more subjective than selling well or selling poorly, because the latter are numbers that can be measured, whereas there is no way of measuring pent-up demand.
I think that an initial spike of orders (and long shipment delays) is a pretty good metric for "pent up demand".
 

thefredelement

macrumors 65816
Apr 10, 2012
1,193
646
New York
It depends on who you ask. To me it's not a failure; I use it every day. I don't regret buying it, and I still don't feel pressure to upgrade to a new machine.

I've had the same external things attached to it that I attached to my old Mac Pro, except I added an external 2TB Thunderbolt drive, which combined with the Mac Pro is still a heck of a lot smaller than my old 2009. It's substantially quieter and uses less energy.

It's paid for itself over and over.

Mine is used primarily for software development.
 

namethisfile

macrumors 65816
Jan 17, 2008
1,186
168
It’s a failure because... they wanted the nMP to fail.

By "They," I assume, you mean Apple and not the "They," aka "the RIP Mac Pro 2006-2012 coalition...."

Or, the computer box with PCIe slots coalition...

Or, the HP Z840 but Apple Mac Pro Z840 coalition...

Or, the Rip Van Winkle One-Man coalition...

Or, you get the drift....

But each of the coalitions could be a little right. So, together, the right equals more than 50%, in which case the case (pun intended) would be in favor of the coalitions: that, indeed, "They (the coalition) wanted the nMP to fail" by virtue of its aesthetics/design philosophy and its lack of updates between 2013 and 2018.

Now, I would like to add my own coalition, called the "Wait a minute," or "Hold your horses, over there," coalition.

Wait a minute! And hold your horses there, sir, because...

1) The cMP saw at least five revisions (MP1,1; MP2,1; MP3,1; MP4,1; MP5,1).

1a) These updates included a tweak of the cMP chassis into what would become the 5,1 chassis that everyone here who hates the nMP harps on as their "perfect" Mac Pro. This 5,1 chassis, I would like to add, seems to have been revamped from the 3,1 chassis to facilitate ease of use even further. The internal layout seems to have changed from the MP 3,1 to the MP 4,1/MP 5,1 almost organically, as if Apple's chassis designers had this wisdom built in. Or it could just be considered lucky design, or they simply had a big enough chassis to work with to make the internal changes look seamless and organic. And, indeed, one can say the cMP chassis is "big enough."

1b) The MP4,1/MP5,1 is so spacious that the middle fan in the PCIe slot compartment is offset some inches inside the case from the front grill.

2) We also saw three CPU architecture changes in the cMP from 2006 to 2012. If I am correct, they are Intel Woodcrest in the MP1,1, Intel Harpertown/Penryn in the MP3,1, and then Intel Nehalem/Westmere in the MP4,1/MP5,1.

2a) From 2010 to 2012, the cMP 5,1 SKU continued when the 2012 MPs were introduced.

2b) So, basically, the MP5,1 lasted from 2010 to 2012.

2c) Since the MP4,1 shares its CPU architecture with the MP5,1, one can say the cMP remained similar from 2009 to 2012; minor tweaks.

3) With my wait-a-minute list and hold-your-horses-there-sir argument above, we can observe that the Apple Mac Pro from 2006 to 2012 got one big tweak, and the rest were minor tweaks, at the very least.

4) So, to conclude, when one looks at the big picture, it is not that unprecedented that the 2013 nMP didn't get tweaked at all from 2013 to 2018.

4a) So, Apple did not, in my opinion, neglect the nMP. It seems more par for the course for the kind of machine the nMP is.

4b) And one can even say that Apple learned something from the cMP of 2006-2012: they didn't tweak it because they didn't have to.

4c) The "thermal corner" the Apple person talks about is probably just an excuse to say, "We're doing something new (not just a tweak), so hold your horses, there, sir!"
 

pl1984

Suspended
Oct 31, 2017
2,230
2,645
By "They," I assume, you mean Apple and not the "They," aka "the RIP Mac Pro 2006-2012 coalition...."

Or, the computer box with PCIe slots coalition...

Or, the HP Z840 but Apple Mac Pro Z840 coalition...

Or, the Rip Van Winkle One-Man coalition...

Or, you get the drift....

But, each of the coalition could be a little right. So, together, the right equals more than 50%, in which case, the case (pun intended) would be in favor of the coalition and that, indeed, "They (the coalition) wanted the nMP to fail in virtue of its aesthetics/design philosophy and its lack of updates between the years of 2013-2018.

Now, I would like to add my own coalition , called the "Wait a minute," or "Hold your horses, over there."

Wait a minute! And, hold your horse, there, sir because...

1) The cMP we saw updated at least 5 times ( MP1,1; MP2,1; MP3,1; MP4,1; MP5,1).

1a) These updates included a tweak of the cMP chasis to what would become the 5,1 chasis that everyone harps on here who hate the nMP as their "perfect" Mac Pro figure. This 5,1 chasis, I would like to add seems to have been revamped from the 3,1 chasis in order to facilitate ease-of-use even further. But, the internal layout between the MP 3,1 and the MP 4,1/MP 5,1 seems to have changed into the 4,1 almost organically, though. It was as, if, Apple cMP chasis designers had this wisdom built-in to the cMP chasis. But, it could just be considered lucky design, or they had a big enough chasis to work with to make the internal changes look seamless and organic. And, indeed, one can say, the cMP chasis is "big enough."

1b) The MP4,1/MP5,1 is so spacious that the middle fan in the PCIe slot compartment is off-set some inches inside the case from the front grill of the case.

2) We also saw three CPU architectural changes put in the cMP from 2006 to 2012. If, I am correct, they are Intel Woodcrest in the MP1,1; Intel Harpertown/Penryn in the MP3,1; And, then Intel Nehalem/Westmere in the MP4,1/MP5,1

2a) From 2010 to 2012, the cMP 5,1 skew continued when the 2012 MP's were introduced.

2b) So, basically, the MP5,1 lasted from 2010-2012.

2c) Since the MP4,1 share CPU architecture with MP 5,1, one can say that cMP from 2009-2012 remained similar; minor tweaks.

3) With my Wait a minute list and hold your horses there, sir, argument above, we can observe that Apple Mac Pro from 2006 to 2012 got one big tweak and the rest were minor tweaks, at the very least.

4) So, to conclude, it is not that unprecedented that the 2013 nMP didn't get tweaked at all from 2013-2018 when one looks at the big picture.

4a) So, Apple did not, in my opinion, neglect the nMP. It seems more par per course to the kind of machine that the nMP is.

4b) And, one can even say that Apple learned something from the cMP of 2006-2012. What that is seems to be that they didn't tweak it because they didn't have to.

4c) The thermal corner, Apple person talks about is probably just an excuse to say, we're doing something new (not just a tweak) so, hold your horses, there, sir!
I think you're using a lot of twisted logic to arrive at 4a. The cMP saw regular updates reflecting the technology available when each model was introduced. I see nothing preventing Apple from updating the nMP accordingly. I see nothing in the nMP design which would prohibit the configuration of E5 v3 and E5 v4 processors. I see nothing in the nMP design which limits its SSD to 1TB. I see nothing in the nMP design limiting it to TB2. The only design limit of the nMP might be the GPUs. But then why not remove one and use the recovered space / power / thermal cooling for a single-GPU variant?
 

namethisfile

macrumors 65816
Jan 17, 2008
1,186
168

Nothing twisted about it; just an opinion, TBH. I stated those points to create a map in the reader's mind of the changes that occurred in the cMP, so as to build an understanding, perhaps, of Apple's intention, direction, and ultimately, goal.

It shows that the cMP got tweak after tweak until eventually the MP4,1 came along, and the case was so spacious that the middle fan that blows air into the PCIe compartment is offset from the front grill of the case. This is surely a design/aesthetic element to fill that cavernous space with "something," be it a fan. And since it is offset, why not make it serve a dual role as a PCIe card holder too, which it does?

An argument can then be made from this design decision that the cMP is too big, too cavernous: Apple had to offset the fan from the grill and then assign it a second role as a PCIe card holder.

And so, when Intel released the appropriate technology in 2013, Apple miniaturized the Mac Pro into the new Mac Pro. No more empty space. It is smaller, and more powerful to boot.

Obviously, not everyone was a fan of the idea.

As to why it didn't receive regular tweaks like its older sibling, I can only say that perhaps appropriate parts weren't available, and/or the technology advances of 2013-2015 did not merit the same kind of updates the cMP saw when it went from cMP1,1 to cMP2,1... or cMP4,1 to cMP5,1, where the update did not really move the performance needle that much.

The year the nMP could have been updated was 2016, when Broadwell Xeons were released. But I don't think there was an appropriate GPU to pair with it. Still, Apple could have updated the nMP with 2016 tech: basically a Broadwell Xeon, the same D-series GPUs, but with TB3 this time.

For some reason, Apple didn't do this, and it could be that the "thermal corner" the Apple person mentioned is a real corner to negotiate, and that it wasn't worth the effort for an update as minor as a Broadwell CPU and TB3.

Apple instead decided they could negotiate this corner with a tweak of the iMac chassis.

Perhaps Apple had a 2016 nMP engineering sample ready with Broadwell CPUs and TB3 but found corporate reasons not to go ahead with it. Or perhaps it was the Apple design group that decided not to mess with the 2013 nMP, since the new, new 2016 Mac Pro they had in their lab was just a tiny bit larger (to negotiate that so-called thermal corner), and Apple's designers weren't in love with the idea of a 2016 Mac Pro that differed in size from the 2013 nMP.

Instead, they went with the iMac chassis, where they were able to negotiate the thermal corner without changing the iMac's external dimensions, merely changing its color to space gray to differentiate it.
 

pl1984

Suspended
Oct 31, 2017
2,230
2,645
My head was spinning from reading that and your subsequent response, thus my characterization of it as twisted. There's nothing in the nMP design preventing it from receiving some updates. The fact that Apple has chosen to do zero updates demonstrates even Apple considers it a failure. No amount of spin will change that.
 