
Do you agree?

  • yes

Votes: 26 (32.1%)
  • no

Votes: 45 (55.6%)
  • I like traps

Votes: 10 (12.3%)

  • Total voters
    81

danielwsmithee

macrumors 65816
Mar 12, 2005
1,135
410
The only thing the AMD people and reviewers are focusing on is gaming. They are forgetting that there is a whole world of computing out there, such as virtualisation, an area where AMD has never been great compared to Intel (which isn't always their fault; some software simply doesn't support AMD). Saying that Ryzen is the best for an iMac is just plain stupid as long as nobody looks at all the other kinds of computing, because one should not forget that the iMac has more uses than gaming alone. The jury is still out.
Dang dude, talk about stupidity, spewing nonsense when you clearly haven't done any research or reading on the topic. AMD has gotten killed in the gaming reviews for Ryzen because everything run on it shows that it is not a good chip for gaming but is great at all the other things, like content creation and virtualization. You're saying the exact opposite of reality.
 
  • Like
Reactions: Wuiffi and Malus120

dyn

macrumors 68030
Aug 8, 2009
2,708
388
.nl
But the CPUs used in the iMacs (the mainstream desktop Core i5/Core i7s) completed their rollout in January, so I fail to see how this is relevant.
That's not what everybody else seems to think, though.

The issue in the Reddit post you linked to in your previous post seems to have been resolved, unless I'm misunderstanding it. And even some of the articles on the site you linked to seemed to imply that problems had already been worked out, or are quickly being worked out. Now, I'm not an expert in this field, so if you want to point us towards specific articles that better highlight specific issues, that'd be awesome.
They are not resolved at all. What they did was work around the issue, which seems to be working well, but they are not sure of the entire impact of their fix, which is why they are not recommending it. Nor does VMware: if it isn't on their HCL, then there is absolutely no guarantee whatsoever (and usually it also means it simply doesn't work, or doesn't work reliably/properly). The software needs to be updated, but this is going to be a slow process because VMware shares code between its virtualisation products (about 90%), and they aren't the only ones. Also, AMD is not used that much in that area, so there is very little reason to spend resources on development. This also applies to Xen/XenServer and others.

The other issue is running Linux as a guest OS, which also seems not to be working. If it means that the guest OS also needs kernel 4.10 or higher, then this can be very problematic, as you cannot use older systems (anything like RHEL or CentOS is out of the question, and both are used a lot).
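As a rough illustration of that kernel floor, here is a minimal sketch (assuming a Linux guest and Python; the version parsing is deliberately simplified, and distro kernels often backport features, so treat it as a heuristic only):

```python
# Heuristic sketch: does this Linux guest's kernel meet the 4.10 floor
# mentioned above? Real distro kernels may backport Ryzen fixes, so a
# version check like this is illustrative, not authoritative.
import platform

def kernel_at_least(major: int, minor: int) -> bool:
    release = platform.release()        # e.g. "3.10.0-514.el7.x86_64"
    parts = release.split(".")
    return (int(parts[0]), int(parts[1])) >= (major, minor)

print("kernel >= 4.10:", kernel_at_least(4, 10))
```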

That said, I think we're getting a bit off topic. While it's certainly educational, and definitely relevant to Ryzen's suitability for certain kinds of server use cases, I don't think the things you describe above (and much of what you have linked to) are really relevant to 99.9% of iMac users, nor do I think they are things Apple is considering when designing consumer (or even Pro) Macs. Most people don't buy a Mac (particularly an iMac) so they can install server-grade Linux or VMware ESXi and turn it into a VM farm.
As stated above, the code used in ESXi is the same code used in VMware Fusion and Workstation. VMware Fusion is one of the main virtualisation products used on Macs (and this kind of software is used an awful lot!). So this really isn't an easy, quick fix; there is much more to it, because they have to make the changes in code that is shared across various products.

The majority of users' experience with virtualization on a Mac is limited to VMware Fusion or Parallels, which should be easily adaptable to work with AMD's virtualization instructions. Even developers probably aren't running more than a few instances inside software like this on top of macOS, so I honestly fail to see how this would impact even a fraction of iMac users.
You couldn't be more wrong here. Virtualisation and containers (think Docker) play a major role in today's way of developing things. We have things like Vagrant, which was created by a developer in order to make his development life simpler; it's now a company producing several pieces of development tooling. You are underestimating the importance of virtualisation and the use of iMacs in the development community.

The problem here is that you are taking things way too literally. It was merely an example showing that there are many more reasons why Apple uses Intel CPUs instead of AMD ones and why they cannot simply switch from one to the other. Apple has to account for a lot of different kinds of use cases, because if they don't...well...you get what happened to the Mac Pro. In the end it's about having a computer that works, and works reliably, which means it won't have certain parts or the latest and greatest.

Dang dude, talk about stupidity, spewing nonsense when you clearly haven't done any research or reading on the topic. AMD has gotten killed in the gaming reviews for Ryzen because everything run on it shows that it is not a good chip for gaming but is great at all the other things, like content creation and virtualization. You're saying the exact opposite of reality.
Now you may argue that I didn't word it that well, but do read the posts that followed instead of immediately hitting the reply button. You are missing a lot of context, and thus the fact that I was actually referring to the part of the quote in bold: the fact that they all test for gaming because AMD is such a good gaming CPU (whether or not that is true doesn't matter). They don't go through Linux benchmarks, test whether virtualisation works properly, or try any other software that matters to people buying iMacs (which would be very relevant in this topic). Basically, the reviews are not useful for the discussion here.

Btw, the people who looked into virtualisation found the same thing as before: you need to stick with Intel if you don't want to run into all sorts of issues. Which is to be expected, since the virtualisation world has been built around Intel for ages.
 

yujun

macrumors newbie
Apr 18, 2017
2
2
Shanghai
A Ryzen 1700 can be downclocked to 2.8GHz and draws only 35W, and it's an 8-core/16-thread CPU that supports ECC DDR4 RAM. Now I would like to see a MacBook Pro being really pro by using this chip and a GPU like a downvolted RX 550.
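For anyone wondering how a 65W-class chip could get down to roughly 35W: dynamic power scales roughly with frequency times voltage squared, so a modest downclock combined with an undervolt compounds quickly. A back-of-the-envelope sketch with illustrative numbers (the clocks and voltages below are assumptions, not measured Ryzen 1700 figures):

```python
# Downclock + undervolt savings sketch: dynamic power ~ f * V^2.
# All inputs are illustrative assumptions, not measured figures.
def relative_power(f_new, f_old, v_new, v_old):
    return (f_new / f_old) * (v_new / v_old) ** 2

stock_watts = 65.0                          # assumed stock draw
scale = relative_power(2.8, 3.0, 1.0, 1.2)  # 3.0GHz @ 1.2V -> 2.8GHz @ 1.0V
print(f"~{stock_watts * scale:.0f} W")      # ~42W, in the ballpark of the 35W claim
```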
 

Malus120

macrumors 6502a
Jun 28, 2002
679
1,412
That's not what everybody else seems to think, though.


They are not resolved at all. What they did was work around the issue, which seems to be working well, but they are not sure of the entire impact of their fix, which is why they are not recommending it. Nor does VMware: if it isn't on their HCL, then there is absolutely no guarantee whatsoever (and usually it also means it simply doesn't work, or doesn't work reliably/properly). The software needs to be updated, but this is going to be a slow process because VMware shares code between its virtualisation products (about 90%), and they aren't the only ones. Also, AMD is not used that much in that area, so there is very little reason to spend resources on development. This also applies to Xen/XenServer and others.

The other issue is running Linux as a guest OS, which also seems not to be working. If it means that the guest OS also needs kernel 4.10 or higher, then this can be very problematic, as you cannot use older systems (anything like RHEL or CentOS is out of the question, and both are used a lot).


As stated above, the code used in ESXi is the same code used in VMware Fusion and Workstation. VMware Fusion is one of the main virtualisation products used on Macs (and this kind of software is used an awful lot!). So this really isn't an easy, quick fix; there is much more to it, because they have to make the changes in code that is shared across various products.


You couldn't be more wrong here. Virtualisation and containers (think Docker) play a major role in today's way of developing things. We have things like Vagrant, which was created by a developer in order to make his development life simpler; it's now a company producing several pieces of development tooling. You are underestimating the importance of virtualisation and the use of iMacs in the development community.

The problem here is that you are taking things way too literally. It was merely an example showing that there are many more reasons why Apple uses Intel CPUs instead of AMD ones and why they cannot simply switch from one to the other. Apple has to account for a lot of different kinds of use cases, because if they don't...well...you get what happened to the Mac Pro. In the end it's about having a computer that works, and works reliably, which means it won't have certain parts or the latest and greatest.

I understand that virtualization is very important to developers (and lots of other users). I can also appreciate the need for stability (obviously). In the same way that you think I take your examples too literally, I feel like you extrapolate from very specific edge cases, leading you to overestimate the actual challenges of getting macOS, Ryzen, and virtualization working well together inside Apple's relatively small ecosystem (Docker, for example, seems to work off of macOS's Hypervisor framework, so as long as Apple makes sure the framework is compatible, this should work).

I also think that Apple putting Ryzen in even part of its product lineup would put a lot of pressure on (Mac) virtualization providers to get things working right.
(Side note: given that they culled the majority of the development staff last year, if you use VMware Fusion extensively on a Mac, I think you should be concerned even if Apple sticks exclusively to Intel.)

I would also argue that while many developers do use the iMac, that doesn't make developers its core demographic. Apple builds iMacs to appeal to a wide range of people in the consumer, prosumer, and pro spaces. The iMac is a tool for everything from simple home use and office tasks to creative work (photo/video/music), development, and science.

While I could definitely see some of the issues you mentioned being possible showstoppers in a future Mac Pro (again, if left unresolved, which I personally feel is unlikely), I don't believe the same argument necessarily holds water for the iMac. As a general-purpose machine, the benefits for the majority of use cases may well outweigh the (hopefully temporary) costs to a minority of highly specialized tasks.

That said, I think the area where we can perhaps agree is that if Apple does decide to offer Ryzen in a Mac, it will be (or at the very least should be) because they are confident that they can do it right. They will have been working closely with AMD on this for a while, and will have made sure that, for the vast majority of users, the underlying x86 CPU is completely transparent.


Now you may argue that I didn't word it that well, but do read the posts that followed instead of immediately hitting the reply button. You are missing a lot of context, and thus the fact that I was actually referring to the part of the quote in bold: the fact that they all test for gaming because AMD is such a good gaming CPU (whether or not that is true doesn't matter). They don't go through Linux benchmarks, test whether virtualisation works properly, or try any other software that matters to people buying iMacs (which would be very relevant in this topic). Basically, the reviews are not useful for the discussion here.

I feel like the problem isn't so much your "wording" as it is your complete dismissal of the vast majority of professional reviews as "gaming reviews" for a "gaming CPU." Just because they don't cater to your specific niche doesn't mean that most sites didn't include a vast amount of non-gaming content (productivity and creative benchmarks in particular) that (along with the gaming content) is VERY relevant and "useful" to the discussion of what chip should go in the next iMacs.

If you honestly think Linux VM benchmarks are more important to the core iMac user base than benchmarks representing the creative fields and general use, I think you might just be a bit out of touch. I'm not saying there isn't a large community of developers using iMacs, but I would suggest that the community might not be as large, relative to the overall iMac install base, as you think it is.

Btw, the people who looked into virtualisation found the same thing as before: you need to stick with Intel if you don't want to run into all sorts of issues. Which is to be expected, since the virtualisation world has been built around Intel for ages.
So basically we should stick with Intel because the virtualization providers are too lazy to ensure their code plays nice with anything other than Intel, thus perpetuating Intel's monopoly in the field...
 
  • Like
Reactions: jrichards1408

dyn

macrumors 68030
Aug 8, 2009
2,708
388
.nl
I feel like you extrapolate from very specific edge cases, leading you to overestimate the actual challenges of getting macOS, Ryzen, and virtualization working well together inside Apple's relatively small ecosystem (Docker, for example, seems to work off of macOS's Hypervisor framework, so as long as Apple makes sure the framework is compatible, this should work).
That's because you are still taking things way too literally and thinking way too lightly about changing CPUs. I gave some examples of why it is too simplistic to think that you can simply put in a CPU from a different manufacturer even though you are using the same platform. There are far more cases that Apple has to account for and check whether they work or not. One does not simply design a computer overnight, which is what you seem to be thinking here.

The other problem here is that you lack actual knowledge, which can be seen in the above quote concerning Docker. Apple has to work together with other parties on this to make sure things work. In this case it's also an issue for Docker for Mac, because they have to make sure their product works as well (in fact, Docker is Linux-only; what they do is spin up a very small VM running something like Alpine Linux with Docker on it, plus lots of proxies to pass networking and such between macOS, Docker, and the containers).
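For the curious, the framework in question is easy to poke at. Here is a hedged sketch (assuming macOS and Python's ctypes) that probes Hypervisor.framework, the layer that Docker for Mac's bundled Linux VM is built on top of; a non-zero return code is a perfectly possible outcome, for example if the process lacks the hypervisor entitlement or hardware support:

```python
# Sketch: probe macOS's Hypervisor.framework from Python via ctypes.
# hv_vm_create()/hv_vm_destroy() are the framework's C entry points; the
# create call may legitimately fail (missing entitlement, no hardware
# virtualization support), which is itself informative.
import ctypes

HV_SUCCESS = 0
hv = ctypes.CDLL("/System/Library/Frameworks/Hypervisor.framework/Hypervisor")

ret = hv.hv_vm_create(0)  # 0 == HV_VM_DEFAULT
if ret == HV_SUCCESS:
    print("Hypervisor.framework VM created")
    hv.hv_vm_destroy()
else:
    print(f"hv_vm_create returned {ret:#x}")
```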

I also think that Apple putting Ryzen in even part of its product lineup would put a lot of pressure on (Mac) virtualization providers to get things working right.
There are too many alternatives for those providers. Or, put differently: the Mac is still a niche compared to Windows.

(Side note: given that they culled the majority of the development staff last year, if you use VMware Fusion extensively on a Mac, I think you should be concerned even if Apple sticks exclusively to Intel.)
You need to do your homework, because that is false information; stop spreading nonsense like this, please. They did NOT cull the majority of the development staff. What VMware did was fire the GUI team for Fusion and Workstation AND move that development to other teams, mainly in China. The entire team that develops the internal workings that do all the magic (aka the hypervisor) has NOT been fired and is still working on it! If VMware fired that team they would go bankrupt, because that team works on their core business.

Apple builds iMacs to appeal to a wide range of people in the consumer, prosumer, and pro spaces. The iMac is a tool for everything from simple home use and office tasks to creative work (photo/video/music), development, and science.
Which was my point. Most of the people here only see their own workload and think everyone does the same thing they do. They don't. Apple has to account for an awful lot of different kinds of workloads, which is what makes things a lot more complex. Don't underestimate pro use of the iMac: Phil Schiller made a public statement that a lot of pros are using the iMac now, which has led to Apple taking this model more seriously (the Pro version they hinted at).

That said, I think the area where we can perhaps agree is that if Apple does decide to offer Ryzen in a Mac, it will be (or at the very least should be) because they are confident that they can do it right. They will have been working closely with AMD on this for a while, and will have made sure that, for the vast majority of users, the underlying x86 CPU is completely transparent.
Exactly, but that won't happen with the current generation (which is Ryzen); it will be one or two generations after that. You are also forgetting the entire production and infrastructure side here. Intel has an enormous capacity for manufacturing lots of chips and transporting them. Is AMD capable of doing that, so they can keep up with Apple's demand? If they can't, and users have to wait on their orders, there will be lots of backlash.

I feel like the problem isn't so much your "wording" as it is your complete dismissal of the vast majority of professional reviews as "gaming reviews" for a "gaming CPU." Just because they don't cater to your specific niche doesn't mean that most sites didn't include a vast amount of non-gaming content (productivity and creative benchmarks in particular) that (along with the gaming content) is VERY relevant and "useful" to the discussion of what chip should go in the next iMacs.
That was my point concerning gaming: just because the reviewer loves gaming doesn't mean that this should be the primary target of the review, with the majority of the benchmarks being gaming benchmarks. I'm missing an overall view of the CPU; if you want that, you have to combine several reviews.

If you honestly think Linux VM benchmarks are more important to the core iMac user base than benchmarks representing the creative fields and general use, I think you might just be a bit out of touch. I'm not saying there isn't a large community of developers using iMacs, but I would suggest that the community might not be as large, relative to the overall iMac install base, as you think it is.
Can you stop taking things 100% literally and start thinking about what someone is saying? It is rather annoying that I have to spell things out one by one before you grasp the concept. How many times do I need to tell you that these are just some examples of other workloads, which you so easily dismiss but Apple can't, because they want to take their pro users seriously? Why do reviews have to be so biased? Why can't they be more generic, so anyone can get a good view of what something does and how it performs?

So basically we should stick with Intel because the virtualization providers are too lazy to ensure their code plays nice with anything other than Intel, thus perpetuating Intel's monopoly in the field...
This is something an AMD fanboy would say. What makes you think the issue is with the hypervisors and not with AMD? Let me remind you of software such as IE6. Just because a site didn't work in that browser doesn't mean the developer of that site was lazy; in that case it was IE6 making an utter mess of web standards. Unless you absolutely and fully understand hypervisors, you can't go and point your finger at anyone. End users won't do any of that either; for them it is a simple matter of "does it work or not". Hypervisors use a lot of hardware acceleration, which in turn means several instruction sets in the CPU. If AMD can't offer something similar, or anything that makes virtualisation equally or more efficient than on Intel hardware, then this is clearly going to be an issue. Not for the hypervisor manufacturers, but for the end users, who in turn will be looking for non-AMD hardware.

Before you reply, please take a moment and THINK about what I'm writing here. It is not rocket science, but it isn't a walk in the park either. I would love to see Apple also using AMD CPUs, even a mix of Intel and AMD CPUs, but let's be realistic here. It's the same with those ARM rumours.
 

Malus120

macrumors 6502a
Jun 28, 2002
679
1,412
That's because you are still taking things way too literally and thinking way too lightly about changing CPUs. I gave some examples of why it is too simplistic to think that you can simply put in a CPU from a different manufacturer even though you are using the same platform. There are far more cases that Apple has to account for and check whether they work or not. One does not simply design a computer overnight, which is what you seem to be thinking here.

I'm sorry, but I have to disagree. You're the one being too literal. You extrapolate a few edge-case examples (some of which aren't even from macOS and would mostly apply to enterprise-class hardware) to try and make it look as if the sky is falling regarding compatibility between different x86-64 CPUs.

Is it a drop-in update if you want everything to work perfectly? Of course not. But the fact that you can run Windows 7, an OS that stopped receiving support for new CPUs a long time ago, on Ryzen with relatively few issues speaks to the broader compatibility of CPUs in the x86 family.

Oh, and where did I say or imply that Apple would just "design a computer overnight?" If (and it's certainly still an if) Ryzen ends up in the next iMacs, it will be because Apple has been working closely with AMD since well before the commercial release of Ryzen. Given the obvious and increasingly close working relationship between Apple and AMD on the GPU side, and AMD's penchant for developing custom solutions for partners, I don't think this is at all outside the realm of possibility.

You need to do your homework, because that is false information; stop spreading nonsense like this, please. They did NOT cull the majority of the development staff. What VMware did was fire the GUI team for Fusion and Workstation AND move that development to other teams, mainly in China. The entire team that develops the internal workings that do all the magic (aka the hypervisor) has NOT been fired and is still working on it! If VMware fired that team they would go bankrupt, because that team works on their core business.

Of course they didn't lay off the hypervisor developers at the core of their business. Still, if you don't think their decision to fire and outsource a large portion of the Fusion team (even if it is "only" GUI) is a potential cause for concern, I think you're fooling yourself. Maybe I'm wrong, and the outsourced teams will do an even better job, but I've worked in IT, and I've seen the damage that can be done when even a portion of development is outsourced.

Which was my point. Most of the people here only see their own workload and think everyone does the same thing as they do. They don't. Apple has to account for an awful lot of different kind of workloads which is what makes things a lot more complex. Don't underestimate pro use of the iMac, Phil Schiller made a public statement that a lot of pro's are using the iMac now which has led to Apple now taking this model more seriously (the Pro version which they hinted on).

Not to be rude, but this feels like a case of the pot calling the kettle black. You're acting like VM-based development and server usage are the ONLY TRUE PRO workloads, and that people working in photography, video, music, design, advertising, CAD, scientific applications, etc. aren't pros, and that what would be good for their use cases doesn't matter. I'm not underestimating pro use of the iMac; if anything, I feel like you're underestimating what constitutes "pro" usage, and which use cases and users really drove Apple to recommit to the Mac Pro.

Exactly, but that won't happen with the current generation (which is Ryzen); it will be one or two generations after that. You are also forgetting the entire production and infrastructure side here. Intel has an enormous capacity for manufacturing lots of chips and transporting them. Is AMD capable of doing that, so they can keep up with Apple's demand? If they can't, and users have to wait on their orders, there will be lots of backlash.

Again, Apple choosing to use AMD would be predicated on Apple having an understanding with AMD built over the development lifecycle of the Ryzen chip. That said, given the (relatively small) volumes Apple ships in (not to mention Apple's clout with the major chip foundries), I don't think AMD would have a problem keeping up on the manufacturing and supply side.

That was my point concerning gaming: just because the reviewer loves gaming doesn't mean that this should be the primary target of the review, with the majority of the benchmarks being gaming benchmarks. I'm missing an overall view of the CPU; if you want that, you have to combine several reviews.
This is just not true. Many reviews (for example, TechPowerUp and TweakTown) were split fairly evenly between general-purpose computing, creative application use (encoding/rendering), and gaming, while others, such as AnandTech, didn't even feature gaming in their initial review. Now, it's true that many reviews did have a bit of a blind spot with regard to complex virtualization, but just because they're missing one niche doesn't mean they don't offer a good overall view of the CPU.

Can you stop taking things 100% literally and start thinking about what someone is saying? It is rather annoying that I have to spell things out one by one before you grasp the concept. How many times do I need to tell you that these are just some examples of other workloads, which you so easily dismiss but Apple can't, because they want to take their pro users seriously? Why do reviews have to be so biased? Why can't they be more generic, so anyone can get a good view of what something does and how it performs?

I have tried to be thoughtful about what you're saying. You obviously have some in-depth knowledge of development and virtual-machine-dependent workloads, and I respect that. That said, pro ≠ VM-dependent development. Most of the pro users clamoring for a redesigned Mac Pro (and a beefed-up iMac) are creative professionals working in fields that can be significantly accelerated with access to faster GPUs (hence the angst over the lack of access to Nvidia's Maxwell and Pascal architectures, CUDA, and anything from AMD faster than a 7970).

This is something an AMD fanboy would say. What makes you think the issue is with the hypervisors and not with AMD? Let me remind you of software such as IE6. Just because a site didn't work in that browser doesn't mean the developer of that site was lazy; in that case it was IE6 making an utter mess of web standards. Unless you absolutely and fully understand hypervisors, you can't go and point your finger at anyone. End users won't do any of that either; for them it is a simple matter of "does it work or not". Hypervisors use a lot of hardware acceleration, which in turn means several instruction sets in the CPU. If AMD can't offer something similar, or anything that makes virtualisation equally or more efficient than on Intel hardware, then this is clearly going to be an issue. Not for the hypervisor manufacturers, but for the end users, who in turn will be looking for non-AMD hardware.

I'm so glad you brought up IE6. It's such a great example of EXACTLY what I was talking about. As you said, IE6 was a terrible browser that made a mess of web standards. And yet, because of its large, dominant market share, many websites just lazily tuned their code specifically for IE6, wreaking havoc on (and often just breaking those sites in) all of the browsers that were actually standards-compliant and light years ahead of IE6. The funny thing is, this is actually still a problem in some countries (like South Korea), where a lot of websites literally still won't work properly in any browser that's not Internet Explorer.

Now, let me be clear. I'm not saying that's (necessarily) what's happening with hypervisors today. It certainly COULD be a problem with AMD's implementation. I'm simply arguing that it's not outside the realm of possibility that a decade of increasing dominance by Intel has led to a situation where a lot of hypervisor code has become increasingly tuned to the peculiarities of Intel's implementation of x86-64, to the detriment of AMD. IF that is the case, then yes, it is on those hypervisor developers to actually take the time to write some vendor-agnostic x86-64 code (just as the onus was on web developers to begin writing browser-agnostic, standards-compliant code).
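To make the "vendor-agnostic" point concrete, capability detection is usually the right pattern: key off the actual hardware-virtualization feature flag rather than the vendor string. A minimal sketch, assuming a Linux-style /proc/cpuinfo (on Intel the flag is "vmx" for VT-x; on AMD it is "svm" for AMD-V):

```python
# Vendor-agnostic capability check: branch on the CPU feature flag
# ("vmx" = Intel VT-x, "svm" = AMD-V) instead of the vendor string.
# Assumes a Linux-style /proc/cpuinfo.
def hw_virt_flavor(path="/proc/cpuinfo"):
    with open(path) as f:
        for line in f:
            if line.startswith("flags"):
                flags = set(line.split(":", 1)[1].split())
                if "vmx" in flags:
                    return "Intel VT-x"
                if "svm" in flags:
                    return "AMD-V"
                return None
    return None

print(hw_virt_flavor())
```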

Before you reply, please take a moment and THINK about what I'm writing here. It is not rocket science, but it isn't a walk in the park either. I would love to see Apple also using AMD CPUs, even a mix of Intel and AMD CPUs, but let's be realistic here. It's the same with those ARM rumours.

I have taken (way too much) time to think about what you wrote. As I said before, you do bring up some valid points about certain use cases. But when you end your post by saying that Apple using AMD CPUs is "the same with those ARM rumours," that's just flat-out spreading misinformation. The fact is, you can get macOS working on AMD CPUs (including Ryzen) TODAY, and if some hackers with free time on their hands can do it, there's no reason why Apple can't do a far better job. If you'd like to point me to full-fat macOS with full binary compatibility working on ARM hardware, be my guest, but I'm not holding my breath.
 
Last edited:

Zeblade

macrumors newbie
Mar 12, 2010
2
0
1st... I am no one, I don't know anything, lol. Pffft. When you get into bed with someone, it's not that easy to get out. Microsoft already said there is a problem with Windows and Ryzen. Gee, could that have some effect on single-core performance?

Yes yes yes, blah blah blah, we're all experts on Intel & AMD. We've had Intel for how long now? DUH, nothing is better, lol. No way are some people going to see anything other than Intel. Intel just sat back and made little changes year after year, and THAT'S OK! We say nothing; we don't expect anything for our money. I digress.

Lol... Ryzen just came out, and some of you know it: Intel just sat up straight.
 

dyn

macrumors 68030
Aug 8, 2009
2,708
388
.nl
Oh, and where did I say or imply that Apple would just "design a computer overnight?"
In each and every one of your replies in this topic. Things aren't as simple as you think (see below for a further explanation).

Of course they didn't lay off the hypervisor development at the core of their business. Still, if you don't think their decision to fire and outsource a large portion of the fusion team (even if it is "only" GUI) is a potential cause for concern, I think you're fooling yourself. Maybe I'm wrong, and the outsourced teams will do an even better job, but I've worked in IT, and I've seen the damage that can be done when even a portion of development is outsourced.
It's not about wrong or right but about knowing the details and acting on facts instead of emotion. You are acting purely on emotion and overreacting. What you are forgetting here is that 90% of the code they write for Fusion and Workstation didn't come from the team that got replaced but from their core developers, who didn't go anywhere. You are also forgetting that the virtualisation landscape has changed considerably; we now have fierce competition.

Basically that means two things: VMware can't afford to mess things up, and if they do mess things up there are many alternatives. Therefore there is no reason whatsoever to be concerned. We've seen it in reality too, with lots of people jumping ship between the various products. Not to mention that people/companies often just buy something and stick with it.

Not to be rude, but this feels like a case of the pot calling the kettle black. You're acting like VM-based development and server usage are the ONLY TRUE PRO workloads, and that people working in photography, video, music, design, advertising, CAD, scientific applications, etc. aren't pros, and that what would be good for their use cases doesn't matter. I'm not underestimating pro use of the iMac; if anything, I feel like you're underestimating what constitutes "pro" usage, and which use cases and users really drove Apple to recommit to the Mac Pro.
Not to be rude, but you suck at discussions. You fail to understand basic principles such as what examples are, why people use them, and why you should never take things literally, as well as how to summarise your points rather than writing down every possible item in a list. In this case you can't even do that, because the list of workloads pros have is endless. That's why you pick one or two examples and use those.

This is not about being 100% correct by listing every possibility known to man. A discussion is about getting your point across, and for that you really do not need 100% correctness, 100% accuracy, or 10,000 words per reply.

Learn the rules of discussion and abide by them; that way people won't get annoyed or offended by you.

That said, given the (relatively small) volumes Apple ships in (not to mention Apple's clout with the major chip foundries), I don't think AMD would have a problem keeping up on the manufacturing and supply side.
Unfortunately the facts paint a different picture.

Many reviews (for example, TechPowerUp and TweakTown) were split fairly evenly between general-purpose computing, creative application use (encoding/rendering), and gaming, while others, such as AnandTech, didn't even feature gaming in their initial review.
I wouldn't call that "evenly". It is mostly listing benchmarks and other theoretical stuff; they don't go and explore various workloads.

I have tried to be thoughtful about what you're saying. You obviously have some in-depth knowledge of development and virtual-machine-dependent workloads, and I respect that.
I'm sorry to say that you really did no such thing, and you still aren't doing it, because you are so incredibly pinned down on virtualisation and development. I don't know how many times I have to say it, but these are just two examples of the workloads people use iMacs for.

Most of the pro users clamoring for a redesigned Mac Pro (and a beefed-up iMac) are creative professionals working in fields that can be significantly accelerated with access to faster GPUs (hence the angst over the lack of access to Nvidia's Maxwell and Pascal architectures, CUDA, and anything from AMD faster than a 7970).
That's not true, and it is a complete disservice to pros in general. Apple has been aiming at those creative professionals, and from the sales figures alone one can tell that they did a great job. They didn't do a great job for all the other kinds of professionals, as Phil and Craig described in the interview. The issue isn't faster GPUs but a whole lot more. For most of the users here, the only thing that matters is upgradability. Funnily enough, the older Mac Pros never really were very upgradable to begin with. Yes, you could insert a PCIe card, but it was always a big question whether it would actually work in OS X and, if it did, whether it would work properly. Even back then people were complaining about upgradability, so that discussion was never new.

Now, let me be clear. I'm not saying that's (necessarily) what's happening with hypervisors today.
Actually, you did say that; you didn't even think about the fact that there could be other possibilities, let alone multiple ones. That's why I mentioned IE6.

But when you end your post by saying that Apple using AMD CPUs is "the same with those ARM rumours," that's just flat-out spreading misinformation.
There is no misinformation; you simply didn't understand it. Just because you fail to understand something and/or someone doesn't make it misinformation. In fact, what I said there is an undeniable fact. For a switch to a different manufacturer, or even a completely different architecture, that manufacturer or architecture needs to have benefits over what they are currently using. Or in other words: why would Apple switch to AMD or ARM (for their Macs) when their current partner can do the same or something similar? We are not just talking performance here; it's about other things like contracts, relationships, production, supply, pricing, and probably a plethora of other things I'm not mentioning here. It is perfectly possible for Apple to switch to AMD as well as ARM (for their Macs), but it isn't very likely that they will.

The fact is, you can get macOS working on AMD CPUs (including Ryzen) TODAY, and if some hackers with free time on their hands can do it, there's no reason why Apple can't do a far better job. If you'd like to point me to full-fat macOS with full binary compatibility working on ARM hardware, be my guest, but I'm not holding my breath.
It isn't about getting macOS working on it ;) They probably already have it working in their labs. It isn't uncommon for manufacturers to do things like that; Microsoft does so too. It actually is vital to do research into these things. That research doesn't start and end with getting the OS running on the hardware; it's also about researching what the hardware can and cannot be used for. By doing research like that you gain knowledge, which helps you decide whether you should use it and, if you are going to use it, for what.
 

cynics

macrumors G4
Jan 8, 2012
11,959
2,155
Just to touch on AMD working with macOS: I'm very confident Apple has already done it. During the PowerPC days they had OS X working on Intel. I think that is a non-issue.

It's coming along rather well with Linux kernels, too.
 

Malus120

macrumors 6502a
Jun 28, 2002
679
1,412
In each and every one of your replies in this topic. Things aren't as simple as you think (see below for a further explanation).

Thanks for putting words into my mouth. Not what I said, but I guess when all you have is a hammer, everything looks like a nail.

It's not about wrong or right but about knowing the details and acting on facts instead of emotion. You are acting purely on emotion and overreacting. What you are forgetting here is that 90% of the code they write for Fusion and Workstation didn't come from the team that got replaced but from their core developers, who didn't go anywhere. You are also forgetting that the virtualisation landscape has changed considerably; we now have fierce competition.

Basically that means two things: VMware can't afford to mess things up, and if they do mess things up there are many alternatives. Therefore there is no reason whatsoever to be concerned. We've seen it in reality too, with lots of people jumping ship between the various products. Not to mention that people/companies often just buy something and stick with it.

I've worked with VMware in a professional capacity (at a partner firm, not a client), so as I said before, I'm well aware of both the effort they put into their enterprise products and the degree to which the virtualization landscape has become competitive (it certainly made my job more interesting). My concern for VMware Fusion isn't about emotion, it's about experience. Working at a large firm that develops enterprise hardware and software, I've seen the challenges that outsourcing even something as simple as the GUI can bring. Am I saying VMware Fusion is done for? Of course not, but last year's update was certainly fairly minor compared to their usual annual updates, and that doesn't inspire a lot of confidence. Still, this is just my personal opinion.

Not to be rude, but you suck at discussions. You fail to understand basic principles such as what examples are, why people use them, and why you should never take things literally, as well as how to summarise your points rather than writing down every possible item in a list. In this case you can't even do that, because the list of workloads pros have is endless. That's why you pick one or two examples and use those.

This is not about being 100% correct by listing every possibility known to man. A discussion is about getting your point across, and for that you really do not need 100% correctness, 100% accuracy, or 10,000 words per reply.

Learn the rules of discussion and abide by them; that way people won't get annoyed or offended by you.

Someone who isn't rude and understands how to have a civilized debate wouldn't begin a sentence with "not to be rude but you suck at discussions..." Perhaps you should reflect on your own behavior before criticizing others. I've tried to be clear, if not concise, in explaining my position without attacking you; if you can't do that, then I'm not sure we have anything more to discuss.

I wouldn't call that "evenly". It is mostly listing benchmarks and other theoretical stuff; they don't go and explore various workloads.

As I've said, I can appreciate that we have a difference of opinion here.

I'm sorry to say that you really did no such thing, and you still aren't doing it, because you are so incredibly pinned down on virtualisation and development. I don't know how many times I have to say it, but these are just two examples of the workloads people use iMacs for.

That's what I've been saying, so thanks for admitting I'm right. If you'd like to point to any other workloads that have issues on Ryzen, be my guest.

That's not true, and it is a complete disservice to pros in general. Apple has been aiming at those creative professionals, and from the sales figures alone one can tell that they did a great job. They didn't do a great job for all the other kinds of professionals, as Phil and Craig described in the interview. The issue isn't faster GPUs but a whole lot more. For most of the users here, the only thing that matters is upgradability. Funnily enough, the older Mac Pros never really were very upgradable to begin with. Yes, you could insert a PCIe card, but it was always a big question whether it would actually work in OS X and, if it did, whether it would work properly. Even back then people were complaining about upgradability, so that discussion was never new.

I don't know if you just misunderstood what was said, are deliberately exaggerating what was said, or didn't actually bother to read the transcript. Whichever it is, you're just flat-out wrong on this one. Here's a link to the full transcript
(https://techcrunch.com/2017/04/06/t...-john-ternus-on-the-state-of-apples-pro-macs/), but here are some choice (abridged) quotes.

"John Ternus: I think one of the foundations of that system was the dual GPU architecture. And for certain workflows, certain classes of pro customers, that’s a great solution. But....The way the system is architected, it just doesn’t lend itself to significant reconfiguration for somebody who might want a different combination of GPUs. That’s when we realized we had to take a step back and completely re-architect what we’re doing and build something that enables us to do these quick, regular updates and keep it current and keep it state of the art, and also allow a little more in terms of adaptability to the different needs of the different pro customers. For certain classes a single bigger GPU would actually make more sense."

"Craig Federighi: I think we designed ourselves into a bit of a thermal corner, if you will. We designed a system that we thought with the kind of GPUs that at the time we thought we needed, and that we thought we could well serve with a two GPU architecture… that that was the thermal limit we needed, or the thermal capacity we needed. But workloads didn’t materialize to fit that as broadly as we hoped. Being able to put larger single GPUs required a different system architecture and more thermal capacity than that system was designed to accommodate. So it became fairly difficult to adjust."

Those are just two quotes, but the word GPU comes up 22 times in the transcript, and it's clear, if you read it in its entirety, that the ability to accommodate faster (hotter) single GPUs was the primary factor behind the redesign.
(Note: none of this is to say they didn't call out software developers as one of their fastest-growing and possibly largest pro groups, or that this group isn't important to them. I'm simply pointing out that this group was not the key group prompting them to rethink the nMP.)

Actually you did say that, you didn't even think about the fact that there could be other possibilities, let alone multiple ones. That's why I mentioned IE6.

I like how you failed to think through the logic of an example you yourself threw out there, and then, when I poked holes in it, you just reverted to attacking me. Classy.

There is no misinformation; you simply didn't understand it. Just because you fail to understand something and/or someone doesn't make it misinformation. In fact, what I said there is an undeniable fact. For a switch to a different manufacturer, or even a completely different architecture, that manufacturer or architecture needs to have benefits over what they are currently using. Or in other words: why would Apple switch to AMD or ARM (for their Macs) when their current partner can do the same or something similar? We are not just talking performance here; it's about other things like contracts, relationships, production, supply, pricing, and probably a plethora of other things I'm not mentioning here. It is perfectly possible for Apple to switch to AMD as well as ARM (for their Macs), but it isn't very likely that they will.

No, you're obviously the one who doesn't understand. If you can't understand how switching between Intel and AMD (both x86-64) differs from switching between x86-64 and ARM (completely different architectures), then I'm not sure we can have a productive discussion until you educate yourself. Switching to ARM would be a monumentally more difficult task (not to mention much harder to reverse) for Apple than deciding to utilize AMD CPUs in their machines. The benefits of what AMD is offering have already been discussed; I'm not going to rehash them for you again here. That said, Apple regularly switched between Motorola and IBM back in the PowerPC days (with no adverse effects, I might add), so IMO your opinion on the great difficulty of using different manufacturers' implementations of the same fundamental architecture just doesn't hold water.

It isn't about getting macOS working on it ;) They probably already have it working in their labs. It isn't uncommon for manufacturers to do things like that; Microsoft does so too. It actually is vital to do research into these things. That research doesn't start and end with getting the OS running on the hardware; it's also about researching what the hardware can and cannot be used for. By doing research like that you gain knowledge, which helps you decide whether you should use it and, if you are going to use it, for what.
One of the few things I think we can (mostly) agree on, although based on the rest of what you wrote, your logical process in reaching this conclusion is very different from my own.

I appreciate the contributions you've made to this thread, and I've enjoyed having this discussion with you. That said, I honestly don't think we're going to be able to see eye to eye on this. I think we've both said what we wanted to say, although if you want to stop making personal attacks, I'd be happy to discuss this issue further with you.

But that's probably not what you want, and unless you demonstrate otherwise, I think it's time we stop filling this thread with essay-length replies.
 
Last edited:

dyn

macrumors 68030
Aug 8, 2009
2,708
388
.nl
Thanks for putting words into my mouth. Not what I said, but I guess when all you have is a hammer, everything looks like a nail.
Hey, you started it.

Still, this is just my personal opinion.
Which is fine, but that doesn't mean others have to agree with you on it. I'm not too worried, because I know that we engineers are perfectly capable of moving from one piece of software to another. After all, it's what we do; it's our job.

Someone who isn't rude and understands how to have a civilized debate wouldn't begin a sentence with "not to be rude but you suck at discussions..." Perhaps you should reflect on your own behavior before criticizing others. I've tried to be clear, if not concise, in explaining my position without attacking you; if you can't do that, then I'm not sure we have anything more to discuss.
Practice what you preach. I only gave you a piece of your own pie to make you understand that even though you claimed not to be rude, you actually were. That is usually the case when people say "not to be rude". Now that you know how it feels, I'm sure you'll refrain from doing it again.

I don't know if you just misunderstood what was said, are deliberately exaggerating what was said, or didn't actually bother to read the transcript. Whichever it is, you're just flat-out wrong on this one. Here's a link to the full transcript
(https://techcrunch.com/2017/04/06/t...-john-ternus-on-the-state-of-apples-pro-macs/), but here are some choice (abridged) quotes.
Those quotes show that I'm correct and you are flat-out wrong. I'll highlight what the quotes are actually about:

"John Ternus: I think one of the foundations of that system was the dual GPU architecture. And for certain workflows, certain classes of pro customers, that’s a great solution. But....The way the system is architected, it just doesn’t lend itself to significant reconfiguration for somebody who might want a different combination of GPUs. That’s when we realized we had to take a step back and completely re-architect what we’re doing and build something that enables us to do these quick, regular updates and keep it current and keep it state of the art, and also allow a little more in terms of adaptability to the different needs of the different pro customers. For certain classes a single bigger GPU would actually make more sense."
The reason they said that is that not every workflow benefits from dual-GPU setups, and for those workflows there is a problem, because the Mac Pro comes in dual-GPU configurations only. You can't put in a different GPU that has more power but stays within the same thermal envelope as the dual-GPU configurations. That's what many professionals were complaining about, and it's why the iMac outperformed the Mac Pro in those areas (which caused many to buy the iMac instead).

"Craig Federighi: I think we designed ourselves into a bit of a thermal corner, if you will. We designed a system that we thought with the kind of GPUs that at the time we thought we needed, and that we thought we could well serve with a two GPU architecture… that that was the thermal limit we needed, or the thermal capacity we needed. But workloads didn’t materialize to fit that as broadly as we hoped. Being able to put larger single GPUs required a different system architecture and more thermal capacity than that system was designed to accommodate. So it became fairly difficult to adjust."

What you've highlighted here stresses that thermal envelope point.

Those are just two quotes, but the word GPU comes up 22 times in the transcript, and it's clear, if you read it in its entirety, that the ability to accommodate faster (hotter) single GPUs was the primary factor behind the redesign.
(Note: none of this is to say they didn't call out software developers as one of their fastest-growing and possibly largest pro groups, or that this group isn't important to them. I'm simply pointing out that this group was not the key group prompting them to rethink the nMP.)
That is what I was referring to, but you don't seem to understand the reasoning behind it, the technical part of it. It means that the Mac Pro had the power for workflows that benefit greatly from dual GPUs but lacked the power for workflows that don't. The people with those workflows were unhappy because, unlike with the previous model, they couldn't configure a single higher-end card to replace the dual GPUs (one that fits in the same thermal envelope).

Dual GPUs are what you want when you have a workflow with a lot of parallelism. Some pro users have that; many do not. The iMac matched the requirements of those people better, and thus many started buying that instead.

I like how you failed to think through the logic of an example you yourself threw out there, and then, when I poked holes in it, you just reverted to attacking me. Classy.
The only one doing that is you, and you have done so from the very first reply.

No, you're obviously the one who doesn't understand. If you can't understand how switching between Intel and AMD (both x86-64) differs from switching between x86-64 and ARM (completely different architectures), then I'm not sure we can have a productive discussion until you educate yourself.
What part of "We are not just talking performance here; it's about other things like contracts, relationships, production, supply, pricing, and probably a plethora of other things I'm not mentioning here." did you not understand? And what part of @Zeblade's reply did you not understand?

The problem isn't the technology itself because, unlike what you think, it is not that difficult to switch from one architecture to another. Apple has experience with it and has shown that they can do it quickly. The real problem is everything that has nothing to do with technology: the relationships, the legal stuff, and so on.

I'll repeat it again: it isn't about the technology, it's about all the other stuff.

That said, Apple regularly switched between Motorola and IBM back in the PowerPC days (with no adverse effects, I might add), so IMO your opinion on the great difficulty of using different manufacturers' implementations of the same fundamental architecture just doesn't hold water.
Talk about putting words into someone's mouth. I said the exact opposite, and I've repeated it above, but I'll do it again here: the technical part is peanuts; Apple has done it before and has shown they can do it quickly (the PowerPC switches you mentioned, though not the biggest one: the switch from PPC to x86/x86_64). The problem isn't technical (it almost never is); it is contracts, relationships, legal, etc.

Again, why would Apple give up their relationship with Intel, or add AMD as a secondary relationship? What benefits do they get from that?

If you want to stop making personal attacks, I'd be happy to discuss this issue further with you.
There is no such thing as a discussion with you; all I did was explain my first reply, and it seems I still have to do that. If you want a discussion, then please start by learning how to take part in one, and stop trolling and making continuous personal attacks.
 

0990848

Cancelled
Original poster
Mar 31, 2015
113
74
A core-for-core, thread-for-thread comparison of Ryzen vs. Kaby Lake. This is why Apple should stay far away from Ryzen.

 

Malus120

macrumors 6502a
Jun 28, 2002
679
1,412
A core-for-core, thread-for-thread comparison of Ryzen vs. Kaby Lake. This is why Apple should stay far away from Ryzen.


Uhhhhh, no offense, but did you actually watch the video you posted all the way through?

Around the 5-minute mark he begins talking about what happens when you overclock the 1500X to 4GHz across all cores (bringing it up to a similar clock speed as the 7700K, and creating a fairer core-for-core, thread-for-thread comparison), and he pointedly notes that at that point the 1500X basically offers >95% of the 7700K's performance for around 3/5 the price. Heck, in the conclusion, the reviewer actually recommends the 1500X over the 7700K, so I don't know what exactly you're trying to prove here.

Of course, this video is just restating what all the other reviews have already noted, which is: while Kaby Lake retains a slight IPC advantage (~5%), this only really matters when there is a large frequency delta (as is the case with the 7700K's 4.5GHz boost frequency or ~5GHz overclocks vs. a stock 1500X), and when Ryzen either doesn't have additional cores (in the case of the 1500X) or the workload can't take advantage of them (1600/1700/1800).
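The arithmetic behind that claim is simple enough to sketch: treating single-thread performance as IPC times clock, an assumed ~5% IPC edge only opens a real gap when it is stacked on top of a large clock delta. Illustrative numbers, not benchmark results:

```python
# Back-of-the-envelope: single-thread perf ~ IPC * clock.
# The 5% IPC edge and the clock speeds are illustrative assumptions.
def rel_perf(ipc_ratio, clock_a_ghz, clock_b_ghz):
    return ipc_ratio * (clock_a_ghz / clock_b_ghz)

print(rel_perf(1.05, 4.5, 3.7))  # ~1.28: 7700K at boost vs. a stock-ish 1500X
print(rel_perf(1.05, 4.0, 4.0))  # 1.05: near-parity once clocks match
```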

Perhaps it'd be better if you just came out and said what you wanted to say (or at least linked to an article saying it) instead of asking everyone to watch a 7-minute video (and while I may or may not agree with you, I don't need "video evidence" either way).

For example, if you prefer an ultra-high-frequency (4.2~4.5GHz) 4-core/8-thread chip over an 8-core/16-thread chip with lower frequencies (3.3~4.0GHz), that's a perfectly valid preference (and for some workloads it makes sense), albeit not one I share.
 

bigtomato

macrumors regular
Feb 28, 2015
210
156
AMD is too late to the game for Apple to switch... they don't make CPUs that run in lower-power modes, unfortunately, and according to rumours that won't happen with AMD until the second half of the year. Apple also gets some chips from Intel for the iPhone, so there's no way AMD can offer what Intel offers. Ask yourself whether you really need a MacBook. All apps are being created for iOS, so it makes sense to buy a Windows machine (an HP x360 15" with pen/tablet support) and just keep an iPad around for your existing apps... the MacBook is dead!!!
 

dyn

macrumors 68030
Aug 8, 2009
2,708
388
.nl
Around the 5-minute mark he begins talking about what happens when you overclock the 1500X to 4GHz across all cores...
Now re-read what you just typed there and then think of any computer sold by Apple that was overclocked. Do the same for companies like Lenovo, Dell and HP. Don't worry if you can't think of any, because none of them ever sold anything like that. They are not likely to ever sell overclocked machines either, because their customers want either cheap or reliable.

Heck, in the conclusion, the reviewer actually recommends the 1500X over the 7700K, so I don't know what exactly you're trying to prove here.
The point is that you don't have to go through all sorts of trickery to get a certain level of performance. The only strong argument against him posting that video is from @cube. For a CPU costing almost twice as much you'd expect a whole lot more (and luckily that's also what you are getting).

Be realistic about it.
 

cube

Suspended
May 10, 2004
17,011
4,972
The video is about the 7700, not the 7700K, and the 7700 is quite a good match for an easy 1500X overclock.

So $305 instead of $340. Still over 50% more than the 1500X.
 

Malus120

macrumors 6502a
Jun 28, 2002
679
1,412
Now re-read what you just typed there and then think of any computer sold by Apple that was overclocked. Do the same for companies like Lenovo, Dell and HP. Don't worry if you can't think of any, because none of them ever sold anything like that. They are not likely to ever sell overclocked machines either, because their customers want either cheap or reliable.

No need for me to reread anything. I know exactly what I wrote there. If you (re)read the rest of my post, I was in no way implying that a 1500X at stock is equal to a 7700K at stock, nor that Apple can, should, or will utilize non-stock frequencies for the CPUs in its products. I would in no way recommend that Apple replace the 6700K in its current lineup with a 1500X. I believe Apple should only put Ryzen in the iMacs if they can at least get a 1600 in there (and a 1700 would be preferable).

The post to which I was responding linked to a video (the conclusion of which didn't back him up) and then stated:
"A core-for-core, thread-for-thread comparison of Ryzen vs Kaby Lake. This is why Apple should stay far away from Ryzen."

I was merely pointing out that when you reduce the frequency discrepancy (to create a true "core for core, thread for thread" comparison), Ryzen actually does just fine (as the reviewer in the video points out!).

Now, I can already imagine your response: "but Ryzen doesn't come in a 4-core/8-thread 4GHz variant, so that's irrelevant in a comparison vs the 7700K at stock clocks." Luckily you've already posted a great lead-in to why I think it's relevant.

The point is that you don't have to go through all sorts of trickery to get a certain level of performance. The only strong argument against him posting that video is from @cube. For a CPU costing almost twice as much you'd expect a whole lot more (and luckily that's also what you are getting).

Be realistic about it.

As you yourself note, depending on the region, the list price of the 7700K is almost twice that of the 1500X (or more). So yeah, it'd damn well better offer better performance (at stock clocks). Does that make Ryzen irrelevant for those who want the best performance? No. It simply means that the 1500X isn't "that" chip in AMD's lineup.

The 1500X obviously isn't targeted directly at Intel's top "consumer" chip (although the fact that it gets as close as it does says a lot), and its pricing reflects that.

The correct comparison is to look at how it performs against the i5 6400/6500 (currently used in the stock-configuration 27" iMacs) and the i5 7400/7500 (the theoretical replacements for those stock chips), as the i5 7400 has the closest list price to the 1500X. Most reviews show the 1500X beating both the i5 7400 and 7500 handily in heavily threaded tests, and being about equal in less heavily threaded tasks.

If you want 7700K-crushing performance from Ryzen, you go with the 1600/1600X or 1700/1700X. Ryzen's real strength (and obviously its design focus) is being able to offer more cores/threads than the competition while offering similar IPC and a similar TDP.

While I think the 1500X would be a passable choice for the base iMac, the whole reason Ryzen is exciting is that it could finally free us from the 4-core/8-thread mediocrity Intel has been shoving down the desktop market's throat for the last 5 years.

After year after year of ~5-7% performance gains, the possibility of an iMac with 6 or even 8 cores that is anywhere between 25-80% faster than its predecessor (with the 6700K) is where the excitement over Ryzen comes from.
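How much of that shows up in practice depends on how parallel your workload is. Here's a quick Amdahl's-law sketch; the parallel fractions are assumptions for illustration:

Code:
# Amdahl's law: speedup(n) = 1 / ((1 - p) + p / n),
# where p is the fraction of the workload that parallelizes.
def speedup(p: float, cores: int) -> float:
    return 1.0 / ((1.0 - p) + p / cores)

for p in (0.5, 0.8, 0.95):
    gain = speedup(p, 8) / speedup(p, 4) - 1.0
    print(f"parallel fraction {p:.0%}: 8 cores beat 4 by {gain:.0%}")

That prints roughly 11%, 33% and 70%, which is why the 25-80% range spans well-threaded work (rendering, encoding) at the top end and lightly threaded work at the bottom.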
 

dyn

macrumors 68030
Aug 8, 2009
2,708
388
.nl
Do you want us to call the paramedics? Don't get all worked up over a video that shows the obvious still applies: the more expensive CPU is faster than the slightly lower-specced, cheaper one, which can only match the more expensive one's performance by overclocking. No need to type an entire book about it.
 

cube

Suspended
May 10, 2004
17,011
4,972
If it's baked into the silicon anyway, I hope Ryzen 3 will have SMT enabled.

It would be very interesting to see what a 1 CCX 4C/8T 8MiB L3 1300X can do.

I imagine it would not be able to sustain as high a clock as the 1400, or it would probably have to be called the 1400X (or the 1400 would have to be discontinued).
 

Chase R

macrumors 65816
May 8, 2008
1,279
81
PDX
People here are getting hung up on core-to-core count comparisons, which is a mistake, as it is not an apples-to-apples comparison. The proper comparison should be on a wattage and $ basis only; core count is irrelevant. PC users don't care whether a computation went through 2 cores or 32 cores; they care about how fast it finished and what it took (or cost) to get there. For comparison purposes, we should be thinking of CPUs as black boxes and only compare them on the basis of what we can see externally, which for all intents and purposes is wattage and price.

As an example, consider the 1600X vs the 7600K. They are each company's cheapest ~90-95W processor, at around the same ~$250 price point. The 1600X destroys the 7600K. Don't fall for the "well, it has more cores!" trap.

Other proper comparisons would be the 1700X vs the 7700K, and the 1800X vs the 5820K/6800K.
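The black-box comparison above is literally two divisions. A minimal sketch; the scores below are placeholders, not real benchmark numbers (the TDPs and list prices are the published ones):

Code:
# Treat each CPU as a black box: points per watt and points per dollar.
# (name, multi-thread score, TDP in W, USD list price) -- scores are placeholders.
cpus = [
    ("Ryzen 5 1600X", 1100, 95, 249),
    ("Core i5-7600K",  700, 91, 242),
]

for name, score, watts, price in cpus:
    print(f"{name}: {score / watts:.1f} pts/W, {score / price:.2f} pts/$")

Run whatever benchmark matches your workload, plug the scores in, and the wattage/price framing falls out directly.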
 

Sheza

macrumors 68020
Aug 14, 2010
2,083
1,802
Apple just isn't going to use an AMD CPU. They just won't. Doesn't matter what the advantages are.
 

Chase R

macrumors 65816
May 8, 2008
1,279
81
PDX
Apple just isn't going to use an AMD CPU. They just won't. Doesn't matter what the advantages are.

You're probably right. Apple has to look at these decisions on a very long-term basis. Intel is a safe bet: they aren't going anywhere and have a proven track record of continuous improvement. Not so much with AMD (yet). Perhaps a few years from now, after AMD has proved itself with future Zen iterations and improvements, Apple will consider using them.
 

cynics

macrumors G4
Jan 8, 2012
11,959
2,155
People here are getting hung up on core-to-core count comparisons, which is a mistake, as it is not an apples-to-apples comparison. The proper comparison should be on a wattage and $ basis only; core count is irrelevant. PC users don't care whether a computation went through 2 cores or 32 cores; they care about how fast it finished and what it took (or cost) to get there. For comparison purposes, we should be thinking of CPUs as black boxes and only compare them on the basis of what we can see externally, which for all intents and purposes is wattage and price.

As an example, consider the 1600X vs the 7600K. They are each company's cheapest ~90-95W processor, at around the same ~$250 price point. The 1600X destroys the 7600K. Don't fall for the "well, it has more cores!" trap.

Other proper comparisons would be the 1700X vs the 7700K, and the 1800X vs the 5820K/6800K.

People need to compare CPUs based on their own usage. There is NO clear all-around best.

Using your example as another example: the 1600X doesn't have integrated graphics, so if you are building an SFF HTPC, a Kaby Lake option may be a better bet. It's also cheaper once you factor a graphics card into a 1600X build of any type.

Or if you are doing a lot of video encoding: many programs can leverage Intel's QuickSync tech, and using that, the 7600K is nearly on par with the 1800X. To get similar performance out of Ryzen you'll need to spend a lot more money again.

[Attached: Handbrake benchmark chart]
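For a sense of what "leveraging QuickSync" looks like in practice, here's a minimal sketch driving ffmpeg's QSV encoder from Python. It assumes an ffmpeg build compiled with QSV support and an Intel CPU with an iGPU; the filenames are placeholders:

Code:
import subprocess

# Hardware-accelerated H.264 encode via Intel QuickSync, using ffmpeg's
# h264_qsv encoder. Requires an ffmpeg build with QSV support and an
# Intel iGPU; input.mp4/output.mp4 are placeholder names.
cmd = [
    "ffmpeg",
    "-i", "input.mp4",   # source file
    "-c:v", "h264_qsv",  # QuickSync encoder instead of software libx264
    "-b:v", "5M",        # target video bitrate
    "-c:a", "copy",      # pass the audio through untouched
    "output.mp4",
]
subprocess.run(cmd, check=True)

Handbrake exposes the same hardware path through its own QSV encoder option.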

Or if you plan on using Linux, it's a much safer bet to go with Intel. Although there is Ryzen support in many distros, AMD is nowhere near as hands-on as Intel, who go to great lengths working with the Linux community.

Or if you have Thunderbolt devices.....

There is no one-size-fits-all, and most people will need to be prepared to make sacrifices in at least one category: price, efficiency, heat, frequency, cores, etc.
 

Chase R

macrumors 65816
May 8, 2008
1,279
81
PDX
People need to compare CPUs based on their own usage. There is NO clear all-around best.

Using your example as another example: the 1600X doesn't have integrated graphics, so if you are building an SFF HTPC, a Kaby Lake option may be a better bet. It's also cheaper once you factor a graphics card into a 1600X build of any type.

Or if you are doing a lot of video encoding: many programs can leverage Intel's QuickSync tech, and using that, the 7600K is nearly on par with the 1800X. To get similar performance out of Ryzen you'll need to spend a lot more money again.

[Attached: Handbrake benchmark chart]

Or if you plan on using Linux, it's a much safer bet to go with Intel. Although there is Ryzen support in many distros, AMD is nowhere near as hands-on as Intel, who go to great lengths working with the Linux community.

Or if you have Thunderbolt devices.....

There is no one-size-fits-all, and most people will need to be prepared to make sacrifices in at least one category: price, efficiency, heat, frequency, cores, etc.

Very good points. I completely agree. I am looking forward to AMD's APUs due early 2018 for my Plex server/HTPC.
 