
D.T.

macrumors G4
Sep 15, 2011
11,050
12,460
Vilano Beach, FL
Hmm ...

Maybe a dev box that you hang off a TB3 port, so it runs ARM code natively and you compile to it (like any iOS device), but it uses the host computer's display/KB/etc., so you wouldn't have to juggle a whole extra set of peripherals on your desktop.
 

MandiMac

macrumors 65816
Feb 25, 2012
1,431
882
I think it would need to get pretty close. Unless Apple announces a new A-series chip at WWDC, the closest thing they're shipping is the A12X/Z. The A13 has newer but fewer cores. So it is a possibility, but I'd bet on the A12X/Z as closer to the overall design of whatever they'll use in macARM.
A power move would be to use the octa-core A12Z which should be plenty fast already, and then announce the first ARM Mac with 12 cores (octa-core performance, quad-core efficiency). Boom and bye, Intel.
 

gnomeisland

macrumors 65816
Original poster
Jul 30, 2008
1,094
829
New York, NY
A power move would be to use the octa-core A12Z which should be plenty fast already, and then announce the first ARM Mac with 12 cores (octa-core performance, quad-core efficiency). Boom and bye, Intel.
Yeah, I see that as likely. For desktop processors I think we'll see two low-power cores and many more performance cores. Apple's A-series uses ~1.8W for a performance core. If Apple stays with similar thermal requirements they can add a lot more cores to their computers.
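Back-of-envelope, that power budget goes a long way. A minimal sketch, where the ~1.8W per-core figure is the one above and the 45W envelope is purely a hypothetical desktop-class budget, not an Apple spec:

```swift
// Back-of-envelope core budget. Both numbers are assumptions:
// ~1.8 W per performance core (the figure quoted above) and a
// hypothetical 45 W sustained CPU envelope for a desktop-class Mac.
let wattsPerPerfCore = 1.8
let thermalEnvelope = 45.0
let perfCores = Int(thermalEnvelope / wattsPerPerfCore)
print("\(perfCores) performance cores fit in \(thermalEnvelope) W")
// → 25 performance cores fit in 45.0 W
```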
 

casperes1996

macrumors 604
Jan 26, 2014
7,512
5,680
Horsens, Denmark
Your point seemed to me to be that Apple is already doing CPU emulation, and would extend this to do bi-directional CPU emulation. But they aren't doing CPU emulation, and anyone who's had to use it knows it's essentially the "lipstick on a pig" / "polished turd" of compatibility solutions.

I think perhaps I made my point poorly then. What I was getting at was more that I don't necessarily think it would be an insane amount of effort for Apple to do, not that they're already doing it. All I used in my earlier demonstration was open source, and while it doesn't have the ability to translate all instructions, Apple could use that as a starting point and push changes upstream if they wanted a kickstart.
And may I ask what experience you've had with CPU-arch translation to say it's like lipstick on a pig? Have you perhaps used emulated game consoles? Perhaps run PS1 games on a PS2? I don't know about you, but that seemed pretty alright to me :) - I'd say it's more about the execution than the idea.
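To give a feel for what translation actually involves, here's a toy sketch of the interpreter end of the idea. The two-instruction "guest ISA" is entirely made up, and real translators like QEMU JIT-compile host code rather than interpreting, so treat it as an illustration of the shape of the problem, nothing more:

```swift
// A toy flavour of user-mode translation: decode guest instructions
// and map each onto equivalent host behaviour. Real translators
// (QEMU and the like) JIT-compile blocks of native code instead of
// interpreting one op at a time; this just shows the shape of it.
enum GuestOp {
    case mov(dst: Int, imm: Int64)           // dst = imm
    case add(dst: Int, src1: Int, src2: Int) // dst = src1 + src2
}

struct GuestCPU {
    var regs = [Int64](repeating: 0, count: 16)  // guest register file

    mutating func run(_ program: [GuestOp]) {
        for op in program {
            switch op {
            case let .mov(dst, imm):
                regs[dst] = imm
            case let .add(dst, src1, src2):
                regs[dst] = regs[src1] &+ regs[src2]
            }
        }
    }
}

var cpu = GuestCPU()
cpu.run([
    .mov(dst: 0, imm: 40),
    .mov(dst: 1, imm: 2),
    .add(dst: 2, src1: 0, src2: 1),
])
print(cpu.regs[2])  // 42
```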

Your own little demo makes this point quite well - roughly 3x as long to run the same calculations. While a hardware "dev kit" isn't going to be exactly what you'd expect to ship to customers, if the Intel transition is anything to go by, you'd assume it'll be *somewhat* representative. The Intel transition system had a 3.6GHz P4, while the initial Intel Macs had roughly 2GHz Core Solos and Core Duos. The transition system was much less polished (no EFI, no GPT disks, etc.) but it was still close enough hardware that developers had some sort of idea how things would run.

I would say that a 3x slowdown is irrelevant, honestly. Unless you're building performance-critical apps, which, let's face it, most developers are not, knowing that "it works" is good enough. And if you can make it run well at a 3x slowdown, even better. Besides, 3x slower than a modern Mac isn't that slow anyway, and if you have a higher-end Mac, which developers are likely to, your performance might be fairly in line with the first ARM laptop anyway. - But again, I'd argue that performance accuracy, for most developers, is largely irrelevant. Unless you're writing real-time software or games, it's not that important. And for both of those, roughly the same performance might not be good enough either anyway.
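For reference, the kind of comparison that produced that 3x figure is trivial to reproduce: time the same workload natively and under translation. A minimal sketch (the workload here is just an arbitrary integer loop):

```swift
import Foundation

// Time a closure with wall-clock time; run the same workload
// natively and under translation and compare the two numbers.
func time(_ label: String, _ body: () -> Void) {
    let start = Date()
    body()
    print("\(label): \(Date().timeIntervalSince(start)) s")
}

time("integer sum") {
    var acc: UInt64 = 0
    for i in 0..<100_000_000 { acc &+= UInt64(i) }
    print(acc)  // use the result so the loop isn't optimised away
}
```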
 

Stephen.R

Suspended
Nov 2, 2018
4,356
4,746
Thailand
What I was getting at was more that I don't necessarily think it would be an insane amount of effort for Apple to do
I think the "difference" of our opinions is that yours seems to be based around the idea that emulating another CPU at anything greater than mediocre performance is just a matter of throwing enough engineers/money at the problem. You might say that I am more pessimistic than that. I'd say I'm just more realistic.


And may I ask what experience you've had with CPU-arch translation to say it's like lipstick on a pig?
Well, I used Connectix Virtual PC back in the day, which I'm guessing is before your time. It emulated, wait for it - an x86 PC, on a PPC Mac. Really, calling that lipstick on a pig is doing a disservice to any pigs who like to glam up. It was ****ing abysmal. A number of years after that (but still several years ago) I *attempted* to run the Android Emulator on an i7 MBP. That wasn't a great experience either, but I suspect that was very much a case of the tooling around it.


Have you perhaps used emulated game consoles? Perhaps run PS1 games on a PS2?
You do know that the GPU is the only thing emulated in the PS2 when playing PS1 games, right?
 

casperes1996

macrumors 604
Jan 26, 2014
7,512
5,680
Horsens, Denmark
I think the "difference" of our opinions is that yours seems to be based around the idea that emulating another CPU at anything greater than mediocre performance is just a matter of throwing enough engineers/money at the problem. You might say that I am more pessimistic than that. I'd say I'm just more realistic.

I would've said the major difference is whether we think mediocre is good enough.

Well, I used Connectix Virtual PC back in the day, which I'm guessing is before your time. It emulated, wait for it - an x86 PC, on a PPC Mac. Really, calling that lipstick on a pig is doing a disservice to any pigs who like to glam up. It was ****ing abysmal. A number of years after that (but still several years ago) I *attempted* to run the Android Emulator on an i7 MBP. That wasn't a great experience either, but I suspect that was very much a case of the tooling around it.

It may not have been how you meant it, but to me this sounded a tad condescending. There's a G3 and two G4s in my closet. It's irrelevant, however.
How well the Android emulator runs very much depends on whether your chip has the VT-x (vmx) instructions or not, which allow hardware-accelerated virtualisation that the emulator otherwise can't use. An i7 isn't just an i7, so YMMV. But I think the Android emulator ran decently fine, honestly. Android code on the other hand... Use Fragments, Google said. They're great!... Google Maps requires an Activity... OK. Tab View requires its own Activity. FFS. Anyway, tangent.
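Back on the VT-x point: if anyone wants to check their own machine, here's a quick sketch, assuming macOS, where the kern.hv_support sysctl reports whether hardware-assisted virtualization is available:

```swift
import Darwin

// Read an Int32 sysctl by name; returns nil if the key doesn't exist.
func sysctlInt32(_ name: String) -> Int32? {
    var value: Int32 = 0
    var size = MemoryLayout<Int32>.size
    guard sysctlbyname(name, &value, &size, nil, 0) == 0 else { return nil }
    return value
}

// kern.hv_support is 1 when hardware-assisted virtualization
// (VT-x on Intel Macs) is usable by the Hypervisor framework.
if let hv = sysctlInt32("kern.hv_support") {
    print(hv == 1 ? "VT-x/hypervisor support: yes" : "VT-x/hypervisor support: no")
}
```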

macOS can run on the 12" MacBook at an acceptable speed. If a higher-end iMac can't translate ARM instructions to x86 at speeds equivalent to the Core m3 chip used in some of those, I'd be amazed, frankly. And translating ARM->x86 is easier than the other way around anyway. At least to get it running. Getting it high performance is probably roughly the same amount of work, but getting started wouldn't require a lot of effort.

And remember we're not talking about a daily use thing here. The target is still to have native code running. But for testing that an ARM binary behaves properly, it seems adequate, as long as the behaviour can be trusted to match.

You do know that the GPU is the only thing emulated in the PS2 when playing PS1 games, right?

I didn't until I looked it up just now, no. Had no idea that the PS2 included the CPU from the PS1 as its I/O chip, but thanks for leading me to that discovery. But I consider my point to be no less relevant; Dolphin emulating a Wii can be our new example instead, then.
 

gnomeisland

macrumors 65816
Original poster
Jul 30, 2008
1,094
829
New York, NY
Welp. Called that one wrong. But hey, there IS a DTK box! Very excited for this transition. More so than PPC to Intel. Honestly don't remember how I felt about 68k to PPC; I was still pretty young and couldn't afford a PPC Mac.
 

casperes1996

macrumors 604
Jan 26, 2014
7,512
5,680
Horsens, Denmark
Welp. Called that one wrong. But hey, there IS a DTK box! Very excited for this transition. More so than PPC to Intel. Honestly don't remember how I felt about 68k to PPC; I was still pretty young and couldn't afford a PPC Mac.

Yeah, this is sort of exactly what happened last time, except with a mini instead of a Mac Pro-style chassis for the DTK.
I can't tell if I'm more scared or excited though. All their Macs in 2 years; I'm kinda scared about the high end of that. And I'm still not sure how I feel about the macOS Big Sur design; I definitely dislike a lot of what I've seen so far, but it might come together nicely in the end. It feels sad moving past OS X though. Going to 11 is... hard.
 

gnomeisland

macrumors 65816
Original poster
Jul 30, 2008
1,094
829
New York, NY
Yeah, this is sort of exactly what happened last time, except with a mini instead of a Mac Pro-style chassis for the DTK.
I can't tell if I'm more scared or excited though. All their Macs in 2 years; I'm kinda scared about the high end of that. And I'm still not sure how I feel about the macOS Big Sur design; I definitely dislike a lot of what I've seen so far, but it might come together nicely in the end. It feels sad moving past OS X though. Going to 11 is... hard.
I am such an old man, but we've been down this road. Mac OS X was a huge, huge change. It upset a lot of loyal Mac users. As someone who used both Mac and Linux, I couldn't wait, but change is hard and we leave good things behind.

I don't hate Big Sur; it just doesn't feel like a step forward (visually).

I'd be surprised if it takes Apple a full two years to transition. They gave themselves the same timeline for Intel and it took less than a year. My guess is they've got all the 1st-gen chips/hardware planned out but not properly fabbed, so they're giving themselves some wiggle room. If all goes well we'll see a full lineup by late 2021 or very early 2022. WWDC 2022 will probably see the start of ARM-centered development features. Intel will be "supported" for years, but the feature gap will widen quickly.
 

casperes1996

macrumors 604
Jan 26, 2014
7,512
5,680
Horsens, Denmark
I am such an old man, but we've been down this road. Mac OS X was a huge, huge change. It upset a lot of loyal Mac users. As someone who used both Mac and Linux, I couldn't wait, but change is hard and we leave good things behind.

I don't hate Big Sur; it just doesn't feel like a step forward (visually).

I know what you mean. It is still hard though. OS X has served us so well for 20 years, and I'm not sure I'm ready for macOS 11. From the screenshots and videos, Big Sur looks harder on my eyes, which isn't great, since accessibility is one of the reasons macOS has always felt so good to me; I have bad eyesight. Though Apple still seems to care about that.
I miss the old icon style. I miss the old menu bar; and I still have it, as I'm typing this out on Mojave while Big Sur is installing to my test drive.
I just really don't want my Mac to become a big iPad. - Though I kinda like how my iPad is becoming more like a small Mac, now that it's got something more like Spotlight as its search in iOS 14.

I'd be surprised if it takes Apple a full two years to transition. They gave themselves the same timeline for Intel and it took less than a year. My guess is they've got all the 1st-gen chips/hardware planned out but not properly fabbed, so they're giving themselves some wiggle room. If all goes well we'll see a full lineup by late 2021 or very early 2022. WWDC 2022 will probably see the start of ARM-centered development features. Intel will be "supported" for years, but the feature gap will widen quickly.

That is true, but Intel already had a plethora of chips for them to pick from at that point, chips they were supplying to PC manufacturers. Apple has the A12Z and perhaps the next one ready, but scaling up to something like a 28-core Mac Pro? I think that could take the full two years. - Plus, Tim said that there are still Intel Macs in the pipeline. When they switched from PPC to Intel, I'm pretty sure it was a clean cut, a line in the sand: from this point on, only Intel. Though I could be misremembering that.

I honestly expected it to only be a partial transition with iMac Pro, Mac Pro, higher end iMac and 16" MacBook Pro staying Intel for at least another 5 years. I hadn't expected Apple to be going full on with their own SoCs scaling all the way to what those machines would need, even up to the kinds of memory controllers and PCIe lanes necessary on a Mac Pro. I'm really excited to see what that will be like, but also terrified of it going tits up.

Cheers
 

gnomeisland

macrumors 65816
Original poster
Jul 30, 2008
1,094
829
New York, NY
That is true, but Intel already had a plethora of chips for them to pick from at that point, chips they were supplying to PC manufacturers. Apple has the A12Z and perhaps the next one ready, but scaling up to something like a 28-core Mac Pro? I think that could take the full two years. - Plus, Tim said that there are still Intel Macs in the pipeline. When they switched from PPC to Intel, I'm pretty sure it was a clean cut, a line in the sand: from this point on, only Intel. Though I could be misremembering that.

I honestly expected it to only be a partial transition with iMac Pro, Mac Pro, higher end iMac and 16" MacBook Pro staying Intel for at least another 5 years. I hadn't expected Apple to be going full on with their own SoCs scaling all the way to what those machines would need, even up to the kinds of memory controllers and PCIe lanes necessary on a Mac Pro. I'm really excited to see what that will be like, but also terrified of it going tits up.

Cheers
Nah. Apple's got much more prepped than you think. Go back and watch the 2005 Intel transition announcement. You could change the slides, edit out the names, and use it again this time around. Intel hasn't reliably hit its roadmap in *years*, just like IBM/Motorola with the PowerPC. Even if Apple doesn't have Mac Pro-level chips (and I believe they do, because they'll need to keep the custom bits consistent), there are competitive ARM processors from companies like Ampere that Apple could use.

In 2005, Steve Jobs said the same thing about having PPC Macs in the pipeline, and they did release several after WWDC 2005. Many professionals will need maximum Intel performance—just as they needed the G5 in 2006. Rosetta 2 will be great, but it won't be as fast. Native apps, which I think will come faster this time, will be where it's at.

Apple already has the best-performing ARM cores for general computing. From what I've seen, they have competitive (if not the best) memory and storage controllers. It's hard to compare NPUs and GPU compute, but Apple's are competitive there too—and that's with mobile chips! Scaling up isn't linear, but they don't have to scale much to have a top contender. The A13 (on a phone!) is already as fast at single-threaded operations as almost any Intel or AMD chip running at ~2X (or more) the power consumption. So Apple can have 2-3X as many performance cores for the same thermals.

Long term, I'm worried that Apple will get lost in the weeds or lose their way, but in the short/medium term (5-10 years) the future is very, very bright.
 

casperes1996

macrumors 604
Jan 26, 2014
7,512
5,680
Horsens, Denmark
Nah. Apple's got much more prepped than you think. Go back and watch the 2005 Intel transition announcement. You could change the slides, edit out the names, and use it again this time around. Intel hasn't reliably hit its roadmap in *years*, just like IBM/Motorola with the PowerPC. Even if Apple doesn't have Mac Pro-level chips (and I believe they do, because they'll need to keep the custom bits consistent), there are competitive ARM processors from companies like Ampere that Apple could use.

In 2005, Steve Jobs said the same thing about having PPC Macs in the pipeline, and they did release several after WWDC 2005. Many professionals will need maximum Intel performance—just as they needed the G5 in 2006. Rosetta 2 will be great, but it won't be as fast. Native apps, which I think will come faster this time, will be where it's at.

Apple already has the best-performing ARM cores for general computing. From what I've seen, they have competitive (if not the best) memory and storage controllers. It's hard to compare NPUs and GPU compute, but Apple's are competitive there too—and that's with mobile chips! Scaling up isn't linear, but they don't have to scale much to have a top contender. The A13 (on a phone!) is already as fast at single-threaded operations as almost any Intel or AMD chip running at ~2X (or more) the power consumption. So Apple can have 2-3X as many performance cores for the same thermals.

Long term, I'm worried that Apple will get lost in the weeds or lose their way, but in the short/medium term (5-10 years) the future is very, very bright.

Maybe you're right, mate.

I don't have any fears about the low to mid tier of the market. My iPad's A12X is, as Steve Jobs might've put it, a screamer. But I do fear for the ultra high end, and I doubt they'd use another manufacturer's ARM chips.
There's more to a CPU than what we've seen from Apple silicon so far. To fit in something like a Mac Pro, there need to be a lot of PCIe lanes on that chip as well. And I haven't looked into their instruction set for the ARM chips, but Intel has AVX instructions (which aren't supported by Rosetta at all) for wide SIMD computations. The ZMM registers on some Intel chips (AVX-512) are 512 bits wide. That means 8 int64 or 16 int32 values modified at once in a single instruction. For programs that use those kinds of instructions, I don't know of any widely available ARM alternative. Apple's silicon may have an extension to the ARM instruction set that supports similar SIMD instructions, but to my knowledge it's not in standard ARM. And while regular instructions may be faster on Apple's chips clock for clock, I don't know how they'd compete with AVX in the scientific community especially. Though I suppose the strategy may just be to advise using GPU acceleration, since if your workload can be parallelised with AVX it's likely to also fit GPGPU acceleration.

Time will tell I guess :)
 

gnomeisland

macrumors 65816
Original poster
Jul 30, 2008
1,094
829
New York, NY
Maybe you're right, mate.

I don't have any fears about the low to mid tier of the market. My iPad's A12X is, as Steve Jobs might've put it, a screamer. But I do fear for the ultra high end, and I doubt they'd use another manufacturer's ARM chips.
There's more to a CPU than what we've seen from Apple silicon so far. To fit in something like a Mac Pro, there need to be a lot of PCIe lanes on that chip as well. And I haven't looked into their instruction set for the ARM chips, but Intel has AVX instructions (which aren't supported by Rosetta at all) for wide SIMD computations. The ZMM registers on some Intel chips (AVX-512) are 512 bits wide. That means 8 int64 or 16 int32 values modified at once in a single instruction. For programs that use those kinds of instructions, I don't know of any widely available ARM alternative. Apple's silicon may have an extension to the ARM instruction set that supports similar SIMD instructions, but to my knowledge it's not in standard ARM. And while regular instructions may be faster on Apple's chips clock for clock, I don't know how they'd compete with AVX in the scientific community especially. Though I suppose the strategy may just be to advise using GPU acceleration, since if your workload can be parallelised with AVX it's likely to also fit GPGPU acceleration.

Time will tell I guess :)
ARM has several options for SIMD. NEON is the most common and has been around since the Cortex-A8 (~2008). Fujitsu has a 512-bit-wide implementation of ARM's SVE extension—incidentally, they make the ARM chips powering the top supercomputer in the world. I'm sure that if NEON isn't competitive, Apple already has a custom strategy in the works.

I'm not worried at all about the high-end possibilities—just curious to see how Apple decides to execute. Again, many people were very skeptical that Xeon could match the AltiVec SIMD in the PowerPC processors back in 2005. The G5 killed (for its time) in SIMD. Apple has been here before.
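Worth adding that you often don't even have to write intrinsics by hand to hit those units. A small sketch in Swift: its portable SIMD types get lowered to NEON on ARM and SSE/AVX on x86, so the same source uses whichever vector ISA the target has (how wide the actual instructions end up is up to the compiler and target):

```swift
// Swift's portable SIMD types compile down to the target's vector ISA:
// NEON on ARM, SSE/AVX on x86. Same source, different instructions.
let a = SIMD8<Int32>(1, 2, 3, 4, 5, 6, 7, 8)
let b = SIMD8<Int32>(repeating: 10)

let sum = a &+ b        // eight 32-bit adds in (ideally) one vector op
let scaled = sum &* 2   // eight multiplies likewise

print(scaled)  // SIMD8<Int32>(22, 24, 26, 28, 30, 32, 34, 36)
```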
 