
Kingcoherent

macrumors member
Aug 30, 2022
68
68
Well. RAM fans will be happy they're getting 512GB. The 1.5 TB+ fans will still cry it's not enough.

Funny thing is, a top-spec'd consumer-grade PC [non-workstation] tops out at 256GB of DDR5 RAM support.
Support for richer AI models (e.g. multimodal) will require more RAM. I would actually expect them to finally bump the lower end too.

512GB is a lot. If you really need more RAM you are likely to cluster a number of machines. I would be very interested to see if they bring back the Xserve.
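For a rough sense of scale (my own back-of-the-envelope figures, not anything from this thread), here's a quick sketch of how much unified memory just the weights of a large model would need at different precisions; the model sizes and precisions below are hypothetical examples:

```python
# Back-of-the-envelope estimate of the memory needed just to hold LLM weights.
# Model sizes and precisions are hypothetical illustrations, not real products.

def weights_gb(params_billions: float, bytes_per_param: float) -> float:
    """Approximate memory for the weights alone, in GB (ignores KV cache, activations, OS)."""
    # 1e9 params per billion and 1e9 bytes per GB cancel out
    return params_billions * bytes_per_param

for params in (8, 70, 180):                      # billions of parameters
    for label, bpp in (("fp16", 2.0), ("int8", 1.0), ("int4", 0.5)):
        print(f"{params:>3}B @ {label}: ~{weights_gb(params, bpp):.0f} GB")
```

Even at int4, a ~180B-parameter model already wants on the order of 90 GB before any context or KV cache, which is roughly why "richer" models push RAM requirements up so quickly.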
 

rick3000

macrumors 6502a
May 6, 2008
646
269
West Coast
Apple's had the Neural Engine for years. I fully expected them to rebrand everything to AI, because that's the current buzzword, and the stock has been getting hammered (vs everyone else) because they haven't been seen to be involved with AI.

I wonder if they will relaunch the Xserve to get in on the money flooding into Nvidia.
 

Kingcoherent

macrumors member
Aug 30, 2022
68
68
Apple isn’t making that better. Sorry for the spoiler.
RAG-focused LLMs (possibly building on top of Apple's Spotlight search) would be a marked improvement.

Tbh I don't care if it's Apple or someone else who delivers it. At present, Apple computers aren't powerful enough for useful AI tools. A beefier ANE + more memory should give the hardware the capability that's currently missing.
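Since the post brings up RAG on top of local search, here is a minimal, self-contained sketch of the idea: retrieve locally indexed documents relevant to a query, then hand them to a model as context. The toy corpus, the bag-of-words scoring, and the stubbed-out generate() are all hypothetical stand-ins, not real Spotlight or Apple APIs.

```python
# Minimal retrieval-augmented generation (RAG) sketch: retrieve relevant local
# documents, then pass them to an LLM as grounding context.
from collections import Counter
import math

corpus = {  # hypothetical local documents
    "notes/m4.txt": "Rumours say the M4 family gets a much larger Neural Engine.",
    "notes/ram.txt": "Unified memory on Apple Silicon is shared by the CPU, GPU and ANE.",
}

def bow(text: str) -> Counter:
    """Crude bag-of-words vector; a real system would use an embedding model."""
    return Counter(text.lower().split())

def cosine(a: Counter, b: Counter) -> float:
    dot = sum(a[t] * b[t] for t in a)
    norm = math.sqrt(sum(v * v for v in a.values())) * math.sqrt(sum(v * v for v in b.values()))
    return dot / norm if norm else 0.0

def retrieve(query: str, k: int = 1) -> list[str]:
    """Return the k corpus paths most similar to the query."""
    q = bow(query)
    return sorted(corpus, key=lambda path: cosine(q, bow(corpus[path])), reverse=True)[:k]

def generate(prompt: str) -> str:
    """Stand-in for a local LLM call (e.g. something running on the ANE)."""
    return f"[model answer grounded in: {prompt[:60]}...]"

query = "How big is the M4 Neural Engine?"
context = "\n".join(corpus[path] for path in retrieve(query))
print(generate(f"Context:\n{context}\n\nQuestion: {query}"))
```

The point of the sketch is only the shape of the pipeline (index, retrieve, then generate with the retrieved text in the prompt), which is what a Spotlight-backed assistant would presumably do at much larger scale.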
 

hovscorpion12

macrumors 68030
Sep 12, 2011
2,698
2,679
USA
And often less. For instance, the max RAM for the Dell XPS tower is 64 GB. Even if you consider the max possible RAM supported by the chipsets themselves, the limits for any consumer PC made with a Ryzen 9 or i9-14900K are 128 GB and 192 GB, respectively. Indeed, I'm curious what consumer chipset you found that could support 256 GB.

Thus Apple's top consumer-grade chip, the M3 Max, which allows up to 128 GB RAM, is certainly in the running.

But then again, the Mac Pro isn't a consumer-grade PC. It's a workstation. And PC versions of those currently support 1 TB – 2 TB RAM (depending on the chip).
ASUS ROG released a BIOS update back on March 17th fully supporting 256GB of DDR5 on Intel 700 and 600 series and AMD AM5 motherboards.

Granted, it's desktop only.


I have the Z790 Carbon WiFi with a 13900KF/4090. Waiting for MSI to send a BIOS update (if they can). They did recently update the firmware to support 192GB.
 

ignatius345

macrumors 604
Aug 20, 2015
6,978
11,445
that M1 computer could last almost indefinitely. Years and years.
Working great today, but if the past is any guide, these machines will start slowing down over the course of upcoming macOS releases after a certain point, I'm afraid.

I've never had that not happen in my years of using Apple products. Every Mac, iPad, etc. starts off blazing fast but eventually struggles to handle the burden of ever more resource-intensive software.

The only way out is to freeze your system at its current OS version, but most of us don't care to do that.
 

michaeljk

macrumors regular
Dec 14, 2013
135
165
Apple's "AI" path has been terrible for many years. I remember concluding (and maybe posting here or elsewhere) that Apple's single biggest vulnerability is its reliance on Siri to expand its UI to take advantage of all kinds of new interface processing capabilities. Siri was a bandaid or a stopgap that was probably a mistake to begin with, but reliance on it for more than a year or two when they should have been working on a parallel path to replace it from the beginning was and is a huge barrier for them. Siri is terrible in all of its implementations, on the iPhone, HomePod and CarPlay. I basically never use it, but always wish I could. I just bought a new Sony TV, which has Google TV built in. I bought an Apple TV 4K with it. I told my wife I plan to return the Apple TV as I move toward Google for my Home "AI" setup. Sad, because I do not trust Google, at all, with my privacy. But it is years ahead of Apple in terms of ease of use for its "smart" technology. I still do not understand why Apple didn't see this 10 years ago, and still doesn't seem to see it now. Maybe its the "emperor has no clothes" syndrome, or like the guy who gives a weird, stupid response on Family Feud, but the family says, "yeah, great answer, great answer" and claps for him. Apple's C suite is saying,"Siri is our future, we can make it better, it will catch up, just give it more resources...." Dumb, and sad.
 

aknabi

macrumors 6502a
Jul 4, 2011
536
866
Ah great... Xcode builds won't improve much, but Siri will be way quicker and more diverse in its "I won't reply to that" phrasing when I curse it for utterly screwing up or interfering.
 

avkills

macrumors 65816
Jun 14, 2002
1,182
985
I have a lot of menu items and I have had menu items hidden under that notch.

It should simply not be there; there should just be a thin bezel.

If the Mac had Face ID, then maybe I could understand having a notch. But without Face ID, the notch is ridiculous.

Thin bezels > Notch.
People have already chimed in and said Face ID is impossible due to the thickness that would be required to house the sensor, unless you want the display lid to be the same thickness as an iPhone.

Adding a gazillion items to your menu bar is a user choice you have made; if it is really that much of a burden, then get an external monitor and just deal with it when on the go. 🤷‍♂️

Or just run the computer in the mode that puts the menu bar below all that.
 

Victor Mortimer

Suspended
Apr 17, 2016
825
1,443
That space was unusable before anyway because it was taken up by... the same webcam. And even worse, it had just a black bezel next to it prior to 2021. Would you rather they bring the bezel back? Or invent an invisible webcam that works through an LED screen and defies the laws of physics?

YES. I'd MUCH rather they bring the bezel back. Or just get rid of the stupid cam entirely, that would be fine too. It's not like I don't have a sticker permanently over the cam anyway.

Webcams are a bug, not a feature.
 

avkills

macrumors 65816
Jun 14, 2002
1,182
985
YES. I'd MUCH rather they bring the bezel back. Or just get rid of the stupid cam entirely, that would be fine too. It's not like I don't have a sticker permanently over the cam anyway.

Webcams are a bug, not a feature.
Well, we actually agree on one thing: I wouldn't care if they removed the webcam either.
 
  • Like
Reactions: Victor Mortimer

dandeco

macrumors 65816
Dec 5, 2008
1,199
1,008
Brockton, MA
More reason for me to wait, hopefully until near the end of this year, to get a pro-level Mac mini, especially since I still need to save up after buying my current car, along with likely commissioning a new costume later this summer. Won't be cheap!
 

avkills

macrumors 65816
Jun 14, 2002
1,182
985
All things considered, it will probably be quite a while before I personally upgrade again. The M3 Max is just very good at all the things I need to do for work. First time in history my work machine is equal to or better than my own personal machine.
 

MilaM

macrumors 6502a
Nov 7, 2017
726
1,577
Great. One more reason to postpone buying that Mac mini for another six months. Hopefully the RAM upgrade prices will be more reasonable this time.
 
  • Like
Reactions: dandeco

AlexMaximus

macrumors 65816
Aug 15, 2006
1,186
544
A400M Base
Now it's going to be AI everything because they don't know what else to market :/
Perfect. After using my 14-year-old 17" MacBook Pro, I really look forward to upgrading the old girl to M4. I will renew my entire Intel platform to Apple Silicon. This will hurt, ouch... Finally it makes sense, with next to nothing on the innovation horizon. I do hope that the new stuff will at least last as long as my current golden-era legacy systems. Thanks to OpenCore Legacy Patcher and many other hardware and software tweaks on this forum, all my devices got really super-duper economical with such a long life cycle. This is an outstanding forum, the best in the industry.
 

AlexMaximus

macrumors 65816
Aug 15, 2006
1,186
544
A400M Base
Apple is so behind on AI. It's even looking like they are getting more behind and not even catching up. People keep saying Apple is often late but better, as if that's some sort of vindication. But that's not even true -- Apple does not always come up with something better. What I want is for Apple to be consistently earlier and better.
I fully understand you. But, boy, have I been happy with my past Intel Apple platform over the last 9 years. I could not have been happier. After using Apple for more than a decade now, they really, really earned my trust and my loyalty as a semi-pro customer. Not everything was 1000%, but all things considered, Apple tech performed top-notch for my usage, with the occasional support of genius hacks like OpenCore, DosDude1 and others.
 

Biro

macrumors 6502a
Jan 11, 2012
583
922
Apple's "AI" path has been terrible for many years. I remember concluding (and maybe posting here or elsewhere) that Apple's single biggest vulnerability is its reliance on Siri to expand its UI to take advantage of all kinds of new interface processing capabilities. Siri was a bandaid or a stopgap that was probably a mistake to begin with, but reliance on it for more than a year or two when they should have been working on a parallel path to replace it from the beginning was and is a huge barrier for them. Siri is terrible in all of its implementations, on the iPhone, HomePod and CarPlay. I basically never use it, but always wish I could. I just bought a new Sony TV, which has Google TV built in. I bought an Apple TV 4K with it. I told my wife I plan to return the Apple TV as I move toward Google for my Home "AI" setup. Sad, because I do not trust Google, at all, with my privacy. But it is years ahead of Apple in terms of ease of use for its "smart" technology. I still do not understand why Apple didn't see this 10 years ago, and still doesn't seem to see it now. Maybe its the "emperor has no clothes" syndrome, or like the guy who gives a weird, stupid response on Family Feud, but the family says, "yeah, great answer, great answer" and claps for him. Apple's C suite is saying,"Siri is our future, we can make it better, it will catch up, just give it more resources...." Dumb, and sad.
I also have a Sony TV and an Apple TV box. I’m 100 percent with you on Siri, and have it turned off. But I’d rather do without any voice assistant before going with Google. I have the built-in microphone on the Sony TV physically turned off and use Apple TV for all of my streaming. And I have bypassed Google TV in the Sony. I don’t need any ads, suggestions or monitoring.
 
  • Like
Reactions: gusmula

dannys1

macrumors 68040
Sep 19, 2007
3,662
6,787
UK
That's true for the Ultra. I was planning to get the M3 Max version, but if they launch the M4 Max chip 3-4 months later, I am going to have buyer's regret.
No chance you'll get an M4 Max anytime before November - it will most likely be the first quarter of 2025.
 
  • Like
Reactions: Torty

Antony Newman

macrumors newbie
May 26, 2014
23
25
UK
So we can assume an M3 Mac Pro will be completely skipped.

Apple very likely has had two concurrent plans:
+) Prep a Mac Studio with the M3 Max on N3B (potentially with more RAM and a slightly higher clock frequency)
+) If TSMC N3E is on schedule in 2024 Q1, and testing of the M4 Brava stepping is good, hold back the Studio and Pro release (assuming the information deliberately leaked to Gurman is accurate) to get a version of the M4 Brava into their desktop lines in Q4.

[Attachment: Screenshot 2024-04-12 at 14.59.07.jpg]


A jump to N3E now would mean that Apple could move off N3B three quarters sooner (N3B is more expensive and has lower yields than N3E).
It would also mean a bump in chip performance of approximately 17% [≈ 32% / √(25% × 30%)] at reduced wattage.
It could get buyers of high-end desktops to want to buy now (and not wait for an M4 Max laptop at the end of the year).
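Reading the bracketed arithmetic literally (my interpretation of the quoted figures, not something spelled out in the post), the ~17% works out as:

\[
\frac{0.32}{\sqrt{0.25 \times 0.30}} \;=\; \frac{0.32}{\sqrt{0.075}} \;\approx\; \frac{0.32}{0.274} \;\approx\; 1.17
\]

i.e. roughly a 17% uplift, with 32%, 25% and 30% taken directly from the bracketed numbers above.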

[Attachment: Screenshot 2024-04-12 at 15.45.31.jpg]


If Apple has the optimised Apple10 GPU family ready in time for the next iPhone - or if they have decided that the M4 finally makes the jump to ARMv9 (with performance enhancements for AI) - it would also seem likely that the Studio and Pro would end up with a late Q4 release, and WWDC would be their chance to educate developers on optimising their software for hardware coming out 5 months later.

Donan / Brava / Hidra

If Brava is being utilised for high-end laptops and the Mac Studio, I speculate that Donan will be a single-chip design, and that Brava and Hidra will be chiplet designs that scale (like Lego building blocks) according to thermal power limits (using TSMC's latest SoC packaging magic).

I would also guess that if a potential Mac Studio purchaser had the choice of getting a new machine in June with an M3 Max with improved clock rates and large quantities of RAM, or waiting until November to be (almost) first in line to make use of the M4 architecture in a machine better suited to AI and ML ... the vast majority would want to wait.
 

Ishimura

macrumors newbie
Nov 6, 2022
3
8
Now it's going to be AI everything because they don't know what else to market :/

Except you’re not responding to marketing here. We’re not discussing a piece of marketing.

This whole thread is based on the headline of an article which was copied from another article, the headline of which was probably chosen by an editor. In fact this thread is mostly based on a lot of people overreacting to two words - “AI focused”.

Just because Bloomberg and then MacRumors have decided to characterise the chips that way doesn’t mean that it’s true, that it’s how Apple would characterise them, or that it frankly matters at all when you can just look at the spec bumps and capabilities and decide for yourself.

When we actually see something from Apple, then we can discuss their marketing and what they do and don’t focus on.

What this story is about is (as someone else said) the M4 chips having the usual upgrades while also having a more significant/substantial increase in one specialised type of performance: the kind that the Neural Engine performs.

Now, again, this isn’t marketing. But even if it was, any company is going to market any increase in the capabilities of their hardware.

If the neural engine gets a significant upgrade that significantly improves performance in applications that use it, then it is of course fair and logical to both highlight and market that, when the time comes.

What that extra performance will be, what applications it will be best used in, and what possible new features it might enable (if any) are unknown for now.

But again, nothing is being claimed here by Apple. These news articles aren’t Apple press releases.

Nor is “AI” hardware at all new to the Mac (or iPhone/iPad), because the chips have had Neural Engines designed to perform those specialised workloads for a while. Yet, strangely, I don’t remember people losing their minds over the inclusion of the Neural Engine back then, nor sneering their faces off at it being improved subsequently - until recently.

Lastly it’s really strange to criticise a focus (or not) on increasing the capabilities of your hardware to deliver on certain in-demand applications.

What are Apple meant to do - NOT improve the neural engine despite an increased demand for applications that will require it? Or can they improve their hardware but just never talk about that?

Or are they allowed to talk about it but never mention a certain term because some people have forgotten that said term has been widely used (and debated) in both the engineering AND academic circles of the relevant field since virtually the invention of said field?

Are we going to either just pretend that a whole field of computing isn’t rapidly advancing and bizarrely attempt to ignore it, or are we going to spin like tops while hysterically screeching and overreacting every time it’s mentioned?

Is there a third way perhaps?

P.S. - have Apple announced a scheme whereby Apple users will be forced to buy M4 Macs? Where they’ll be forced to use applications they have no interest in? Maybe I missed a headline announcing that.
 
Last edited: