
klasma

macrumors 603
Jun 8, 2017
6,140
17,185
It seems crazy to use M2 chips to power AI when the M3 and M4, by Apple's own admission, come with a hugely upgraded Neural Engine. What on earth are they thinking?! Maybe there's just a huge surplus of unsold M2 Ultra chips that they need to do something with...
They need the 3nm manufacturing capacity for the new iPhones and iPads. And it's not a problem to run as many M2 Ultras in parallel as needed for whatever they do.
 

Squirrrrel

Suspended
Apr 24, 2024
158
300
[screenshot attachment]


@eas

I'm literally dumbfounded trying to figure out how you disagreed with me here. NVIDIA CEO Jensen Huang made this exact statement in a recent interview.
 
  • Like
Reactions: TechnoMonk

giebe

macrumors regular
Mar 20, 2014
205
296
Germany
One feature I'm hoping for: summarize full ebooks or specific sections, and mark the most important passages in the Books app. A cool addition would also be: read the book aloud, turning every book into an audiobook. That would be a great AI feature.
 
  • Like
Reactions: shadowboi

Motorola68000

macrumors 6502
Sep 12, 2022
312
295
One query: it was confirmed that the M1, M2 and M3 series all contain a security flaw in the hardware. Given the comments about customer security, is it wise to utilise flawed hardware?

With regard to the latest AI buzz, it is worth perusing the following article, as it's nothing new. Even Siri, with all its faults, can be described as AI, as can the awful newer answerphone systems that seem to earn the title Artificial Ignorance rather than 'intelligence'.
 
Last edited:

Kierkegaarden

macrumors 68020
Dec 13, 2018
2,396
4,062
USA
My experience is the opposite. I am a scientist and professor (I like to use my brain) and I use ChatGPT almost every day to help speed up my research and work. In fact, every person I know who uses ChatGPT or something similar regularly is someone who thinks and does a lot -- researchers, artists, professors, engineers, etc. I'm sure my sample is biased, but LLMs have been one of the best tools ever developed for much of what I do. They help me get things done much more efficiently.
Do you pay to use ChatGPT?
 

shadowboi

macrumors regular
Feb 16, 2024
220
398
Unknown
An Nvidia 4090 or whatever may have more power in certain situations, but if you are paying for all of that in green energy, then Apple's M chips start looking more attractive.
If I ever tried writing "NVIDIA chips are power-hungry" somewhere on Reddit (even in r/macgaming!), they would have downvoted me into the void.

But it is true. People say NVIDIA can do premium graphics, but at what cost? An average gaming PC with an NVIDIA card draws at least 500 W (and usually around 800), and the card alone draws 120-180 W. Even my coffee grinder draws 65 W, and I use it only for 30 seconds a day. This is stupidly insane; my old Intel iMac draws only 70 W under load, and the new ARM Macs are at least twice as efficient.
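For a sense of scale, here's the rough daily-energy arithmetic (the wattages are the figures claimed above; the usage hours are my own assumptions, not measurements):

```python
# Rough daily energy comparison: watts x hours = watt-hours per day.
# Wattages come from the claims above; the hours are assumptions.
gaming_pc_w, gaming_hours = 800, 2       # assumed gaming PC under load, 2 h/day
imac_w, imac_hours = 70, 2               # old Intel iMac under load, 2 h/day
grinder_w, grinder_seconds = 65, 30      # coffee grinder, 30 s/day

print(f"Gaming PC:  {gaming_pc_w * gaming_hours} Wh/day")              # 1600 Wh
print(f"Intel iMac: {imac_w * imac_hours} Wh/day")                     # 140 Wh
print(f"Grinder:    {grinder_w * grinder_seconds / 3600:.2f} Wh/day")  # ~0.54 Wh
```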

If computer tech companies actually had to care about how much power their stuff draws, we would not only live in a greener world but also have new games on every device, even on Apple TV.

New-age gatekeeping is when you want to play some new game like Hogwarts Legacy or NFS Unbound and you just can't; you have to own an elite gaming PC with a sluggish, spying Windows OS installed. And yeah, I'm sure developers could make stripped-down versions of these games to run on cheaper devices like new iPads or Macs. They just don't want to.
 

spittt

macrumors newbie
Apr 6, 2016
9
7
They could do the same with GPUs.
The right question is: how many M2 Ultras does it take to match a B200 in AI performance?
Actually, the right question is: why would Tim Cook agree to lock Apple into a CUDA subscription for all their AI needs? Not only are the premiums on NVIDIA servers way too bloated, but NVIDIA also wants Apple to build on top of its proprietary stack when Google, Intel, Qualcomm, ARM, etc. will release the open-source UXL stack for GPUs by year end. No way will Tim Cook allow Apple to be fleeced by NVIDIA. He stated in the earnings call last week that Apple is using hybrid solutions for now, such as AWS and Google Cloud. So these other back ends may use some NVIDIA servers, but they appear to be stopgap solutions until an Apple Silicon AI server chip is ready. IMO there is zero chance of Apple locking itself into an NVIDIA/CUDA stack for its own AI iCloud server farms.

As another poster commented above, they both use the same chip architecture, i.e., ARM. Thus far, Apple has focused on efficiency and thermal management. Now that it's turning its attention to AI server chips, Apple is more than capable of catching up to NVIDIA. Its future AI server platform already has a growing base of 2 billion plus devices it can tap as data sources for its LLM. Why do you suppose NVIDIA has had so much insider selling? They don't have a clue where they will be in one or two years as UXL for GPUs matures.
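On the quoted question of how many M2 Ultras it takes to match a B200, here's a back-of-envelope sketch; plug in whichever peak-throughput figures you trust from the vendors' spec sheets, since the numbers below are placeholders, not quoted specs:

```python
# Back-of-envelope: M2 Ultras per B200 by peak throughput alone.
# Both figures are placeholders; substitute real spec-sheet numbers.
m2_ultra_tflops = 27.0    # placeholder: M2 Ultra GPU peak FP32 throughput
b200_tflops = 4500.0      # placeholder: B200 peak at a low-precision format

print(f"~{b200_tflops / m2_ultra_tflops:.0f} M2 Ultras per B200")
# Peak FLOPS ignore memory bandwidth, precision, interconnect, and the
# software stack, all of which matter at least as much for real LLM work.
```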
 

Macalicious2011

macrumors 68000
May 15, 2011
1,759
1,789
London
Actually, the right question is: why would Tim Cook agree to lock Apple into a CUDA subscription for all their AI needs? Not only are the premiums on NVIDIA servers way too bloated, but NVIDIA also wants Apple to build on top of its proprietary stack when Google, Intel, Qualcomm, ARM, etc. will release the open-source UXL stack for GPUs by year end. No way will Tim Cook allow Apple to be fleeced by NVIDIA.
Well said. The purpose of shifting from Intel to Apple silicon was greater control over the evolution and lifecycle of Apple products.

AI is a new generation of computing. Relying on Nvidia would set Apple back five years, as they would depend not only on Nvidia's hardware but also on Nvidia's tech stack.
In my experience talking about it with people, the only people who support and want AI are people who tend not to like thinking. Those of us who like to use our brains are fine without AI. Just my experience. YMMV
I hope you realise that AI already powers lots of products and features that people know and love:
-Face/Touch ID
-Content recommendation algorithms on Instagram and TikTok

Furthermore, we use computers to calculate things for us in Excel and other software.

What some of the smartest people in the world are great at is scaling their time and energy. This includes finding ways to get 6 hours of work done in 1 hour using the resources available, e.g. computing power, other people, etc.
 
  • Like
  • Haha
Reactions: rchornef and spittt

Macalicious2011

macrumors 68000
May 15, 2011
1,759
1,789
London
"Introducing our magical, incredible iCloud AI Pro Max... starting at only $69/month.

We think you'll love it!" ;)
I think Apple will just bake the costs of the AI features into iCloud and increase the annual subscription price by 1-3% every year.

It's easier to make 500m users pay an extra $0.50 per month for a service that they already use than to get 50m users to pay $5 per month for a new service.
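The revenue math is identical either way, which is the whole point:

```python
# Same monthly revenue from a small bump on a big base or a big price
# on a small base (figures are the ones from the post above).
existing_users, bump = 500_000_000, 0.50   # $0.50/month on an existing sub
new_users, price = 50_000_000, 5.00        # $5/month for a brand-new service

print(f"${existing_users * bump:,.0f}/month")  # $250,000,000/month
print(f"${new_users * price:,.0f}/month")      # $250,000,000/month
```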
 
  • Like
Reactions: spittt

Smittywerben

macrumors newbie
Mar 26, 2024
29
85
Actually, the right question is: why would Tim Cook agree to lock Apple into a CUDA subscription for all their AI needs? Not only are the premiums on NVIDIA servers way too bloated, but NVIDIA also wants Apple to build on top of its proprietary stack when Google, Intel, Qualcomm, ARM, etc. will release the open-source UXL stack for GPUs by year end. No way will Tim Cook allow Apple to be fleeced by NVIDIA. He stated in the earnings call last week that Apple is using hybrid solutions for now, such as AWS and Google Cloud. So these other back ends may use some NVIDIA servers, but they appear to be stopgap solutions until an Apple Silicon AI server chip is ready. IMO there is zero chance of Apple locking itself into an NVIDIA/CUDA stack for its own AI iCloud server farms.

As another poster commented above, they both use the same chip architecture, i.e., ARM. Thus far, Apple has focused on efficiency and thermal management. Now that it's turning its attention to AI server chips, Apple is more than capable of catching up to NVIDIA. Its future AI server platform already has a growing base of 2 billion plus devices it can tap as data sources for its LLM. Why do you suppose NVIDIA has had so much insider selling? They don't have a clue where they will be in one or two years as UXL for GPUs matures.
You are missing the point. Apple, according to this article, is not making an AI-specific chip, but is trying to use its outdated M2 Ultra to compete with hardware that is much faster at AI and ultimately more efficient.
It's like trying to mine crypto with a GPU and compete with ASIC miners built specifically for that task.
If Apple makes its own hardware optimized for AI, as Google successfully did for Gemini's LLM, that's a whole other story.
Apple also thought it could compete with Qualcomm on modem chips because it had succeeded in making its own SoC, but it failed miserably. I wouldn't claim they can catch up to Nvidia in AI anytime soon unless it actually happens, but that's Tim Cook's Apple in a nutshell these days: always behind, trying to catch up with everyone else.
Those who believe Apple can compete with Nvidia in AI using the M2 Ultra are the same people who believed 8GB of RAM on a Mac is 16GB on Windows.
Let's face it, we all know exactly why Apple is not going to use Nvidia GPUs.
[image attachment]
 

Macalicious2011

macrumors 68000
May 15, 2011
1,759
1,789
London
You are missing the point. Apple, according to this article, is not making an AI-specific chip, but is trying to use its outdated M2 Ultra to compete with hardware that is much faster at AI and ultimately more efficient.
It's like trying to mine crypto with a GPU and compete with ASIC miners built specifically for that task.
If Apple makes its own hardware optimized for AI, as Google successfully did for Gemini's LLM, that's a whole other story.
Apple also thought it could compete with Qualcomm on modem chips because it had succeeded in making its own SoC, but it failed miserably. I wouldn't claim they can catch up to Nvidia in AI anytime soon unless it actually happens, but that's Tim Cook's Apple in a nutshell these days: always behind, trying to catch up with everyone else.
Those who believe Apple can compete with Nvidia in AI using the M2 Ultra are the same people who believed 8GB of RAM on a Mac is 16GB on Windows.
Let's face it, we all know exactly why Apple is not going to use Nvidia GPUs.
[image attachment]
Well said. Nvidia are years ahead of their nearest rivals.

However, it's worth splitting AI into two:
Training
Inference

Nvidia chips are superior for training but overkill for B2C inference. Apple's use of M2 chips is a good interim solution for simple inference tasks.
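For what it's worth, simple LLM inference already runs locally on Apple Silicon. A minimal sketch using Apple's mlx-lm package (the 4-bit model name is just an example from the mlx-community hub):

```python
# Minimal local-inference sketch with Apple's MLX framework (pip install mlx-lm).
# The model name is an example; other mlx-community models work the same way.
from mlx_lm import load, generate

model, tokenizer = load("mlx-community/Mistral-7B-Instruct-v0.3-4bit")
response = generate(
    model,
    tokenizer,
    prompt="Why does unified memory help LLM inference?",
    max_tokens=128,
)
print(response)
```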

I also expect Groq and other up-and-coming chip manufacturers to offer alternatives to Nvidia for inference.
 

MrGimper

macrumors G3
Sep 22, 2012
8,653
12,210
Andover, UK
One feature I'm hoping for: summarize full ebooks or specific sections, and mark the most important passages in the Books app. A cool addition would also be: read the book aloud, turning every book into an audiobook. That would be a great AI feature.
That makes sense ... one good thing about Copilot is that it can create a summary of a huge design document.
 

SmugMaverick

macrumors 6502a
Aug 31, 2017
722
1,940
UK
"Privacy is at the core of Apple"

Unless you live in China, use our servers for AI, and ignore all the "on device for your privacy" talk.
 

klasma

macrumors 603
Jun 8, 2017
6,140
17,185
One feature I'm hoping for: summarize full ebooks or specific sections, and mark the most important passages in the Books app.
LLM context windows generally aren't large enough for full books (yet).
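The usual workaround is map-reduce summarization: chunk the book, summarize each chunk, then summarize the summaries. A rough sketch, where summarize() is a hypothetical stand-in for whatever LLM call you'd use:

```python
# Map-reduce workaround for limited context windows (rough sketch).
def summarize(text: str) -> str:
    # Hypothetical stand-in: call your LLM of choice here.
    raise NotImplementedError

def summarize_book(book: str, chunk_chars: int = 20_000) -> str:
    # Map: summarize each chunk small enough to fit the context window.
    chunks = [book[i:i + chunk_chars] for i in range(0, len(book), chunk_chars)]
    partials = [summarize(chunk) for chunk in chunks]
    # Reduce: summarize the concatenation of the partial summaries.
    return summarize("\n".join(partials))
```

Quality degrades with each pass, though, so it's not the same as a true full-book context window.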

A cool addition would also be: read the book aloud, turning every book into an audiobook. That would be a great AI feature.
Apple could already do this today, since they have that feature for web pages in Safari (https://support.apple.com/guide/iphone/use-siri-to-listen-to-a-webpage-iph449fc616c/ios). They won't do this for ebooks, however, because that would conflict with audiobook rights, and publishers would pull their books from Apple. Besides, it would diminish Apple's profits from audiobook sales.
 

_iCeb0x_

macrumors member
Dec 19, 2003
31
2
Apple needs to allow multiple studios to have clusters of GPUs and memory.
I thought Apple could have released a Mac Pro that was upgradable by adding boards/blades: processing, storage, etc. These boards would be interconnected using something like InfiniBand or a proprietary bus based on a torus topology.

The way I see it (I might be missing something here), there would be no need to come up with these "sticking two Ax Pro/Max together" ideas, and machines could scale to ridiculous amounts of processing, memory and storage for rendering videos, scientific computing, AI and other massive workloads.
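To make the torus idea concrete: each board wraps around in every dimension, so it always has exactly two neighbours per dimension, with no edge cases. A toy sketch (the dimensions are arbitrary):

```python
# Toy illustration of a 3D torus topology: neighbours wrap around in
# each dimension, so every node has exactly six links.
def torus_neighbors(node, dims=(4, 4, 4)):
    x, y, z = node
    nx, ny, nz = dims
    return [
        ((x + 1) % nx, y, z), ((x - 1) % nx, y, z),
        (x, (y + 1) % ny, z), (x, (y - 1) % ny, z),
        (x, y, (z + 1) % nz), (x, y, (z - 1) % nz),
    ]

print(torus_neighbors((0, 0, 0)))  # wrap-around gives links to (3, ...) etc.
```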
 
  • Like
Reactions: TechnoMonk

SoldOnApple

macrumors 65816
Jul 20, 2011
1,106
1,850
Actually, the right question is: why would Tim Cook agree to lock Apple into a CUDA subscription for all their AI needs?
AI itself is worth it at any cost. We've been using generative AI for a few years now and are starting to take it for granted; Apple can't delay a few years to sort out a better deal for itself.
 

Macalicious2011

macrumors 68000
May 15, 2011
1,759
1,789
London
AI itself is worth it at any cost. We've been using generative AI for a few years now and are starting to take it for granted; Apple can't delay a few years to sort out a better deal for itself.
So Apple should risk its $2 trillion valuation by kissing Jensen's ring and becoming a captive customer?
 

Darragh83

macrumors newbie
Oct 4, 2019
9
16
Limerick, Ireland
As someone who uses ChatGPT, I can see what the hype is all about; it's like when Google and YouTube came out. However, I would love it if Apple just integrated with ChatGPT instead of creating their own, like the iPhone using Google as its search engine, and I believe the majority use Google Maps. Time will tell, but I still don't use Siri after all the years since it came out.
 
  • Like
Reactions: erthquake