
sunny5

macrumors 68000
Jun 11, 2021
1,712
1,582
well yeah... what else can run these models?
Nvidia is dominating AI largely because everything runs on Nvidia's own ecosystem of software and technology; CUDA is a great example. And Nvidia has been investing in AI for a decade. Yes, an AI chip or NPU is much more efficient, but it isn't powerful enough to beat an Nvidia GPU for now, and even if it were, it would still be limited to its own AI models.

It's better than nothing, but there's a long way to go, especially since the Mac is still largely limited to 2D-oriented software like video, music, and photo.
 
  • Like
Reactions: adib

OnePieceIsReal

macrumors member
Sep 18, 2023
88
62
The only reason they released this is that they need help from the community.

They also need to stop using words like "empower" and "enrich people's lives."

1. Selling MacBook Pros with scratches on the screen is neither empowering nor enriching.

2. Selling HomePods, then killing them off slowly, only to resurrect them as an even bigger downgrade, is neither empowering nor enriching.

3. Still selling accessories with Lightning is neither empowering nor enriching.

4. Selling Goggles that pose a health risk to your eyesight is neither empowering nor enriching.

5. Releasing updates that cripple performance, or introducing hardware glitches to force upgrades, is neither empowering nor enriching.

6. Needing to sum up all this and more is neither empowering nor enriching.
Hey now, I paid for those scratches; I heard they used Steve Jobs' personal finger bones to scratch-test each screen.
 

heretiq

Contributor
Jan 31, 2014
839
1,321
Denver, CO
Nvidia is dominating AI largely because everything runs on Nvidia's own ecosystem of software and technology; CUDA is a great example. And Nvidia has been investing in AI for a decade. Yes, an AI chip or NPU is much more efficient, but it isn't powerful enough to beat an Nvidia GPU for now, and even if it were, it would still be limited to its own AI models.

It's better than nothing, but there's a long way to go, especially since the Mac is still largely limited to 2D-oriented software like video, music, and photo.
Apple seems to be staking out ground in the on-device AI model domain, as this article demonstrates. Despite the cynical comments and attempts to throw shade on OpenELM, advancements like MLX and Ferret demonstrate that Apple is focused on building infrastructure, not just models. Apple has an audience of billions of devices and privacy-oriented users that are by themselves a huge market. If Apple can deliver on-device, ecosystem-optimized tools, the CUDA advantage could become less of an issue, perhaps even a non-issue. I’m really hoping to see some consequential advancement in Apple ecosystem AI development tools at WWDC. 🙏🏽
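For what it's worth, the MLX tooling mentioned above is already usable for this kind of on-device experimentation today. Here's a minimal sketch using the mlx-lm Python package on an Apple silicon Mac; the quantized model name is just an illustrative pick from the mlx-community Hugging Face models, not anything Apple ships:

Code:
# Minimal on-device text generation with Apple's MLX (pip install mlx-lm).
# The model name is illustrative -- any 4-bit model published by the
# mlx-community Hugging Face organization should work on Apple silicon.
from mlx_lm import load, generate

model, tokenizer = load("mlx-community/Mistral-7B-Instruct-v0.2-4bit")

prompt = "Summarize why small on-device language models are useful."
text = generate(model, tokenizer, prompt=prompt, max_tokens=128)
print(text)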
 

purplerainpurplerain

macrumors 6502a
Dec 4, 2022
682
1,258
Ffs, this is not an LLM...

You can’t have a full-on LLM on a smartphone. It has to be ‘family friendly’ and needs guardrails to keep it from going off the rails. It has to integrate tightly with the device hardware and software without wasting the user’s time.

When I ask Siri for SE Asia travel recommendations, it should say ‘You should stay away from Myanmar right now because tourists are being kidnapped and forced to work as scammers on Facebook’.

Siri should not go off the rails and reply ‘don’t go there bro Myanmar is a steaming mountain of **** and I hope someone nukes it soon ohhhh boyyyyy Ima send my ****ing robot drone army to start a war.’
 

Seoras

macrumors 6502a
Oct 25, 2007
763
2,016
Scotsman in New Zealand
I've worked on iOS AI apps for several years now, and I've looked long and hard at "on device".
What stops anyone with a decent, unique intelligence model from putting it on device is the risk of having it stolen, re-packaged and resold.
iOS is pretty hardened, but not unbreakable.
I asked Apple peeps on the dev forums, and the answer I got was "yeah, that's DRM, it's not perfect".
Apple's own Core ML API got an encryption feature to protect proprietary models a couple of years ago, but it was limited to what Core ML supported.
So I don't expect a mad rush to use any on-device LLMs unless there's an impressively secure way of locking down and protecting what they contain.
If the new Apple silicon (M4) has an AI focus, it'll no doubt be encased in pretty strong on-silicon encryption, something Apple has already been doing for things like biometrics, banking and ML.
It's not just about how good your AI silicon is; it's just as much about how well secured your AI model is on that silicon if on-device is to succeed.
 

Zest28

macrumors 68020
Jul 11, 2022
2,254
3,112
[Hey ChatGPT, please generate a comment in the style of a typical MacRumors average user, with a touch of acid humor, regarding this piece of news.]

"Oh great, Apple's finally joining the open-source party—just a decade late and probably still with strings attached somewhere in those 'open' terms. They're throwing us a bone with OpenELM, but let's be real, they’re probably just doing it to lure in some AI hotshots tired of their corporate overlords. Now we just have to sit back and wait for iOS 18, where they'll inevitably limit these models to the latest hardware, forcing us all to upgrade. Because, you know, my current iPhone can't possibly handle a couple more AI tricks without combusting."

AI is better than me at making MacRumors posts.
 

backstreetboy

macrumors member
Nov 27, 2023
73
123
Apple finally decides to join the open-source party, huh? Only a decade late and I bet there are still some hidden strings attached to their so-called 'open' terms. They're throwing us a bone with OpenELM, but let's be real, they're probably just trying to lure in some AI hotshots tired of their corporate overlords. And now we have to wait for iOS 18, where they'll undoubtedly limit these models to the latest hardware, forcing us all to upgrade. Because, you know, my current iPhone surely can't handle a few more AI tricks without spontaneously combusting. Classic Apple move.
Que? They open-sourced WebKit, and that gave Google a chance to dethrone Internet Explorer with Chrome.
 

bluecoast

macrumors 68020
Nov 7, 2017
2,225
2,644
I wonder if Macs with 8 GB of memory will be able to run whatever Apple bakes into macOS based on this?

I’d be surprised if even the 15 Pro will be able to run anything based on this.

New iPhones for everyone, hopes Tim Cook.
 
  • Like
Reactions: splitpea

svish

macrumors G3
Nov 25, 2017
9,990
26,016
Good to know. Waiting to hear more about everything Apple has done with AI at WWDC.
 

TheOldChevy

macrumors 6502
May 12, 2020
450
800
Switzerland
I wonder if Macs with 8 GB of memory will be able to run whatever Apple bakes into macOS based on this?

I’d be surprised if even the 15 Pro will be able to run anything based on this.

New iPhones for everyone, hopes Tim Cook.
My experience with my basic M1 Pro MacBook running Llama 3 or Mistral is quite good, so I would expect that an 8 GB iPhone with an optimised model could be quite OK.
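As a rough back-of-the-envelope check on that, here's what the weights alone would need at different quantization levels; the ~3B parameter count is an assumption, roughly the size of the largest OpenELM variant:

Code:
# Back-of-the-envelope weight-memory estimate for a small on-device LLM.
# The 3B parameter count is an assumption (about the largest OpenELM size);
# it ignores the KV cache, activations, and everything else on the device.
PARAMS = 3.0e9

for bits in (16, 8, 4):
    gib = PARAMS * bits / 8 / 2**30  # bits per weight -> bytes -> GiB
    print(f"{bits}-bit weights: ~{gib:.1f} GiB")

# ~5.6 GiB at 16-bit, ~2.8 GiB at 8-bit, ~1.4 GiB at 4-bit --
# so a 4-bit model of that size plausibly fits alongside the OS on an 8 GB device.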
 

purplerainpurplerain

macrumors 6502a
Dec 4, 2022
682
1,258
My experience with my basic M1 Pro MacBook running Llama 3 or Mistral is quite good, so I would expect that an 8 GB iPhone with an optimised model could be quite OK.

From what I remember, Apple’s model will stream in and out of memory dynamically instead of hogging all the memory like typical language models do.
 

clg82

macrumors 6502
Jun 17, 2010
364
192
Southern California
Does ChatGPT even attempt to run on device? You know, the whole point of this?

The thing I’ve noticed about all these AI hype-people is that they certainly know who the “leader” of the pack is day to day, but somehow can’t imagine smaller models being a better solution for a given task. Instead of a one-size-fits-all approach, what’s wrong with invoking a trained AI that is smaller but specifically suited to the task at hand, *automatically, based on the context of what you’re currently doing*?

Just some food for thought…
Food for thought... what makes you any different from these so-called "experts" you speak of? If Apple can't even get Siri to handle minuscule functions without screwing them up, what makes you think they'll get AI right?
 