
NT1440

macrumors G5
May 18, 2008
14,694
21,238
Apple is obviously working on something AI-related, but I also understand why they're careful about explicitly announcing "AI", because AI in general has been met with immense skepticism.
I think they see the current marketing blitz for what it is and are sticking to a bit more principle. Apple calls it machine learning; seemingly everyone else has decided AI is the buzzword for what is basically applied statistics. There's no "intelligence" there.

Apple specifically called out that they're now using the Transformer model in the new autocorrect, at 27:42 in the Keynote. That's a big change.

They do it again elsewhere as well. I think Apple, an amazing marketing company of all things, is wary of how much junk is being slapped with "AI" to wow investors and media when it's really just machine learning at the end of the day.
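For anyone wondering what a transformer buys you in autocorrect: it scores the next word by letting the model attend over the words already typed. Here's a toy sketch of that core self-attention step (my own illustration with made-up words and random embeddings, not Apple's implementation):

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy vocabulary and random embeddings; a real keyboard model
# learns these from typing data.
vocab = ["I", "am", "going", "to", "the", "store", "ducking"]
d = 8
emb = rng.normal(size=(len(vocab), d))

def softmax(x):
    e = np.exp(x - x.max(axis=-1, keepdims=True))
    return e / e.sum(axis=-1, keepdims=True)

def self_attention(x):
    # Single-head scaled dot-product attention: each position mixes in
    # information from every earlier position (causal mask).
    scores = x @ x.T / np.sqrt(x.shape[-1])
    mask = np.tril(np.ones(scores.shape, dtype=bool))
    scores = np.where(mask, scores, -np.inf)
    return softmax(scores) @ x

def next_word_probs(context_ids):
    x = emb[context_ids]         # (seq_len, d) embeddings of typed words
    h = self_attention(x)        # contextualized representations
    logits = h[-1] @ emb.T       # score every vocabulary word
    return softmax(logits)

probs = next_word_probs([0, 1, 2, 3])   # context: "I am going to"
print(vocab[int(probs.argmax())])       # model's top next-word guess
```

A production keyboard model stacks many such layers and learns its weights, but the attention step is what distinguishes it from the old n-gram-style autocorrect: the prediction for the next word can depend on the whole sentence so far, not just the last word or two.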
 
  • Like
Reactions: LinusR and Tagbert

Tagbert

macrumors 603
Jun 22, 2011
5,570
6,460
Seattle
Ok, but Siri is the consumer-facing embodiment of AI, and Apple would do well to increase its abilities at least tenfold by the September event. It would be great if we didn't even have to wait that long. Don't call it AI, keep calling it machine learning, whatever, but put Siri more center stage.
While I agree that Siri could use a little LLM brain transplant, I doubt that Apple will do it as soon as this September. Such a task will take some time, and Apple will want to experiment with LLM-style AI before they commit it to such a high-profile feature, starting with a small-scale experiment like the Transformer they put into the new predictive text keyboard feature.
 
  • Like
Reactions: ipedro

ipedro

macrumors 603
Original poster
Nov 30, 2004
6,239
8,508
Toronto, ON
While I agree that Siri could use a little LLM brain transplant, I doubt that Apple will do it as soon as this September. Such a task will take some time, and Apple will want to experiment with LLM-style AI before they commit it to such a high-profile feature, starting with a small-scale experiment like the Transformer they put into the new predictive text keyboard feature.

Unlikely by September, but almost certainly by early next year, ahead of the Vision Pro launch. If Apple's massive bet on its future, which relies on Siri, is released into the wild and Siri performs as badly as it does today next to the impressively human-like, conversational ChatGPT and Google Bard, Apple is going to have a serious PR problem. They'd be better off pulling Siri from Vision Pro if it's not ready and adding it in a future update.
 
  • Like
Reactions: Tagbert

SB1500

Suspended
Dec 31, 2021
147
104
Apple only needs to make AI work for you and me, on our Apple devices. Unlike Google, Bing, etc., it isn't trying to deliver the next big thing for commercial, industrial, or business customers. It's bound to be experimenting and toying with this internally, but it might be years before we see the fruits of that. I expect Apple will be far from the first, but most likely the best - and in this case, most trustworthy - model. That's what I hope for, anyway. What's the rush?
 
  • Like
Reactions: LinusR

Fraserpatty

macrumors 6502
Mar 5, 2015
343
300
Unlikely by September, but almost certainly by early next year, ahead of the Vision Pro launch. If Apple's massive bet on its future, which relies on Siri, is released into the wild and Siri performs as badly as it does today next to the impressively human-like, conversational ChatGPT and Google Bard, Apple is going to have a serious PR problem. They'd be better off pulling Siri from Vision Pro if it's not ready and adding it in a future update.
I thought this would happen when the Watch came out: far too small a screen to be interacting with in any way other than Siri, I figured. They would make Siri completely awesome so that it could be properly used with the Watch. Boy, was I wrong.
 

WebHead

macrumors 6502
Dec 29, 2004
441
98
I've seen it said that with the rise of AI, "the future is software." What does that mean exactly, and if true, what does it mean for a hardware company like Apple?

(I know Apple are a software company too, but their business model uses software to sell high-margin hardware.)
 

AgentPotato

macrumors newbie
Apr 3, 2017
11
1
Los Gatos, CA
It seems to me that Apple currently (given Apple silicon RAM limitations and GPU support) has no strategy for direct LLM AI training and development on the Mac platform. Do you think this will change or languish? I hope it doesn't go the way of 3D work on the Mac (i.e. never adequately supported with solid hardware and software)...
 

ipedro

macrumors 603
Original poster
Nov 30, 2004
6,239
8,508
Toronto, ON
It seems to me that Apple currently (given Apple silicon RAM limitations and GPU support) has no strategy for direct LLM AI training and development on the Mac platform. Do you think this will change or languish? I hope it doesn't go the way of 3D work on the Mac (i.e. never adequately supported with solid hardware and software)...

The Neural Engine, first deployed in the iPhone X and now a commodity across Apple's chips, can train LLMs.

Think outside the box for a moment. Powerful processing doesn't have to happen at the individual device level. Apple's strategy shifted sometime in the early 2010s, when the company found it necessary to design its own silicon, integrating machine learning into each of its chips via a neural engine along with low-power ultra-wideband chips for nearby networking, and it has begun deploying all of these components even in the smallest, most affordable devices that can't yet use them to their fullest potential.

The more you step back and look at the bigger picture, the more apparent it becomes that Apple is building a mesh neural network. Information relevant to the individual user is processed on device for privacy, but a collective model can be processed in a mesh: first among users who know one another and perhaps share iCloud Photo albums or exchange files through AirDrop, and eventually among total strangers waiting at the bus stop together.

There's already a precedent for this: our iPhones are being used to locate other people's AirTags. There are billions of iPhones, Macs, iPads, and Apple TVs out there (and these chips will no doubt make their way into Apple Watches, AirPods, and Vision Pros). Entire cities will become LLM processors.
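The mesh idea described here is close in spirit to federated learning, where devices train on their own private data and only share model updates for a coordinator to average. A minimal sketch of the federated-averaging (FedAvg) step, using a toy linear model and simulated devices (illustrative only, not Apple's actual system):

```python
import numpy as np

rng = np.random.default_rng(42)

def local_update(weights, data, targets, lr=0.1):
    # One gradient step of linear regression on a device's private data;
    # only the updated weights leave the device, never the data itself.
    preds = data @ weights
    grad = data.T @ (preds - targets) / len(data)
    return weights - lr * grad

def federated_average(device_weights):
    # The coordinator (a server, or in the mesh idea, peers) just
    # averages the locally trained models.
    return np.mean(device_weights, axis=0)

d = 4
true_w = rng.normal(size=d)   # the pattern hidden in everyone's data
global_w = np.zeros(d)

for _ in range(50):                       # communication rounds
    updates = []
    for _ in range(10):                   # ten simulated devices
        X = rng.normal(size=(32, d))      # each device's private data
        y = X @ true_w
        updates.append(local_update(global_w, X, y))
    global_w = federated_average(updates)

print(np.round(global_w - true_w, 3))     # error shrinks toward zero
```

The collective model converges even though no device ever reveals its raw data, which is the property that makes the privacy framing in the post plausible. Training an actual LLM this way is far harder (model size, device heterogeneity, stragglers), which is why I'd treat the "cities as LLM processors" picture as speculative.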
 

ipedro

macrumors 603
Original poster
Nov 30, 2004
6,239
8,508
Toronto, ON
Apple has no doubt built incredibly impressive groundwork for machine learning and AI applications, but it's inexplicable how they've let Siri languish for so long.

That said, we can definitely see that ChatGPT's emergence into public awareness has lit a fire under Tim Cook's åss, to the point that Apple is being uncharacteristically open about stating its intentions for generative AI to buy itself time. To the point I started this thread on: the entire future of Apple could rest on getting this right, and on doing it before conversational computing becomes an expected user interface, as crucial as multi-touch displays.
 

Abazigal

Contributor
Jul 18, 2011
19,637
22,140
Singapore
That said, we can definitely see that ChatGPT's emergence into public awareness has lit a fire under Tim Cook's åss, to the point that Apple is being uncharacteristically open about stating its intentions for generative AI to buy itself time. To the point I started this thread on: the entire future of Apple could rest on getting this right, and on doing it before conversational computing becomes an expected user interface, as crucial as multi-touch displays.
I personally doubt that conversational AI will ever be as efficient as interacting with the display directly. There's a reason why virtually nobody used the Amazon Echo to order stuff online, and why chatbots are treated with disdain: there was so much friction, and so many things could go wrong in navigating a menu via voice control, that it was simply safer and faster to order using your phone.

I find that AI continues to suffer from a design problem. Right now, nearly all of the attention on AI is on the core technology and not on how people are supposed to interact with it (see the recently announced Humane AI Pin as an example).

Third, am I the only one who feels that some of the proposed use cases of AI sound downright dystopian, even as people here are cheering the competition on and ragging on Apple for supposedly being behind in the AI race? For example, the idea that people can simply tell AI what to write or draw (the implication being that humanity may one day not need to create anything, due to the prevalence of AI) does not sound like a future I want to be a part of. Back when I was in school, if I tried to pass off someone else's work as my own, that was considered cheating and I would receive a failing mark there and then. Yet here we are, celebrating AI limiting human creativity.

Simply put, I feel the crux here is not in who has the better underlying tech, but in who can better package it for the mass market. There is a major opportunity for Apple to add its interpretation to how we should engage with AI and use the technology to enhance our creativity, rather than replace it altogether.

A big reason Apple is where it is today is that Apple believes technology is too powerful a force to enjoy without acquired perception and natural intelligence. That philosophy is needed now more than ever.

For what it's worth, I do also hope that Apple gets it right, because I worry more for the future of the human race than I do for the future of Apple if they don't.
 
  • Like
Reactions: Chuckeee and LinusR

AgentPotato

macrumors newbie
Apr 3, 2017
11
1
Los Gatos, CA
The Neural Engine, first deployed in the iPhone X and now a commodity across Apple's chips, can train LLMs.

Think outside the box for a moment. Powerful processing doesn't have to happen at the individual device level. Apple's strategy shifted sometime in the early 2010s, when the company found it necessary to design its own silicon, integrating machine learning into each of its chips via a neural engine along with low-power ultra-wideband chips for nearby networking, and it has begun deploying all of these components even in the smallest, most affordable devices that can't yet use them to their fullest potential.

The more you step back and look at the bigger picture, the more apparent it becomes that Apple is building a mesh neural network. Information relevant to the individual user is processed on device for privacy, but a collective model can be processed in a mesh: first among users who know one another and perhaps share iCloud Photo albums or exchange files through AirDrop, and eventually among total strangers waiting at the bus stop together.

There's already a precedent for this: our iPhones are being used to locate other people's AirTags. There are billions of iPhones, Macs, iPads, and Apple TVs out there (and these chips will no doubt make their way into Apple Watches, AirPods, and Vision Pros). Entire cities will become LLM processors.
This is definitely an interesting idea. I'm just lamenting that we are unable to experiment and develop effectively on the Mac platform.
 

Flowstates

macrumors regular
Aug 5, 2023
227
261
This is definitely an interesting idea. I'm just lamenting that we are unable to experiment and develop effectively on the Mac platform.

I did quite a bit of experimenting with LLMs on M-series Apple silicon through a project called Ollama; the decode speeds are decent.
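For anyone who wants to reproduce this: once the Ollama server is running and a model has been pulled (e.g. `ollama pull llama3`), it serves a local REST API on port 11434. A minimal sketch of calling its `/api/generate` endpoint; the model name here is just an example of one you might have pulled:

```python
import json
import urllib.request

def build_generate_request(model, prompt, host="http://localhost:11434"):
    # Ollama's /api/generate endpoint takes a JSON body; stream=False
    # returns one complete response instead of chunked tokens.
    body = json.dumps({"model": model, "prompt": prompt, "stream": False})
    return urllib.request.Request(
        f"{host}/api/generate",
        data=body.encode("utf-8"),
        headers={"Content-Type": "application/json"},
    )

if __name__ == "__main__":
    # Requires a running Ollama server and a locally pulled model.
    req = build_generate_request("llama3", "Why is the sky blue?")
    try:
        with urllib.request.urlopen(req, timeout=120) as resp:
            print(json.loads(resp.read())["response"])
    except OSError as exc:
        print(f"Is the Ollama server running? ({exc})")
```

Everything runs on-device, which is part of why decode speed on M-series unified memory is the interesting benchmark here.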
 

Bokka

macrumors member
Jan 19, 2018
41
5
Based on the latest news and rumors, is there a concrete chance we'll see something like Google's AI in the Pixel 8 devices, or Bing/Copilot in Windows 11, in next year's releases of iOS 18 and macOS 15?

And is there a chance we'll see it on current devices like the iPhone 14 or MacBook Air M1/M2?
 