
Frantisekj

macrumors 6502a
Mar 9, 2017
559
376
Deep inside Europe :-)
Perhaps it's my age, or perhaps it's my life experience, but I don't want LLM-AI in my life. It's too dependent on the perspective of its trainers. But I will let it start my coffeemaker in the morning.
From my short experience, AI is now mainly trained by curators to give "curated" answers and keep us in the mental prison we are in now. Twice I managed, after long fiddling, to get an unbiased answer, but then I was logged off immediately. That has never happened on any other occasion.
 

ipaqrat

macrumors 6502
Mar 28, 2017
302
325
From my short experience, AI is now mainly trained by curators to give "curated" answers and keep us in the mental prison we are in now. Twice I managed, after long fiddling, to get an unbiased answer, but then I was logged off immediately. That has never happened on any other occasion.
Hugo Weaving was not amused.
 
  • Like
Reactions: AlanMarron

ApplesAreSweet&Sour

macrumors 68000
Sep 18, 2018
1,942
3,551
Apple is so clever to avoid adding more ram 😉
There’s no way Apple is giving this “Apple GPT” to devices older than the latest and greatest (whenever AI gets implemented in iOS).

And at that time, I don’t think any but the lowest end Apple devices will have less than 6GB RAM.

Holding out on RAM is one of the best ways to ensure we’ll keep upgrading even though most other specs on iPhones are more than capable enough to run the latest apps and OS.

Planned obsolescence but without having to tinker with the OS to cripple performance on older devices.
 
  • Like
Reactions: Razorpit

obviouslogic

macrumors 6502
Mar 23, 2022
266
423
Maybe they should just double the RAM? Adding another 8 GB should be like $20.

I know, why not both.

Nothing to do with cost… More RAM means more power usage. Mobile devices are optimized for efficiency. In Apple’s case, they take those optimizations to the next level in all areas of the device to maximize efficiency, while still taking performance into consideration when and where needed. Their SoC designs are a testament to their dedication to this pursuit.

Simply adding more RAM may be the easiest and most obvious method of handling the issue, but doing so limits the targets where they may want to add this ability. What if their end goal is to add these abilities to, say, a watch? You can’t just cram X amount of RAM into every device and call it a day. Working through perceived limitations is what enables these things to move into areas where it was thought they never could. Most advancements are made not by throwing “more” into the equation to work around limitations, but by thinking of ways to overcome them using the finite resources given.

Unlike most other companies Apple takes the difficult route of limiting their resources when designing and developing systems. This forces them to think of novel ways to approach and overcome issues. This has enabled them to create a platform and architecture that scales from a watch all the way to a desktop computer.
 

MilaM

macrumors 6502a
Nov 7, 2017
727
1,577
Unlike most other companies Apple takes the difficult route of limiting their resources when designing and developing systems.
This is a sound design decision, until it isn't anymore. If LLMs and similar technologies go mainstream, there will be no way around increasing RAM. The other way to solve this is to offload processing to the cloud, but I doubt that would be desirable long term. Creating the capability in the cloud will be very expensive, and someone will have to pay for it. Most SoCs on Apple devices are more than capable of processing large models (watches and smart speakers are probably an exception). RAM size is the biggest bottleneck right now, in my opinion, on iPhones, iPads, and all regular Macs.
 

Apple_Robert

Contributor
Sep 21, 2012
34,541
50,153
In the middle of several books.
What most people don't understand about Siri (or even Apple design) is that Siri doesn't give you what you want. She gives you what you need. She's not so much an assistant as she is a wise omnipotent technological resource. Next time she doesn't answer the way you expect her to, think about why, go deeper than the surface of what she said. You will likely find something deeper than you could have ever imagined.
Sounds like a girlfriend instead of Siri.
 
  • Haha
Reactions: parameter

amartinez1660

macrumors 68000
Sep 22, 2014
1,591
1,628
This is very neat research, but I think the suggestion from the headline, that this will bring LLMs to iPhones, is a stretch. The test platform was an M1 Max with an unspecified amount of DRAM. My educated guess is that it had at least 16GB of DRAM, likely even more. This is not iPhone territory.

I said it before and will repeat it now. Making RAM a luxury with extremely overpriced upgrades will bite Apple, or rather us customers, in the ass.
True that on the price mark up, but in any case, it is my understanding that RAM as of today eats battery.
 

Razorpit

macrumors 65816
Feb 2, 2021
1,109
2,352
That doesn't work for you? I ask Siri to "call my wife" regularly. I did it earlier this week.
Siri says I need to add that information to my contact, but I have. In years past it worked flawlessly. About 2-3 years ago it started acting wonky, then it stopped completely. I've even asked who my wife is; it shows the contact, then later, when I go to use it in the car, it forgets. Might be some weird CarPlay thing.

What do you mean? I can still do that with mine. I call your wife all the time.
😆
 

Contact_Feanor

macrumors 6502
Jun 7, 2017
252
762
Belgium
That’s an easy fix,
“Hey Siri, Tom is my husband.”
🤣
I've tried that multiple times. I've tried deleting and re-adding his contact information. I've tried manually adding the "husband"/spouse tag. I've told Siri he is my husband; I've told her his name when she said she didn't know. I've tried everything. I'd be happy if there were an easy fix, but as indicated here in the thread, there's a bug preventing it from working.
 
  • Love
Reactions: Razorpit

mrBeach

macrumors newbie
Jun 16, 2023
16
8
🤣
I've tried that multiple times. I've tried deleting and re-adding his contact information. I've tried manually adding the "husband"/spouse tag. I've told Siri he is my husband; I've told her his name when she said she didn't know. I've tried everything. I'd be happy if there were an easy fix, but as indicated here in the thread, there's a bug preventing it from working.
Sorry, didn’t know you had tried that. This might help: it suggests the problem might occur due to an Exchange account, and has a solution where you manually add relationships to your own contact card instead.
 
  • Like
Reactions: Razorpit

dgdosen

macrumors 68030
Dec 13, 2003
2,769
1,407
Seattle
I hope competition throws cheap GBs of memory in their phones - thus better enabling LLMs on other platforms. Force Apple to bump up memory everywhere.
 
  • Like
Reactions: Razorpit

Razorpit

macrumors 65816
Feb 2, 2021
1,109
2,352
That’s an easy fix,
“Hey Siri, Tom is my husband.”
I wish it were that easy. 😄

🤣
I've tried that multiple times. I've tried deleting and re-adding his contact information. I've tried manually adding the "husband"/spouse tag. I've told Siri he is my husband; I've told her his name when she said she didn't know. I've tried everything. I'd be happy if there were an easy fix, but as indicated here in the thread, there's a bug preventing it from working.
Yup, same here, many times as a matter of fact.

Sorry, didn’t know you had tried that. This might help: it suggests the problem might occur due to an Exchange account, and has a solution where you manually add relationships to your own contact card instead.
That’s a great idea, never thought of that. However it suddenly started working for me again! I haven’t tried “a fix” in a while, resorting to “Call XXXX on mobile.” Of course every once in a while a “Call my wife…” would slip out. One time it finally worked again, about 3 weeks ago, and has continued ever since!
 

Fraserpatty

macrumors 6502
Mar 5, 2015
355
322
Has anybody else had trouble, when sending text messages with Siri, getting her to read the message back to you before sending? I know there is a way to do that if that's what you want, but Siri has always been untrustworthy enough that I would never send a message without confirming it first. But now, when I tell her to read it back to me, she stops mid-message. She also stops halfway through reading my new notifications. I wonder what gives and how to fix it.
 

Razorpit

macrumors 65816
Feb 2, 2021
1,109
2,352
Has anybody else had trouble, when sending text messages with Siri, getting her to read the message back to you before sending? I know there is a way to do that if that's what you want, but Siri has always been untrustworthy enough that I would never send a message without confirming it first. But now, when I tell her to read it back to me, she stops mid-message. She also stops halfway through reading my new notifications. I wonder what gives and how to fix it.
Siri is terrible with CarPlay right now. She’ll ask if I want to reply to a text and will send “No” when I don’t. If I don't say anything, then she’ll want to send a text with just the person’s name.
 

nwiggin

macrumors newbie
Apr 30, 2024
1
1
The combination of these methods allows AI models to run up to twice the size of the iPhone's available memory, according to the paper.
...
Apple is reportedly developing its own generative AI model called "Ajax". Designed to rival the likes of OpenAI's GPT-3 and GPT-4, Ajax operates on 200 billion parameters
Assuming 8-bit quantization, 200B parameters would need roughly 200GB of RAM (and still about 100GB even at 4-bit). Will the next iPhone have 128GB of RAM??
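The arithmetic behind that estimate is just parameter count times bytes per parameter; a minimal sketch (weights only, ignoring activations, KV cache, and runtime overhead, and using the 200B figure cited for the rumored "Ajax" model):

```python
# Back-of-the-envelope RAM footprint for LLM weights at common
# quantization bit-widths. Weights only; activations, KV cache,
# and runtime overhead add more on top.

def model_ram_gb(n_params: float, bits_per_param: int) -> float:
    """Approximate weight memory in GB (1 GB = 1e9 bytes)."""
    return n_params * bits_per_param / 8 / 1e9

AJAX_PARAMS = 200e9  # the 200B parameter figure cited in the article

for bits in (16, 8, 4):
    print(f"{bits}-bit: {model_ram_gb(AJAX_PARAMS, bits):,.0f} GB")
# 16-bit: 400 GB, 8-bit: 200 GB, 4-bit: 100 GB
```

Even with the paper's flash-offloading trick of running models up to twice the available memory, the numbers stay far outside phone territory.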
 
  • Like
Reactions: VulchR

purplerainpurplerain

macrumors 6502a
Dec 4, 2022
567
1,123
Assuming 8-bit quantization, 200B parameters would need roughly 200GB of RAM (and still about 100GB even at 4-bit). Will the next iPhone have 128GB of RAM??

You can fill the RAM, but your processor and battery are still being inundated by an LLM. An LLM has no place on a phone. It’s bad enough in a MacBook Pro.

As an example, I have run all the open-source LLMs on an M3 Max. They chew through the battery, with the worst offender being Command-R, which can deplete the whole battery in less than an hour. Even a Blender render is less stressful.

On a phone you need specialized language models that don’t need to ‘know it all’. A small size model can do many things without needing to ‘know it all’.

And you don’t need an LLM to switch to dark mode or set an alarm. That’s just voice commands which have been around since OS 9.1 classic.
 
  • Like
Reactions: nwiggin

seek3r

macrumors 68020
Aug 16, 2010
2,303
3,291
You can fill the RAM, but your processor and battery are still being inundated by an LLM. An LLM has no place on a phone. It’s bad enough in a MacBook Pro.
Not that I think we really need massive LLMs on our phones, but that's what beefing up the neural processor (reportedly a major focus of the M4) should be good for. Running such things on a part of the silicon engineered for them is a lot more power efficient than running them on the general-purpose part of the CPU.
 