> GPT-4 has 1.73 *trillion* parameters. That would be ~10x the size of Apple's LLM.

And you don't need that for everything. LLMs with 7 billion parameters are about 4 GB and perform fine when tuned.
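The sizes being thrown around here follow from simple arithmetic: parameter count times bits per parameter. A rough sketch (function name and figures are mine; real memory use also includes activations, KV cache, and runtime overhead):

```python
# Rough weight-memory estimate: parameters x bits-per-parameter, in GB.
# Ignores activations, KV cache, and runtime overhead.
def weight_gb(params_billions, bits_per_param):
    return params_billions * 1e9 * bits_per_param / 8 / 1e9

print(weight_gb(7, 4))      # 3.5    -> a 4-bit 7B model is roughly the "4 GB" above
print(weight_gb(7, 16))     # 14.0   -> the same model unquantized at fp16
print(weight_gb(1730, 16))  # 3460.0 -> a 1.73T-parameter model at fp16
```

The gap between 14 GB and 3.5 GB for the same 7B model is why quantization matters so much on memory-constrained devices.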
> Perhaps it's my age, or perhaps it's my life experience, but I don't want LLM-AI in my life. It's too dependent on the perspective of its trainers. But I will let it start my coffeemaker in the morning.

From my short experience, AI is now mainly trained by curators to give "curated" answers and keep us in the mental prison we're in now. Twice, after long fiddling, I managed to get an unbiased answer, but then I was logged off immediately. That has never happened on any other occasion.
> From my short experience, AI is now mainly trained by curators to give "curated" answers and keep us in the mental prison we're in now. Twice, after long fiddling, I managed to get an unbiased answer, but then I was logged off immediately. That has never happened on any other occasion.

Hugo Weaving was not amused.
> Apple is so clever to avoid adding more RAM 😉

There's no way Apple is giving this "Apple GPT" to devices older than the latest and greatest (whenever AI gets implemented in iOS).
Maybe they should just double the RAM? Adding another 8 GB should be like $20.
I know. Why not both?
> Apple is so clever to avoid adding more RAM 😉

Yep, it's crazy how they'll do anything but add more.
> Yep, it's crazy how they'll do anything but add more.

It's not crazy, it's just good for the bottom line and for bonuses.
> Unlike most other companies, Apple takes the difficult route of limiting their resources when designing and developing systems.

This is a sound design decision, until it isn't anymore. If LLMs and similar technologies go mainstream, there will be no way around increasing RAM. The other way to solve this is to offload processing to the cloud, but I doubt that would be desirable long term: building that capability in the cloud will be very expensive, and someone will have to pay for it. Most SoCs in Apple devices are more than capable of running large models (watches and smart speakers are probably an exception). In my opinion, RAM size is the biggest bottleneck right now on iPhones, iPads, and all regular Macs.
> What most people don't understand about Siri (or even Apple design) is that Siri doesn't give you what you want. She gives you what you need. She's not so much an assistant as she is a wise omnipotent technological resource. Next time she doesn't answer the way you expect her to, think about why; go deeper than the surface of what she said. You will likely find something deeper than you could have ever imagined.

Sounds like a girlfriend instead of Siri.
> This is very neat research, but I think the suggestion from the headline, that this will bring LLMs to iPhones, is a stretch. The test platform was an M1 Max with an unspecified amount of DRAM. My educated guess is that it had at least 16 GB of DRAM, likely even more. This is not iPhone territory.

True that on the price markup, but in any case, it's my understanding that RAM, as of today, eats battery.
I've said it before and will repeat it now: making RAM a luxury with extremely overpriced upgrades will bite Apple, or rather us customers, in the ass.
Yep it’s crazy how they’ll do anything but add more
> That doesn't work for you? I ask Siri to "call my wife" regularly. I did it earlier this week.

Siri says I need to add that information to my contact, but I have. In years past it worked flawlessly. About 2-3 years ago it started acting wonky; then it stopped completely. I've even asked who my wife is: it shows the contact, then later, when I go to use it in the car, it forgets. Might be some weird CarPlay thing.
> What do you mean? I can still do that with mine. I call your wife all the time.

😆
> With my husband it stopped working a couple of years ago as well. She just claims not to know who my husband is, even though it's indicated in the Contacts app.

That's an easy fix:

"Hey Siri, Tom is my husband."

> That's an easy fix:
> "Hey Siri, Tom is my husband."

🤣
> These techniques should be helpful for all AI and AI platforms, not just on phones.

And hopefully not just on the newest iPhones and devices. We have all waited long enough.
> 🤣
>
> I've tried that multiple times. I've tried deleting and re-adding his contact information. I've tried manually adding the "husband"/spouse tag. I've told Siri he is my husband; I told her his name when she said she doesn't know. I've tried everything. I'd be happy if there were an easy fix, but as indicated here in the thread: there's a bug preventing it from working.

Sorry, didn't know you had tried that. This might help: it suggests the problem might occur due to an Exchange account, and has a solution where you manually add relationships to your own contact card instead.
> That's an easy fix:
> "Hey Siri, Tom is my husband."

I wish it were that easy. 😄
> 🤣
>
> I've tried that multiple times. I've tried deleting and re-adding his contact information. I've tried manually adding the "husband"/spouse tag. I've told Siri he is my husband; I told her his name when she said she doesn't know. I've tried everything. I'd be happy if there were an easy fix, but as indicated here in the thread: there's a bug preventing it from working.

Yup, same here, many times as a matter of fact.
> Sorry, didn't know you had tried that. This might help: it suggests the problem might occur due to an Exchange account, and has a solution where you manually add relationships to your own contact card instead:
>
> Siri won't add relationships? Here's a tip.
> In a recent iMore article, we were told how to let Siri know about your relationships to your contacts. That way, you could tell Siri to call your mom or wife without having to say their names. Unfortunately, some of us have found that Siri will not remember our contacts' relationships to us...
> forums.imore.com

That's a great idea; I never thought of that. However, it suddenly started working for me again! I haven't tried "a fix" in a while, resorting to "Call XXXX on mobile." Of course, every once in a while a "Call my wife…" would slip out. About three weeks ago it finally worked again, and it has continued ever since!
> Has anybody else had trouble, when sending text messages with Siri, getting her to read the message back to you before sending? I know there is a way to do that if that's what you want, but Siri has always been untrustworthy enough that I would never send a message without confirming it first. But now, when I tell her to read it back to me, she stops mid-message. She also stops halfway through reading my new notifications. I wonder what gives and how to fix it.

Siri is terrible with CarPlay right now. She'll ask if I want to reply to a text and will send "No" when I don't. If I don't say anything, then she'll want to send a text with just the person's name.
> The combination of these methods allows AI models to run up to twice the size of the iPhone's available memory, according to the paper.
> ...
> Apple is reportedly developing its own generative AI model called "Ajax". Designed to rival the likes of OpenAI's GPT-3 and GPT-4, Ajax operates on 200 billion parameters.

Assuming 8-bit quantization, 200B parameters would need about 200 GB of RAM; even at 4-bit it's over 100 GB. Will the next iPhone have 128 GB of RAM?
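On the "twice the available memory" claim: the core idea is to keep the weights in flash storage and pull into RAM only the slices needed for the current step. A toy illustration of that on-demand loading using plain file seeks (my own sketch with made-up sizes, not the paper's actual method, which also predicts sparsity and bundles rows to cut flash reads):

```python
# Toy sketch (not Apple's implementation) of the flash-offloading idea:
# keep the full weight matrix on disk and load only needed rows into RAM.
import struct, tempfile, os

ROWS, COLS = 1000, 64          # a "weight matrix" far bigger than our RAM budget
ROW_BYTES = COLS * 4           # float32 rows

# 1) Write the full matrix to "flash" (a temp file); row r is filled with r.
path = os.path.join(tempfile.mkdtemp(), "weights.bin")
with open(path, "wb") as f:
    for r in range(ROWS):
        f.write(struct.pack(f"{COLS}f", *([float(r)] * COLS)))

# 2) At inference time, load only the rows for the few "active" neurons.
def load_rows(active):
    rows = {}
    with open(path, "rb") as f:
        for r in active:
            f.seek(r * ROW_BYTES)  # jump straight to that row on disk
            rows[r] = struct.unpack(f"{COLS}f", f.read(ROW_BYTES))
    return rows

active = [3, 500, 999]             # e.g. rows a sparsity predictor flagged as needed
weights = load_rows(active)
print(len(weights), weights[500][0])  # 3 500.0
```

RAM here only ever holds 3 of the 1000 rows, which is the sense in which the model can be "bigger than memory"; the trade-off is that every cache miss costs a (slow) flash read.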
> You can fill the RAM, but your processor and battery are still being inundated by an LLM. An LLM has no place on a phone. It's bad enough in a MacBook Pro.

Not that I think we really need massive LLMs on our phones, but that's what beefing up the neural processor, reportedly a major focus of the M4, should be good for. Running such things on a part of the silicon engineered for it is a lot more power-efficient than on the general part of the CPU.