Apple Intelligence features reportedly won’t be ready for iOS 18’s launch this fall
(arstechnica.com)
Welcome to the largest Apple community on Lemmy. This is the place where we talk about everything Apple, from iOS to the exciting upcoming Apple Vision Pro. Feel free to join the discussion!
Apple Hardware
Apple TV
Apple Watch
iPad
iPhone
Mac
Vintage Apple
Apple Software
iOS
iPadOS
macOS
tvOS
watchOS
Shortcuts
Xcode
Community banner courtesy of u/Antsomnia.
I’m willing to bet they rushed that announcement out the door with nothing remotely close to ready, in order to get on the bandwagon and avoid shareholder ire. Apple is usually above such silliness. I am disappoint.
My guess is they thought they were 99% done but that the 1% (“just gotta deal with these edge case hallucinations”) ended up requiring a lot more work (maybe even an entirely new sub-system or a wholly different approach) than anticipated.
I know I suggested the issue might be hallucinations above, but what I’m genuinely curious about is how they plan to have acceptable performance without losing half or more of your usable RAM to the model.
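For a rough sense of scale, here’s a back-of-envelope sketch. The ~3B parameter count and the quantization levels are my assumptions for illustration, not Apple’s published numbers:

```swift
import Foundation

// Back-of-envelope estimate of how much RAM an on-device language model
// might need. Purely illustrative: parameter count and bit widths are
// assumptions, not Apple's published figures.
func estimatedModelFootprintGiB(parameters: Double, bitsPerWeight: Double) -> Double {
    let bytes = parameters * bitsPerWeight / 8.0
    return bytes / 1_073_741_824.0 // bytes -> GiB
}

let params = 3e9 // assume a ~3B parameter on-device model
print(String(format: "fp16:  %.1f GiB", estimatedModelFootprintGiB(parameters: params, bitsPerWeight: 16)))
print(String(format: "4-bit: %.1f GiB", estimatedModelFootprintGiB(parameters: params, bitsPerWeight: 4)))
// fp16:  ~5.6 GiB, 4-bit: ~1.4 GiB -- and that's before the KV cache and
// activations, which is why 8 GB devices look like the practical floor.
```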
Will it run locally? I just assumed it would be run on Apple servers in some way.
They framed it as most of the stuff running on device, while in some cases, image generation I suppose, it will use the "very secure" Apple servers. Additionally, Apple Intelligence can decide it would make sense to ask ChatGPT on OpenAI's servers, and it gives you the option to do so.
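Purely as illustration, the routing they described would look something like this. The types and names here are made up; this is not Apple's actual API:

```swift
// Hypothetical sketch of the on-device / server / ChatGPT routing described
// above. Everything here is invented for illustration.
enum ModelBackend {
    case onDevice                 // default for most requests
    case privateCloudCompute      // heavier tasks on Apple's servers
    case chatGPT                  // third party, only with explicit consent
}

struct Request {
    let fitsOnDevice: Bool
    let benefitsFromWorldKnowledge: Bool
}

func route(_ request: Request, userApprovedChatGPT: () -> Bool) -> ModelBackend {
    if request.benefitsFromWorldKnowledge, userApprovedChatGPT() {
        return .chatGPT           // user is asked before anything is handed off
    }
    return request.fitsOnDevice ? .onDevice : .privateCloudCompute
}
```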
I thought they had confirmed at least some of the image generation stuff happens locally. I’m in the Apple Intelligence beta now, went offline, and played around with Siri, and lots of stuff worked. It’s not really doing much new right now, but the speed, and the quality of understanding when you stumble over words, are way better.
Nice 😃
Locally, but there’s also an option to use OpenAI’s API, I believe.
OK, then I’m also curious how they would solve that.