this post was submitted on 06 Aug 2024
78 points (96.4% liked)

Apple


There are a couple of community rules in addition to the main instance rules.

All posts must be about Apple

Anything goes as long as it’s about Apple. News about other companies and devices is allowed if it directly relates to Apple.

No NSFW content

While lemmy.zip allows NSFW content this community is intended to be a place for all to feel welcome. Any NSFW content will be removed and the user banned.

If you have any comments or suggestions please message one of the moderators.

founded 1 year ago

Long lists of instructions show how Apple is trying to navigate AI pitfalls.

[–] [email protected] 65 points 3 months ago (16 children)

You can't tell an LLM not to hallucinate; that would require it to actually understand what it's saying. "Hallucinations" are just LLMs bullshitting, because that's what they do. LLMs aren't actually intelligent; they're just using statistics to remix existing sentences.
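The "using statistics to remix existing sentences" point can be illustrated in extreme miniature with a bigram Markov chain. Real LLMs use a learned neural network over tokens rather than a lookup table, but the core move is the same: sample a statistically plausible next word with no model of truth. The corpus and names here are made up for illustration:

```python
import random
from collections import defaultdict

# Toy corpus (hypothetical); a real model trains on billions of tokens.
corpus = "the cat sat on the mat the dog sat on the rug".split()

# Build a bigram table: for each word, record which words followed it.
follows = defaultdict(list)
for prev, nxt in zip(corpus, corpus[1:]):
    follows[prev].append(nxt)

def generate(start, length, seed=0):
    """Remix the corpus by repeatedly sampling a likely next word.

    Nothing here checks whether the output is true or even coherent:
    it only reflects the statistics of what followed what.
    """
    rng = random.Random(seed)
    word, out = start, [start]
    for _ in range(length - 1):
        choices = follows.get(word)
        if not choices:
            break  # dead end: no observed continuation
        word = rng.choice(choices)
        out.append(word)
    return " ".join(out)

print(generate("the", 8))  # e.g. a plausible-sounding remix of the corpus
```

The sketch makes the commenter's point concrete: the generator can emit fluent sequences like "the dog sat on the mat" that never appeared in the corpus, which is all a "hallucination" is at this level of description.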

[–] [email protected] 28 points 3 months ago (10 children)

I wish people would say machine learning or LLMs more frequently instead of AI being the buzzword. It really irks me. IT'S NOT ACCURATE! THAT'S NOT WHAT IT IS! STOP DEMEANING TRUE MACHINE CONSCIOUSNESS!

[–] [email protected] 3 points 3 months ago

Gotta let the people know they're getting scammed with false advertising
