this post was submitted on 25 Mar 2024
328 points (97.1% liked)
World News
you are viewing a single comment's thread
view the rest of the comments
Or it's ChatGPT
Fuck, you're probably right
Honestly, I think ChatGPT wouldn't make that particular mistake. Sounding proper is its primary purpose. Maybe a cheap knockoff.
ChatGPT just guesses the next word. Stop anthropomorphizing it.
Lol making a mistake isn't unique to humans. Machines make mistakes.
Congratulations for knowing that a LLM isn't the same as a human though, I guess!
I knew someone would say that.
It guesses the next word... based on examples created by humans. It's not just making shit up out of thin air.
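To make the "guessing the next word from human examples" point concrete, here's a toy sketch in plain Python. It's just counting which word follows which in some example text and picking a likely continuation, nothing remotely like the actual transformer behind ChatGPT, but the basic idea of predicting the next word from patterns in human-written data is the same:

```python
import random
from collections import Counter, defaultdict

# Toy next-word predictor: count which word follows which in human-written text,
# then "guess" a continuation weighted by how often humans wrote it.
corpus = (
    "the cat sat on the mat . the dog sat on the rug . "
    "the cat chased the dog ."
).split()

# Build bigram counts from the example text.
following = defaultdict(Counter)
for current, nxt in zip(corpus, corpus[1:]):
    following[current][nxt] += 1

def guess_next(word):
    """Pick the next word in proportion to how often it followed `word` in the examples."""
    options = following.get(word)
    if not options:
        return "."
    words, counts = zip(*options.items())
    return random.choices(words, weights=counts, k=1)[0]

# Generate a short continuation, one guessed word at a time.
word, output = "the", ["the"]
for _ in range(8):
    word = guess_next(word)
    output.append(word)
print(" ".join(output))
```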
Humans are just electrified meat. Stop anthropomorphizing it.
🙄
Another example of why I hate techies
Found Andrew Ure's account
Yes, it does that because it was designed to sound convincing, and sounding proper is a good way to accomplish that. That is the primary goal behind the design of all chatbots, and it's what the Turing Test was intended to gauge. Anyone who makes a chatbot wants it to sound good first and foremost.
TalkFOS