this post was submitted on 05 Jun 2025
592 points (97.6% liked)

People Twitter


People tweeting stuff. We allow tweets from anyone.

RULES:

  1. Mark NSFW content.
  2. No doxxing people.
  3. Must be a pic of the tweet or similar. No direct links to the tweet.
  4. No bullying or international politics.
  5. Be excellent to each other.
  6. Provide an archived link to the tweet (or similar) being shown if it's a major figure or a politician.

founded 2 years ago
[–] [email protected] 109 points 2 days ago (7 children)

It only takes a couple of made-up bullshit answers from ChatGPT to learn your lesson and just skip asking it anything altogether.

[–] [email protected] 2 points 1 day ago

I feel like a lot of people in this community underestimate the average person's willingness to trust an AI. Over the past few months, whenever I've seen a coworker search something up, they've never clicked through to a website to check the answer; they always take what the AI summary tells them at face value.

Which is very scary

[–] [email protected] 1 points 2 days ago

That's what people get when they ask me questions too, but they still bother me all the time, so clearly that's not going to work.

[–] [email protected] 20 points 2 days ago (2 children)

But chatgpt always gives such great answers on topics I know nothing at all about!

[–] [email protected] 2 points 1 day ago

Oh yeah, AI can easily replace all the jobs I don't understand too!

[–] [email protected] 3 points 2 days ago

Gell-Mann amnesia. Might have to invent a special name for the AI flavour of it.

[–] [email protected] 2 points 2 days ago

I've only really found it useful when you provide the source information/data in your prompt, e.g. when you want to convert one data format to another, like table data into JSON.

It works very consistently in those types of use cases. Otherwise it's a dice roll.
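
A minimal sketch of that kind of use, assuming the OpenAI Python client; the model name and prompt wording are placeholders, and the json.loads at the end is just a cheap sanity check that the reply actually parses:

```python
# Sketch: include the source data directly in the prompt and ask for a
# format conversion. Model name and prompt are illustrative placeholders.
import json
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

table = """name,role
Alice,admin
Bob,viewer"""

resp = client.chat.completions.create(
    model="gpt-4o-mini",  # placeholder model name
    messages=[{
        "role": "user",
        "content": "Convert this CSV to a JSON array of objects. "
                   "Reply with JSON only:\n" + table,
    }],
)

# Cheap sanity check: fails loudly if the reply isn't valid JSON.
data = json.loads(resp.choices[0].message.content)
print(data)
```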

[–] [email protected] 13 points 2 days ago (3 children)

I was using it to blow through an online math course I'd ultimately decided I didn't need but didn't want to drop. One step of a problem I had it solve involved finding the square root of something; it spat out a number that was kind of close, but functionally unusable. Three times I told it it had made a mistake, and each time it gave a different number.

When I finally gave it the right answer and asked, "are you running a calculation or just making up a number?", it said that if I logged in, it would use real-time calculations. I logged in on a different device and asked the same question; it again made up a number, but when I pointed it out, it corrected itself on the first try. Very janky.

[–] [email protected] 4 points 2 days ago

So it forced you to ask it many times? Now imagine that you paid for it each time. For the creator, then, mission fucking accomplished.

[–] [email protected] 11 points 2 days ago (1 children)

ChatGPT doesn't actually do calculations. It can generate code that will calculate the answer, or provide a formula, but it cannot do math itself.
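
Which is the whole trick: let the model set up the problem, but get the digits from a real calculation. A trivial local sketch in plain Python (no API involved), using the square root from the parent comment as the example:

```python
# The reliable division of labor: the model can tell you *how* to compute
# a square root, but the actual number should come from a deterministic
# calculation run locally, not from its token predictions.
import math

x = 54321
print(math.sqrt(x))   # ~233.0686, computed deterministically
print(math.isqrt(x))  # 233, the integer square root
```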

[–] [email protected] 2 points 2 days ago

It's just like me fr fr

[–] [email protected] 3 points 2 days ago

You need multi-shot prompting when it comes to math. Either the motherfucker gets it right, or in a lot of cases you won't be able to course-correct it. Once a token is in the context, it's in the context and you're fucked.

Alternatively, you could edit the context, correct the parameters, and then run it again (rough sketch below).
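
A rough sketch of what editing the context looks like, assuming the OpenAI Python client; the model name, the prompts, and the ask helper are all illustrative, not anyone's actual setup:

```python
# Sketch: instead of appending "that's wrong, try again" to a context that
# already contains the bad answer, rebuild the message list from scratch
# and rerun. Assumes the OpenAI Python client; model name is a placeholder.
from openai import OpenAI

client = OpenAI()

def ask(messages):
    resp = client.chat.completions.create(
        model="gpt-4o-mini",  # placeholder model name
        messages=messages,
    )
    return resp.choices[0].message.content

# First attempt: the wrong answer now lives in `history`.
history = [{"role": "user", "content": "What is the square root of 54321?"}]
bad_answer = ask(history)

# Don't do this: the bad tokens stay in context and keep steering the model.
# history += [{"role": "assistant", "content": bad_answer},
#             {"role": "user", "content": "That's wrong, try again."}]

# Instead, edit the context: drop the failed exchange entirely and rerun
# with a corrected prompt, here nudging the model to show its steps.
retry = [{"role": "user",
          "content": "Compute the square root of 54321. Show your steps."}]
better_answer = ask(retry)
```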

On the other side of the shit aisle:

Shoutout to my man Mistral Small 24B, who is so insecure it will talk itself out of correct answers. It's so much like me in not having any self-worth or confidence.

[–] [email protected] 33 points 2 days ago (3 children)

I stopped using it when I asked it who I was: it said I was a prolific author, then proceeded to name various books I absolutely did not write.

[–] [email protected] 6 points 2 days ago (1 children)

Why the fuck would it know who you are?

[–] [email protected] 13 points 2 days ago (1 children)

If you have an account, you can tell it things about yourself. I used my boss's account for a project at work (felt gross). I made the mistake of saying "good morning" to it one day, and it proceeded to ask me if I was going to do (activities related to my boss's personal life - and the details were accurate). I was thinking, "why does he tell it so much about himself?"

[–] [email protected] 2 points 2 days ago

So it's working as intended.

[–] [email protected] 3 points 2 days ago

And I'm apparently a famous TikToker and YouTuber.

[–] [email protected] 14 points 2 days ago

I just read "The Autobiography of QueenHawlSera"!
Have I been duped?