this post was submitted on 17 Sep 2024
123 points (94.9% liked)

science


From the article:

This chatbot experiment reveals that, contrary to popular belief, many conspiracy thinkers aren't 'too far gone' to reconsider their convictions and change their minds.

[–] [email protected] 5 points 1 month ago (2 children)

If they're gullible enough to be suckered into it, they can similarly be suckered out of it - but clearly the effect would not be permanent.

[–] [email protected] 1 points 1 month ago (1 children)

I've always believed the adage that you can't logic someone out of a position they didn't logic themselves into. It protects my peace.

[–] [email protected] 1 points 1 month ago

Logic isn't the only way to persuade; in fact, all the evidence seems to show it works on very few people.

Everyone discounts sincere emotional arguments, but frankly that's all I've ever seen work on conspiracy-heads.

[–] [email protected] 2 points 1 month ago (2 children)

That doesn’t square with the “if you didn’t reason your way into a belief, you can’t reason your way out of it” line. Considering religious fervor, I’m more inclined to believe that line than yours.

[–] [email protected] 5 points 1 month ago

No one said the AI used "reason" to talk people out of a conspiracy theory. In fact, I'd assume that's incredibly unlikely, since AI in general doesn't reason.

[–] [email protected] 2 points 1 month ago

Why? It works as a corollary - there's no logic involved in any of the stages described.