this post was submitted on 17 Sep 2024
123 points (94.9% liked)

science


From the article:

This chatbot experiment reveals that, contrary to popular belief, many conspiracy thinkers aren't 'too far gone' to reconsider their convictions and change their minds.

[–] [email protected] 6 points 2 months ago (1 children)

"Great! Billy doesn't believe 9/11 was an inside job, but now the AI made him believe Bush was actually president in 1942 and that Obama was never president."

In all seriousness, I think an "unbiased" AI might be one of the few ways to reach people about this stuff, because any Joe Schmoe who tries to confront a conspiracy is just dismissed as "believing what they want you to believe!"

[–] [email protected] 5 points 1 month ago (1 children)

Given the inherent biases in any LLM's training data, the hallucination issue you've brought up, and the fact that running an LLM at scale is cost-prohibitive for anyone besides private-state partnerships, do you think this approach will allay conspiracists' valid concerns about the centralization of information access, à la the decline in quality of Google search results over the past decade and a half?

[–] [email protected] 3 points 1 month ago

I think those people might not be swayed, but I was once a "conspiracy nut," had a circle of friends who were as well, and I know that for a lot of those kinds of people, YouTube is the majority of the "research" they do. For those people, I think this could work as long as it's not hallucinating and can point to proper sources.