this post was submitted on 16 Aug 2024

TechTakes

1491 readers

Big brain tech dude got yet another clueless take over at HackerNews etc? Here's the place to vent. Orange site, VC foolishness, all welcome.

This is not debate club. Unless it’s amusing debate.

For actually-good tech, you want our NotAwfulTech community

founded 2 years ago

Looks like it might actually have happened.

top 16 comments
[–] [email protected] 0 points 4 months ago

Maybe I'm old fashioned but,

I still start by asking someone who knows about the thing what books they might recommend. And I know mushrooms are especially problematic, so I go look for um, active communities of people who aren't dead from eating the wrong mushrooms.

Is it possible that we're looking too far away from accountable sources when we route our knowledge searches through noisy corporate slop?

[–] [email protected] 0 points 4 months ago (2 children)

I refuse to lend any credibility at all to some anonymous controversial reddit post made from a throwaway. There are way too many of those nowadays and I'd bet a toonie a large proportion of them are either faked to drive engagement or for shits and giggles.

Besides, this is the internet... Didn't anybody get the lesson with the bonsai kittens?!

[–] [email protected] 0 points 4 months ago (1 children)
[–] [email protected] 0 points 4 months ago (1 children)

Yeah sorry about that, a toonie is 2 Canadian dollars.

[–] [email protected] 0 points 4 months ago

not a problem, asked because I was curious :)

[–] [email protected] 0 points 4 months ago

Throwaways are the standard for the legal advice subreddits, and some are clearly fiction. This one sounds legit

[–] [email protected] 0 points 4 months ago* (last edited 4 months ago)

lol@OP not mentioning it's from Amazon. I bet posting something negative about a potential reddit business partner would get their account nuked.

[–] [email protected] 0 points 4 months ago (1 children)

If this turns out to be real, I suspect it's gonna be a major shitshow - not only for the publisher, but for the AI industry as a whole.

For the publisher, they're gonna be lambasted for endangering people's lives for a quick AI-printed buck.

For AI, it's gonna be yet another indictment of an industry that's seen fit to put technology, profits, basically everything over human lives - whether in the "AI Safety" criti-hype which implicitly suggests culpability for bringing about an apocalypse straight out of sci-fi, or in the myriad ways they are making the world worse right now.

[–] [email protected] 0 points 4 months ago (1 children)

If it turns out to be real. I have my doubts.

[–] [email protected] 0 points 4 months ago (1 children)

Update: Whilst the story's veracity remains unconfirmed as of this writing, it has gone on to become a shitshow for the AI industry anyways - turns out the story got posted on Twitter and proceeded to go viral.

Assuming it's fabricated, I suspect OP took their cues from this 404 Media report made a year ago, which warned about the flood of ChatGPT-generated mycology books and their potentially fatal effects.

As for people believing it, I'm not shocked - the AI bubble has caused widespread harm to basically every aspect of society, and the AI industry is viewed (rightfully so, I'd say) as having willingly caused said harm by developing and releasing AI systems, and as utterly unrepentant about it.

Additionally, those who use AI are viewed (once again, rightfully so) as unrepentant scumbags of the highest order, entirely willing to defraud and hurt others to make a quick buck.

With both those in mind, I wouldn't blame anyone for immediately believing it.

[–] [email protected] 0 points 4 months ago (1 children)

I usually dislike the whole line of thinking of "Well, it might not be true, but it tells you something that you believed it."

But, the world in which AI succeeds is the world where every book published is a fake field guide to mushrooms, or a recipe book for shaving cream. And it's like... I dunno, after 4 years of happily proclaiming that this is the thing we're going to sell, why have these guys never considered that fraud is bad, actually. Is fully automated luxury gay space fraud really so enticing?

[–] [email protected] 0 points 4 months ago

I dunno, after 4 years of happily proclaiming that this is the thing we’re going to sell, why have these guys never considered that fraud is bad, actually. Is fully automated luxury gay space fraud really so enticing?

It is if you expect to make an absolute crapload of money off of it and have absolutely zero soul. And, well, it's the AI industry - anyone with a soul probably left a couple years ago.

[–] [email protected] 0 points 4 months ago (2 children)

My wife just received an email from the online retailer. She has been asked to "Not take any photographs or copies of the product in question due to copyright issues" and it states, "the product must be returned immediately by special delivery by [DATE]." There's some other statements as well about our account being terminated if we fail to return the product by the specific date. We've got a lot of movies and series that we have purchased over the years on this account, I wouldn't want to lose them.

They've stated that taking photos of the book will break a bunch of laws. They also stated that not promptly returning the book may break some laws, and lead to the termination of my account.

Does the CEO of Totally-Not-Amazon know that his company is signing its name to stupid letters?

[–] [email protected] 0 points 4 months ago

I could easily see the vendor on Amazon (whose correspondence is sent and replied to through Amazon addresses for transparency/documentation) sending this nonsense.

[–] [email protected] 0 points 4 months ago

keeping this thing you bought is illegal

Wouldn't even surprise me if that's true, with today's batshit insane corporate-written laws.

[–] [email protected] 0 points 4 months ago

I'd be too pissed off to take it to friggin reddit. I don't buy it.