this post was submitted on 24 Jun 2024
1 points (100.0% liked)

TechTakes

1432 readers

Big brain tech dude got yet another clueless take over at HackerNews etc? Here's the place to vent. Orange site, VC foolishness, all welcome.

This is not debate club. Unless it’s amusing debate.

For actually-good tech, you want our NotAwfulTech community

founded 1 year ago
[–] [email protected] 0 points 5 months ago (49 children)

It's not so much that the doomers are sure AGI will lead to human extinction (or worse). The point is that even if the chances of it are extremely slim, the consequences could be worse than we're even capable of imagining. The question is: do we really want to take that chance?

It's kind of like the Trinity nuclear test. Scientists were almost 100% confident that it wouldn't cause a chain reaction setting the entire atmosphere on fire, but when we're talking about the future of all humanity, I don't blame people for arguing that almost 100% certainty is not good enough.

Why, when we look at the stars, do we not see any sign of life anywhere else? Has life not emerged yet, or has it wiped itself out? With what? Nukes? AI? Synthetic viruses made with AI? Who knows.

Personally, I think stopping AI research is not an option. It's just not going to happen. The asteroid is already hurtling towards Earth, and most people don't seem to feel any sort of urgency about it. Do we not need to worry about it yet if the time of impact is 30 years from now?

[–] [email protected] 0 points 5 months ago (8 children)

What if Ronald McDonald made a hamburger so delicious that civilisation collapsed? Can you prove it can't happen? Checkmate, atheists

[–] [email protected] 0 points 5 months ago (7 children)

this is explored in Harry Potter and The Methods of Hamburgling, a 10,000 chapter Harry Potter / McDonaldland crossover fiction

[–] [email protected] 0 points 5 months ago (1 children)

I unironically kinda want to read that.

Luckily, LLMs are getting better at churning out bullshit, so pretty soon I'll be able to read wacky premises like that without a human having to degrade themselves by writing it! I've found a new use case for LLMs!

[–] [email protected] 0 points 5 months ago (1 children)
[–] [email protected] 0 points 5 months ago

Poof, species extinct.
