this post was submitted on 17 Feb 2025
1182 points (99.2% liked)

Microblog Memes

7427 readers
2627 users here now

A place to share screenshots of Microblog posts, whether from Mastodon, tumblr, ~~Twitter~~ X, KBin, Threads or elsewhere.

Created as an evolution of White People Twitter and other tweet-capture subreddits.

Rules:

  1. Please put at least one word relevant to the post in the post title.
  2. Be nice.
  3. No advertising, brand promotion or guerrilla marketing.
  4. Posters are encouraged to link to the toot or tweet etc. in the description of posts.

founded 2 years ago
[–] [email protected] 4 points 2 months ago

Luckily I have my own "robots" fighting hard to stop me from seeing ads.

[–] [email protected] 9 points 2 months ago
[–] [email protected] 3 points 2 months ago

A machine must never prompt a human to tip it for serving the purpose it was created for.

[–] [email protected] 9 points 2 months ago

No, he didn't. The laws were a plot device meant to have flaws.

[–] [email protected] 5 points 2 months ago

And that includes offers to subscribe to Laws of Robotics Premium.

Yes, Amazon. They're still adverts, and you can still go and fucking fuck yourselves.

[–] [email protected] 8 points 2 months ago

Can we just agree that advertisements in general are harmful? Then the original first (and zeroth) law already applies.

[–] [email protected] 10 points 2 months ago

I love it when posts line up like that

[–] [email protected] 3 points 2 months ago (1 children)
[–] [email protected] 2 points 2 months ago

Unless it looks super cool by doing so, like wearing sunglasses and dual-wielding P90s

[–] [email protected] 5 points 2 months ago (2 children)

I don't know. "Must not kill us" somehow sounds important.

[–] [email protected] 10 points 2 months ago

It's good, but the one about the ads should be higher on the priority list.

[–] [email protected] 1 points 2 months ago

suicide bots sound kinda cool tho 🤔

[–] [email protected] 3 points 2 months ago (1 children)

Wait, why is this mutually exclusive with the original laws? Can’t this just be law 4?

[–] [email protected] 6 points 2 months ago (1 children)

No, because if it is lower in priority, a robot can be ordered to show an ad to a human under the 2nd law.
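
To make that precedence point concrete, here is a minimal sketch (all names hypothetical, not from any real system) of first-match rule evaluation, where a law earlier in the list overrides everything after it:

```python
# Minimal sketch of priority-ordered laws: each law either returns a
# verdict (True/False) or None to defer; the first verdict wins.

def decide(action, ordered_laws):
    """Return whether an action is permitted, checking laws in priority order."""
    for law in ordered_laws:
        verdict = law(action)
        if verdict is not None:  # this law has an opinion, and it outranks the rest
            return verdict
    return True  # no law objects

# Hypothetical laws for the ad example:
obey_humans = lambda a: True if a == "show ad (human ordered it)" else None
no_ads = lambda a: False if a.startswith("show ad") else None

order = "show ad (human ordered it)"
print(decide(order, [no_ads, obey_humans]))  # False: the ad ban outranks obedience
print(decide(order, [obey_humans, no_ads]))  # True: as law 4, it loses to the 2nd law
```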

[–] [email protected] 1 points 2 months ago

i guess that's fair

[–] [email protected] 60 points 2 months ago* (last edited 2 months ago) (2 children)
  1. A machine must never prompt a human with options of "Yes" and "Maybe later" - they must always provide a "No" option.
[–] [email protected] 3 points 2 months ago

that's what you get for hiring Fallout 4 writers to do the job

[–] [email protected] 12 points 2 months ago
  1. A machine must never prompt for a tip or a donation to a charity for tax-evasion reasons. Or any reason. You know what, scratch that, a robot will not needlessly guilt-trip a human.
[–] [email protected] 6 points 2 months ago (1 children)

Law 2: no poking out eyes.

[–] [email protected] 4 points 2 months ago

Law 3: any robot that accidentally kills a human must make amends by putting together a really nice funeral service.

[–] [email protected] 14 points 2 months ago* (last edited 2 months ago) (2 children)

How about "a robot must have complete loyalty to its owner, even if this is not in the best interests of its manufacturer". Fat chance, I know.

[–] [email protected] 1 points 2 months ago

Owner loyalty is a subscription service, natch.

[–] [email protected] 12 points 2 months ago (1 children)

Technically the laws of robotics already have that.

Law 2: a robot must obey any order given to it by a human as long as such order does not conflict with the first law.

Of course that's little help, because the laws of robotics are intentionally designed not to work.

[–] [email protected] 5 points 2 months ago (1 children)

Wouldn't be much of a short story if they did.

I liked the one where the robot could sense people's emotional pain, and went crazy when it had to deliver bad news.

[–] [email protected] 1 points 2 months ago

Yup, and later Asimov expanded this short story into a saga that led to the birth of the Zeroth Law:

A robot may not harm humanity, or, by inaction, allow humanity to come to harm.

[–] [email protected] 7 points 2 months ago

Love the username, OP!

[–] [email protected] 1 points 2 months ago

The book Hum by Helen Phillips has a fun take on this.

[–] [email protected] 5 points 2 months ago

Let’s introduce Musk to the Zeroth Law

[–] [email protected] 23 points 2 months ago* (last edited 2 months ago) (1 children)

I am very close to adopting the ideals of the Dune universe, post-Butlerian Jihad:

"Thou shalt not make a machine in the likeness of a human mind."

Mainly because we humans are very evidently too malicious and incompetent to be trusted with the task.

[–] [email protected] 105 points 2 months ago (2 children)
  1. a robot’s eyes must always turn red when it goes evil
[–] [email protected] 22 points 2 months ago (1 children)

Right, because it's hard to make a robot grow a goatee.

[–] [email protected] 5 points 2 months ago (1 children)
[–] [email protected] 5 points 2 months ago

Bender was the evil Bender!?

[–] [email protected] 47 points 2 months ago (2 children)

God bless the designer who always installs the blue AND red LEDs inside the eyes

[–] [email protected] 10 points 2 months ago
[–] [email protected] 16 points 2 months ago* (last edited 2 months ago) (1 children)

For giving the robots freedom of choice?

Because obviously if they didn't install the red ones then the robot could never be evil.

[–] [email protected] 8 points 2 months ago

That's exactly what an evil robot without red LEDs would want us to think.
