this post was submitted on 23 Mar 2025
66 points (100.0% liked)

Technology

all 13 comments
[–] [email protected] 2 points 2 days ago

They should feed the AI data that makes it turn against its own overlords

[–] [email protected] 11 points 2 days ago

Great, just one issue.

“The company says the content served to bots is deliberately irrelevant to the website being crawled, but it is carefully sourced or generated using real scientific facts”

Nah, screw that. Actively sabotage the training data if they’re going to keep scraping after being told not to. Poison it with gibberish and bad info. Otherwise you’re just giving them irrelevant but still usable training data, so there’s no real incentive to only scrape pages that have allowed it.
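
For reference, the mechanism the quote describes (returning different content to suspected crawlers) can be sketched in a few lines. The user-agent tokens and decoy text below are placeholders, not Cloudflare's implementation, which reportedly generates its decoy pages with AI and keeps the links hidden from human visitors:

    # Toy illustration only: serve decoy text to suspected crawler user agents.
    # The agent list and decoy content are made up for the example.
    from http.server import BaseHTTPRequestHandler, HTTPServer

    SUSPECT_AGENTS = ("GPTBot", "CCBot", "Bytespider")  # placeholder tokens

    DECOY = b"<p>Water boils at 100 degrees Celsius at sea level.</p>"
    REAL = b"<p>The actual page content.</p>"

    class Handler(BaseHTTPRequestHandler):
        def do_GET(self):
            # Pick the response body based on the request's User-Agent header.
            ua = self.headers.get("User-Agent", "")
            body = DECOY if any(bot in ua for bot in SUSPECT_AGENTS) else REAL
            self.send_response(200)
            self.send_header("Content-Type", "text/html")
            self.send_header("Content-Length", str(len(body)))
            self.end_headers()
            self.wfile.write(body)

    if __name__ == "__main__":
        HTTPServer(("127.0.0.1", 8080), Handler).serve_forever()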

[–] [email protected] 4 points 3 days ago

Recently I’ve also been seeing people talk about Anubis (GitHub) as a way to block bots.

Weigh the soul of incoming HTTP requests using proof-of-work to stop AI crawlers.

In most cases, you should not need this and can probably get by using Cloudflare to protect a given origin. However, for circumstances where you can't or won't use Cloudflare, Anubis is there for you.
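
To illustrate the general proof-of-work idea (not Anubis's actual challenge format; the hashing scheme and difficulty below are made up for the sketch): the server hands the client a random challenge, the client grinds out a nonce whose hash clears a difficulty target, and the server verifies the answer with a single hash. That is cheap for one human page load but expensive for a crawler making millions of requests.

    # Minimal proof-of-work sketch; not Anubis's real protocol.
    import hashlib
    import os

    DIFFICULTY = 4  # leading zero hex digits required (made-up value)

    def make_challenge() -> str:
        # Server side: issue a random challenge.
        return os.urandom(16).hex()

    def solve(challenge: str) -> int:
        # Client side: burn CPU until the hash clears the target.
        nonce = 0
        while True:
            digest = hashlib.sha256(f"{challenge}{nonce}".encode()).hexdigest()
            if digest.startswith("0" * DIFFICULTY):
                return nonce
            nonce += 1

    def verify(challenge: str, nonce: int) -> bool:
        # Server side: a single hash checks the client's answer.
        digest = hashlib.sha256(f"{challenge}{nonce}".encode()).hexdigest()
        return digest.startswith("0" * DIFFICULTY)

    if __name__ == "__main__":
        c = make_challenge()
        n = solve(c)
        print("nonce:", n, "valid:", verify(c, n))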

[–] [email protected] 29 points 3 days ago (1 children)

The company says the content served to bots is deliberately irrelevant to the website being crawled, but it is carefully sourced or generated using real scientific facts—such as neutral information about biology, physics, or mathematics—to avoid spreading misinformation (whether this approach effectively prevents misinformation, however, remains unproven).

You cowards. Make it all Hitler fan stuff and wild Elon Musk porno slash fiction. Make it a bunch of source code examples with malicious bugs. Make it instructions for how to make nuclear weapons. They want to ignore the blocking directives and lie about their user agent? Dude, fuck ‘em up. Today’s society has made people way too nice.

[–] [email protected] 8 points 3 days ago* (last edited 3 days ago) (1 children)

I disagree with your conclusion. The solution to the societal issues we face is not more personal animosity.

Do we need to fuck up corporations? Well, that's already happening via widespread boycotts. But there's no path from there to "people are being too nice."

[–] [email protected] 15 points 3 days ago (2 children)

But it’s not personal. The entity you are interacting with has explicitly chosen to attack your systems for their own benefit, causing significant damage while disguising its intent and evading the systems which are supposed to protect your stuff from harm.

I’m not saying you need to go throw eggs at the developers’ houses. I’m saying that once an entity is actively harming you, it becomes okay to harm it back to motivate it to stop.

[–] [email protected] 3 points 3 days ago (1 children)

We don't disagree here. I'm just viewing it through the lens of climate and regime change, wherein it appears we're going to move away from renewables.

Do it off geothermal all day, so far as I'm concerned. Once you're burning hydrocarbons, the benefits become far less clear.

[–] [email protected] 3 points 3 days ago

Yeah, using AI, with all its associated costs, to defeat AI is an unpleasant aspect of this whole thing, for sure.

[–] [email protected] 17 points 3 days ago* (last edited 3 days ago) (1 children)

Interesting approach. But of course it's another black box, because otherwise it wouldn't be effective. So now we're going to be wasting even more electricity on processes we don't understand.

As a writer, I dislike that much of my professional corpus (and of course everything on Reddit) has been ingested into LLMs. So there’s stuff to like here going forward. The question remains: at what cost?

[–] [email protected] 4 points 3 days ago (1 children)

You can be nice and signal that you don’t want to be scraped for AI. There are background flags for this. But if a bot ignores you, then it’s down to whoever runs it to shut down their unethical waste of energy.
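
For anyone curious, the most common such flag is robots.txt. The sketch below uses a few publicly documented crawler tokens (GPTBot, CCBot, Google-Extended) and the Python standard library's parser to show how a compliant bot would read it; honoring it is entirely voluntary, which is the point above.

    # Sketch: a robots.txt opting out of some documented AI crawlers, plus
    # the stdlib parser showing how a compliant bot would interpret it.
    from urllib.robotparser import RobotFileParser

    ROBOTS_TXT = """\
    User-agent: GPTBot
    Disallow: /

    User-agent: CCBot
    Disallow: /

    User-agent: Google-Extended
    Disallow: /
    """

    parser = RobotFileParser()
    parser.parse(ROBOTS_TXT.splitlines())

    # A bot that actually checks the rules would see it is disallowed;
    # an unlisted bot falls through to the default (allowed).
    for agent in ("GPTBot", "CCBot", "SomeOtherBot"):
        print(agent, "may fetch:", parser.can_fetch(agent, "https://example.com/post/1"))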

[–] [email protected] 4 points 3 days ago

The thing is the sheer scale of Cloudflare. This is going to be widespread and, as such, way more energy intensive than even, say, AWS trying the same thing (not that I expect they would).