[–] [email protected] 24 points 7 months ago (1 children)

Monetisation?

Licensing the site's data to AI companies, for when there's finally a ruling that they can't just scrape the internet for training data while ignoring copyright.

[–] [email protected] 18 points 7 months ago (2 children)

I was told Reddit has already been scraped for AI training and all sorts of other stuff. There's very little new value left to sell.

[–] [email protected] 10 points 7 months ago (3 children)

Except AI models may end up having to start again with licensed or public-domain data.

They are currently breaking the law and delaying legal action as long as possible in the hopes they can repeat the trick with a new data set.

[–] [email protected] 2 points 7 months ago

No, they'll train on laundered model output, like every Llama.

The investment thesis that the data is valuable is bonkers. It's not. Not only has it already been exfiltrated and can be laundered in a dozen ways, Reddit also won't be able to effectively assert copyright.

Look at Facebook. It's full of reposted Quora content now, with AI images and AI-laundered text.

Reddit is dead

[–] [email protected] 2 points 7 months ago (1 children)

Whatever already existed won't be thrown away regardless of the ruling. It's like throwing away all the gold already dug up just because it was dug by slave labor. The law and legal actions are mostly just a moat around that pile of gold. Sure, AI companies will have to pay more for new data from other sources, but that would be peanuts compared to how much they would have to pay starting from zero.

[–] [email protected] 1 points 7 months ago (1 children)

If every use of what already exists carries the risk of a massive fine or a court case, they'll throw it away.

The game now is to delay the legal process long enough until they've built the replacement.

Then they can afford to throw away the essentially faulty model.

[–] [email protected] 1 points 7 months ago (1 children)

It's not at all clear that the current model does breach the law.

If it were, a court would have issued an injunction or whatever.

[–] [email protected] 1 points 7 months ago (1 children)

It's clear from the output that it breaks copyright.

We don't have to look inside the black box to demand to see the input which caused that output.

To be clear, a machine is not responsible for itself. This machine was trained to break copyright.

[–] [email protected] 0 points 7 months ago (1 children)

Generally if someone is clearly in breach of copyright the rights holder will apply to a court to issue an injunction to order that company to cease their activities until a case can be resolved.

Given that this has not happened, it seems that, from a court's perspective, it's not a clear breach of copyright.

[–] [email protected] 1 points 7 months ago (1 children)

The rights holder first considers the size of the payout vs. the cost of legal fees.

Just because they haven't been sued directly for this yet doesn't mean it isn't infringement.

[–] [email protected] 0 points 7 months ago (1 children)

Nonsense. If this is copyright infringement, the payout will be many billions. They've had a year to think about it.

[–] [email protected] 1 points 7 months ago (1 children)

The statute of limitations is much longer than a year. It's usually around 5.

They can wait, see who's made the money, then target them for a payout.

[–] [email protected] 1 points 7 months ago (1 children)

A court wouldn't look favourably on that.

The rights couldn't have been very important if you just let it run.

[–] [email protected] 1 points 7 months ago

They really don't care. It can take a lot of time to put a solid case together and you're better off having a solid case than a quick trial.

[–] [email protected] 2 points 7 months ago (1 children)

Corporations break the law all the time and typically it's just an operational expense.

[–] [email protected] 1 points 7 months ago (1 children)

Typically they aren't fighting other corporations.

[–] [email protected] 1 points 7 months ago (1 children)

I don't understand what you're saying because I never said they were.

[–] [email protected] 1 points 7 months ago

My point is that corporations often see a fine as a cost of business because the fines are issued by a regulatory system that has no teeth.

If you're in a lawsuit against another corporation, they are going after damages in civil court, and the payout is likely to be high enough to stop the behaviour.

[–] [email protected] 6 points 7 months ago

Yeah that was kinda my understanding too. And regardless of my feelings on it, I think rulings are mostly gonna go in AI’s favor.