diffuselight

joined 1 year ago
[–] [email protected] 3 points 6 months ago

You think giving away ChatGPT for free isn’t distorting a market?

[–] [email protected] 2 points 7 months ago

No, they’ll train on laundered model output. Like every Llama.

The investment thesis that the data is valuable is bonkers. It’s not. Not only has it already been exfiltrated and can be laundered in a dozen ways, Reddit also won’t be able to effectively assert copyright over it.

Look at Facebook. It’s full of reposted Quora content now, with AI images and AI-laundered text.

Reddit is dead

[–] [email protected] 0 points 1 year ago

I’m guessing the idea here is to eventually sell out to a corporate operator

[–] [email protected] 0 points 1 year ago* (last edited 1 year ago)

That’s what a win-win looks like. No need to be quiet about it. Russia illegally invaded Ukraine. Now everyone gets to replenish and modernize their weapons and test them in real conditions, while making sure Russia gets enough of a bloody nose to not fucking try this shit ever again.

Russia did the ‘fuck around and find out’ thing. It was their choice, and the only way they can win is by tankies convincing every other country that just saw rape, murder, pillaging and terrorism being used on another country in Europe by a rabid bear that somehow Russia was justified and should be allowed a free pass. But it’s not working. The rabid bear is rabid, but there are ways to deal with that.

Because now they’ve made sure that every country around them is joining the anti-rabid-bear alliance.

The way the OP framed the article is meant to create the idea that somehow Russia is good because the US military is bad. But that’s a fallacy. The US military is perfectly capable of doing bad shit on behalf of the US, but that does not mean everyone else is good. Sometimes clobbering Nazis is a win-win, and Russia should have known that. Their feeble attempt at reframing may work on Fox-brainwashed Republicans who are reduced to “Putin kills gays and is strong, so Putin is good”, but it turns out Putin is a cuck taking it up the ass from his own chef.

[–] [email protected] 0 points 1 year ago

Nothing to do with AI. Garbage in, garbage out.

LLMs are tools that satisfy requests. The developer decided to allow people to put the ingredients for chlorine gas into the input; the LLM never stood a chance but to comply with the instructions to combine them into the end product.

Clear indication we are in the witch-hunt phase of the hype cycle, where people expect the technology to have magical induction capabilities.

We could discuss liability for the developer, but somehow I don’t think a judge would react favorably to “So you put razor blades into your bread mixer and want to sue the developer because they allowed you to put razor blades into the bread mixer.”
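
To make the “razor blades in the mixer” point concrete: keeping dangerous ingredients out of the input is an upstream design decision, not something the model decides for itself. A minimal sketch of that kind of developer-side screening, assuming a hypothetical blocklist and placeholder function names rather than any particular vendor’s API:

```python
# Sketch of developer-side input screening before a prompt ever reaches the
# model. Blocklist and names are hypothetical; a real system would use a
# proper moderation layer, not a keyword check.

BLOCKED_TOPICS = {"chlorine gas", "nerve agent", "pipe bomb"}

def screen_prompt(prompt: str) -> bool:
    """Return True if the prompt is allowed through to the model."""
    lowered = prompt.lower()
    return not any(topic in lowered for topic in BLOCKED_TOPICS)

def call_model(prompt: str) -> str:
    # Stand-in for whatever LLM backend the developer actually uses.
    return f"(model output for: {prompt})"

def handle_request(prompt: str) -> str:
    if not screen_prompt(prompt):
        # The developer's choice, made before the model is ever involved.
        return "Request refused: disallowed topic."
    return call_model(prompt)
```

Whether the check lives in code like this or in a separate moderation service, the point stands: what gets into the mixer is decided by the person who built the mixer.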