cross-posted from: https://lemmy.zip/post/15863526

Steven Anderegg allegedly used the Stable Diffusion AI model to generate photos; if convicted, he could face up to 70 years in prison

[–] [email protected] 0 points 5 months ago (1 children)

While I can't say with 100% certainty, it would almost assuredly be trained on many abuse images in order to generate more.

[–] [email protected] 0 points 5 months ago (1 children)

No, that's not necessarily true at all. You can take a stock Stable Diffusion model and output these images right now, without any additional training. The whole point of diffusion models is that you don't need 100% training-data coverage to create new images outside the original dataset. Having learned the concept of a child from any normal image set of children, and the concept of nudity/porn from legal adult images, is more than enough to create a blended concept of the two.
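The concept-blending claim is easy to sanity-check with a benign prompt. A minimal sketch using the open-source `diffusers` library (assuming a stock `runwayml/stable-diffusion-v1-5` checkpoint and a CUDA GPU; the prompt is purely illustrative of composing two concepts the training set never pairs):

```python
import torch
from diffusers import StableDiffusionPipeline

# Load a stock, publicly released checkpoint with no additional training.
pipe = StableDiffusionPipeline.from_pretrained(
    "runwayml/stable-diffusion-v1-5",
    torch_dtype=torch.float16,
).to("cuda")

# "Astronaut" and "riding a horse" are learned as separate concepts;
# the model synthesizes the blended image rather than retrieving one
# from its training data.
image = pipe("a photograph of an astronaut riding a horse").images[0]
image.save("astronaut_horse.png")
```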

[–] [email protected] 0 points 5 months ago (2 children)

> Having learned the concept of a child from any normal image set of children

Those children are the abuse victims.

[–] [email protected] 0 points 5 months ago (1 children)

The children don't exist. The concept of a child is learned from a series of 1s and 0s.

[–] [email protected] 0 points 5 months ago (1 children)

Then it's abuse because the law says so, and how the images are generated is irrelevant.

[–] [email protected] 0 points 5 months ago (1 children)

Believe it or not, US law isn't the moral center of the universe, not to mention that it's disproportionately used against groups it doesn't like.

[–] [email protected] 0 points 5 months ago

This case is in the US.

[–] [email protected] 0 points 5 months ago (1 children)

I've never used an AI generator, but generic generated images wouldn't be an actual match to the dataset images, right? The model would just be generating features it understands to be associated with the concept of a child, which makes the claim that the dataset children are the abuse targets a stretch, unless there's some other direct or indirect harm to them. An immediate exception would be a person writing a prompt attempting to create a facsimile of a specific individual.
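If you wanted to test that "no actual match" intuition, one rough way is a perceptual-hash comparison between a generated image and candidate source images. A minimal sketch (the file names are hypothetical; assumes the `Pillow` and `imagehash` packages):

```python
from PIL import Image
import imagehash

# Perceptual hashes summarize an image's visual structure in 64 bits.
generated = imagehash.phash(Image.open("generated.png"))      # hypothetical file
reference = imagehash.phash(Image.open("dataset_photo.png"))  # hypothetical file

# Subtraction gives the Hamming distance between the two hashes: near 0
# means a near-duplicate; unrelated images typically differ in most bits.
print(generated - reference)
```

A genuinely novel synthesis should land far from any individual source image, which is the distinction I'm getting at.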

[–] [email protected] 0 points 5 months ago (1 children)

> Section 1466A of Title 18, United States Code, makes it illegal for any person to knowingly produce, distribute, receive, or possess with intent to transfer or distribute visual representations, such as drawings, cartoons, or paintings that appear to depict minors engaged in sexually explicit conduct and are deemed obscene.

That's nice, still illegal.

[–] [email protected] 0 points 5 months ago (1 children)

While true, it also wasn't really what my post was responding to. Thanks, though.

[–] [email protected] 0 points 5 months ago (1 children)

The thread is discussing why it's considered abuse if you can't point to a victim. The answer turned out to be "because the law says so."

[–] [email protected] 0 points 5 months ago (1 children)

If you read the law you posted, it doesn't actually address the question of victimhood. Also, I don't really get why you're still trying to force an unrelated point into this part of the discussion. Maybe find another place in the thread where someone thinks it's legal and go talk to them.

[–] [email protected] 0 points 5 months ago (1 children)

If you don't think you sound like you're saying it's legal or should be legal, I have very bad news for you.

[–] [email protected] 0 points 5 months ago

Seeing as it's been explicitly stated that that's not the case, you strike me as a dude who cares less about the meaning of words than about the opportunity to argue about them.