this post was submitted on 07 May 2025
193 points (89.1% liked)

Not The Onion


Welcome

We're not The Onion! Not affiliated with them in any way! Not operated by them in any way! All the news here is real!

The Rules

Posts must be:

  1. Links to news stories from...
  2. ...credible sources, with...
  3. ...their original headlines, that...
  4. ...would make people who see the headline think, “That has got to be a story from The Onion, America’s Finest News Source.”

Please also avoid duplicates.

Comments and post content must abide by the server rules for Lemmy.world and generally abstain from trollish, bigoted, or otherwise disruptive behavior that makes this community less fun for everyone.

And that’s basically it!

[–] [email protected] 7 points 2 weeks ago

Fucking disgusting

[–] [email protected] 2 points 2 weeks ago

This headline lies.

[–] [email protected] 0 points 2 weeks ago

Rape and stupid nonsense... that is all AI is.

[–] [email protected] 4 points 2 weeks ago (4 children)

This brings up an interesting question I like to ask my students about AI. A year or so ago, Meta talked about people making personas of themselves for business: if a customer needs help, they can do a video chat with an AI that looks like you and is trained to give the responses you need it to. But what if we did that just for ourselves, and instead let an AI shadow us for a number of years so it could mimic the language we use and the thoughts we have well enough to effectively stand in for us in casual conversations?

If the victim in this situation had trained his own AI in such a manner, after years of shadowing and training, would that AI be able to mimic his behavior well enough to give his most likely response to this situation? Would the AI in the video still have forgiven the murderer, and would that hold more significant meaning?

If you could snapshot yourself as you are right now and keep it as a “living photo” AI that would behave and talk like you when interacted with, what would you do with it? If you could have a snapshot AI of anyone in the world in a picture frame on your desk, someone you could talk to and interact with, who would you choose?

[–] [email protected] 1 points 2 weeks ago

So… Who Framed Roger Rabbit?

The book, not the movie.

[–] [email protected] 5 points 2 weeks ago* (last edited 2 weeks ago) (1 children)

it would hold the same meaning as now, which is nothing.

this is automatic writing with a computer. no matter what you train on, you're using a machine built to produce things that match other things. the machine can't hold opinions, can't remember, can't answer from the training data. all it can do is generate a plausible transcript of a conversation and steer it with input.

one person does not generate enough data during a lifetime so you're necessarily using aggregated data from millions of people as a base. there's also no meaning ascribed to anything in the training data. if you give it all a person's memories, the output conforms to that data like water conforms to a shower nozzle. it's just a filter on top.
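to make the "filter on top" point concrete, here's a minimal sketch using the Hugging Face transformers library (the model name and the persona prompt are just illustrative placeholders): the "persona" is nothing but text prepended to an aggregate model's prompt before sampling.

```python
# minimal sketch: a "persona" is just text prepended to the prompt of a
# model trained on aggregated data from millions of people
# (model name and prompt are illustrative placeholders)
from transformers import pipeline

generator = pipeline("text-generation", model="gpt2")

# "giving it a person's memories" = conditioning on their words as context
persona_context = (
    "The following is a conversation with Chris, who believes in forgiveness.\n"
    "Chris:"
)
result = generator(persona_context, max_new_tokens=40, do_sample=True)

# the output is a plausible continuation of the context, not a recollection
print(result[0]["generated_text"])
```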

in regards to the final paragraph, i want computers to exhibit as little personhood as possible because i've read the transcripts of the ELIZA experiments. it could literally only figure out subject-verb-object and respond with the same noun it was fed, and people were saying it should replace psychologists.
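for reference, ELIZA-style processing amounts to keyword matching plus pronoun reflection and nothing more. a toy sketch (hypothetical rules for illustration, not Weizenbaum's original script):

```python
# toy ELIZA-style responder: keyword match + pronoun reflection, nothing more
# (hypothetical rules for illustration, not Weizenbaum's original script)
import re

REFLECTIONS = {"i": "you", "me": "you", "my": "your", "am": "are"}

def reflect(phrase: str) -> str:
    # swap first-person words for second-person ones
    return " ".join(REFLECTIONS.get(w, w) for w in phrase.lower().split())

def respond(line: str) -> str:
    match = re.match(r"i (?:am|feel) (.*)", line, re.IGNORECASE)
    if match:
        # echo the user's own words back as a question
        return f"Why do you say you are {reflect(match.group(1))}?"
    return "Please go on."

print(respond("I am sad about my job"))
# -> Why do you say you are sad about your job?
```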

[–] [email protected] 4 points 2 weeks ago (1 children)

The deceased's sister wrote the script; the AI/LLM didn't write anything. It's in the article. So the assumptions you made in the middle two paragraphs don't really apply to this specific news article.

[–] [email protected] 1 points 2 weeks ago (1 children)

i was responding to the questions posted in the comment i replied to.

also, doesn't that make this entire thing worse?

[–] [email protected] 3 points 2 weeks ago (1 children)

also, doesn’t that make this entire thing worse?

No? This is literally a Victim Impact Statement. We see these all the time after guilt has been determined and before sentencing. This is the opportunity granted to the victims to outline how they feel on the matter.

There have been countless court cases where the victims say things like "I know that my husband would have understood and forgiven [... drone on for a 6 page essay]", or have even done this exact thing without the "AI" video/audio (home videos with a dubbed overlay of a loved one talking about what the deceased would want or think). It's not abnormal, and it has been accepted as a way for the aggrieved to voice their wishes to the court. All that's changed here is the presentation.

This didn't affect the finding of whether the person was guilty; it was played after the verdict and only before sentencing, which is also the customary time for impact statements. The "AI" didn't write the script. This is just a mildly fancier impact statement, and that's it. She could have dubbed it over home video with a Fiverr voice actor. Would that change how you feel about it?

I see no evidence that the court treated this any differently than any other impact statement, and I don't think anyone was fooled into believing the dead person is magically alive and making the statement directly. It's clear who made it the whole time.

[–] [email protected] 1 points 2 weeks ago (1 children)

i had no idea this was a thing in american courts. it just seems like an insane thing to include in a murder trial

[–] [email protected] 1 points 2 weeks ago

Those statements come after the trial during the sentencing phase. They're not used to sway the initial verdict.

[–] [email protected] 1 points 2 weeks ago

I wouldn't want to talk to AI either. Just have it send me a voicemail recording of the video, but transcribed into a text, into my spam folder.

[–] [email protected] 2 points 2 weeks ago

Obvious ragebait article

[–] [email protected] 59 points 2 weeks ago

WTF?

That man did not say anything. A computer algorithm smashed together a video that incidentally uses his likeness, nothing more.

[–] [email protected] 55 points 2 weeks ago (1 children)

The fuck is wrong with people.

[–] [email protected] 16 points 2 weeks ago

I'm glad I'm not the only one thinking this

[–] [email protected] 52 points 2 weeks ago* (last edited 2 weeks ago) (2 children)

There is absolutely zero chance I would allow anyone to theorize what they think I would say using AI. Hell, I don’t like AI in its current state, and that’s the least of my issues with this.

It’s immoral. Regardless of your relation to a person, you shouldn’t be acting like you know what they would say, let alone using that to sway a decision in a courtroom. Unless he specifically wrote something down and it was then recited using the AI, this is absolutely wrong.

It’s selfish. They used his likeness to deliver an apology they had no way of knowing he would make, and they did it to make themselves feel better. They could’ve written a letter in their own voices instead of turning this into some weird dystopian spectacle.

“It’s just an impact statement.”

Welcome to the slippery slope, folks. We allow the use of AI in courtrooms, and not even for something cool (like quickly producing a 3D animation of a car accident to help explain, with actual human voices, what happened at the scene). Instead, we use it to sway a judge’s sentencing while making an apology on behalf of a dead person (using whatever tech you want, because that is not the main problem here) without his consent or even any of his written thoughts (you know, like in a will).

Pointing to “AI bad” for these arguments is lazy, reductive, and not even remotely the main gripe.

[–] [email protected] 3 points 2 weeks ago

allow use of AI into courtrooms

Surprised the judge didn't kick that shit to the curb. There was one case where the defendant made an AI avatar, with AI-generated text, to represent himself, and the judge said, "Fuck outta here with that nonsense."

[–] [email protected] 2 points 2 weeks ago

There is absolutely zero chance I would allow anyone to theorize what they think I would say using AI.

If they based it on my Reddit history it's got potential to be needlessly harsh to certain groups of life-underachievers, that's for sure.

[–] [email protected] 14 points 2 weeks ago

Why even do an impact statement? All Christian victims should be assumed to forgive their attackers, right?

[–] [email protected] 15 points 2 weeks ago

An AI version of Christopher Pelkey appeared in an eerily realistic video to forgive his killer… “In another life, we probably could’ve been friends. I believe in forgiveness, and a God who forgives.”

"...and while it took my murder to get my wings as an angel in heaven, you still on Earth can get close with Red Bull ™. Red Bull ™ gives you wings!" /s

[–] [email protected] 51 points 2 weeks ago (1 children)

I swear to Christ, if I get murdered and my family makes an AI video of me forgiving them then I will haunt the shit out of them.

[–] [email protected] 9 points 2 weeks ago (1 children)

Who will you haunt? The murderer or your family?

[–] [email protected] 20 points 2 weeks ago (1 children)
[–] [email protected] 10 points 2 weeks ago (1 children)

Damn right. I might haunt everyone that facilitated the video too.

[–] [email protected] 5 points 2 weeks ago

Yeah, that seems like a lot of work, but it would be deserved. Maybe haunt those that facilitated it on weekdays, just the family and the killer on Saturdays, and take Sunday off.
