this post was submitted on 03 Apr 2024
962 points (99.4% liked)


A judge in Washington state has blocked video evidence that’s been “AI-enhanced” from being submitted in a triple murder trial. And that’s a good thing, given the fact that too many people seem to think applying an AI filter can give them access to secret visual data.

[–] [email protected] 66 points 7 months ago (7 children)

Jesus Christ, does this even need to be pointed out!??

[–] [email protected] 24 points 7 months ago (1 children)

I met a student at my university last week at lunch who told me he was stressed out about a homework assignment. He needed to write a report with a minimum number of words, so he pasted the text into ChatGPT and asked it how many words the text contained.

I told him that every common text editor has a word count built in, and that ChatGPT is probably not good at counting words (even though it pretends to be).
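
For what it's worth, a word count is just a deterministic split on whitespace, roughly what any editor does under the hood. A minimal sketch (report.txt is a made-up filename, not anything from his assignment):

```python
# Count words the boring, deterministic way: split on whitespace and count.
# "report.txt" is a placeholder filename for this example.
with open("report.txt", encoding="utf-8") as f:
    text = f.read()

print(len(text.split()))  # same answer every time, no guessing involved
```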

Turns out his report was already waaaaay above the minimum word count and actually needed to be shortened.

So much for the general population's understanding of AI.

I'm studying at a technical university.

[–] [email protected] 3 points 7 months ago

> I'm studying at a technical university.

AI is gonna fuck up an entire generation or more.

[–] [email protected] 17 points 7 months ago (2 children)

There are people who still believe in astrology. So, yes.

[–] [email protected] 2 points 7 months ago

Good god, there are still people who believe in phrenology!

[–] [email protected] 3 points 7 months ago

And people who believe the Earth is flat, that Bigfoot and the Loch Ness Monster exist, and that reptilians are replacing the British royal family...

People are very good at deluding themselves into all kinds of bullshit. In fact, I posit that they're even better at that than at learning the facts or comprehending empirical reality.

[–] [email protected] 19 points 7 months ago (1 children)

The layman is very stupid. They hear all the fantastical shit AI can do and start to assume it's almighty. That's how you wind up with those lawyers who tried using ChatGPT to write a legal brief that was full of bullshit and didn't even bother to verify whether it was accurate.

They don't understand it; they only know that the results look good.

[–] [email protected] 9 points 7 months ago* (last edited 7 months ago)

> The layman is very stupid. They hear all the fantastical shit AI can do and start to assume it's almighty. That's how you wind up with those lawyers who tried using ChatGPT to write a legal brief that was full of bullshit and didn't even bother to verify whether it was accurate.

Especially since it gets conflated with pop culture. Someone who hears that an AI app can "enhance" an image might think it works like something out of CSI using technosmarts, rather than just making stuff up out of whole cloth.

[–] [email protected] 17 points 7 months ago (1 children)

Of course, not everyone is technologically literate enough to understand how it works.

That should be the default assumption: things should be explained so that others understand them and can make better-informed decisions.

[–] [email protected] 1 points 7 months ago

It's not only that many people aren't technologically literate enough to understand the limits of this technology - the AI companies are actively over-inflating its capabilities in order to attract investors. When the most accessible information about the topic is designed to get non-technical investors on board with your company, of course the general public is going to get an overblown idea of what the technology can do.

[–] [email protected] -1 points 7 months ago

It's not actually worse than eyewitness testimony.

This is not an endorsement of AI, just pointing out that truth has no place in a courtroom, and refusing to lie will get you locked in a cafe.

Too good, not fixing it.

[–] [email protected] 33 points 7 months ago (1 children)

Yes. When people were in full conspiracy mode on Twitter over Kate Middleton, someone took that grainy pic of her in a car and used AI to “enhance” it, then declared it wasn't her because her mole was gone. It got so much traction that people thought the AI-fixed-up pic WAS her.

[–] [email protected] 8 points 7 months ago

Don't forget people thinking that scanlines in a news broadcast over Obama's suit meant that Obama was a HOLOGRAM and ACTUALLY A LIZARD PERSON.

[–] [email protected] 43 points 7 months ago* (last edited 7 months ago) (1 children)

Unfortunately, it does need pointing out. Back when I was in college, professors had to repeatedly tell their students that real-world forensics doesn't work like it does on NCIS. I'm not sure how much may have changed since then, but with American literacy levels being what they are, I don't suppose things have changed all that much.

[–] [email protected] 8 points 7 months ago (1 children)
[–] [email protected] 7 points 7 months ago (1 children)

It's certainly similar in that CSI played a role in forming unrealistic expectations in students' minds. But rather than expecting more physical evidence in order to prosecute, the students expected magic to happen with computers and lab work (often faster than physically possible).

AI enhancement doesn't uncover hidden visual data; it generates that information from pre-existing training data and shoehorns it in. It certainly could be useful, but it is not real evidence.
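
A toy illustration of why the detail can't just be "in there" (plain numpy, nothing to do with any actual enhancement tool): many different sharp images collapse to the exact same blurry one, so an enhancer that hands you back one particular sharp image is guessing, not recovering.

```python
import numpy as np

rng = np.random.default_rng(0)
original = rng.integers(0, 256, size=(8, 8))  # stand-in for a sharp 8x8 "photo"

def downscale(img):
    # Average each 4x4 block into one pixel: this is where information is destroyed.
    return img.reshape(2, 4, 2, 4).mean(axis=(1, 3))

# Build a very different sharp image by shuffling the pixels inside each 4x4 block.
shuffled = original.copy()
for i in (0, 4):
    for j in (0, 4):
        block = shuffled[i:i+4, j:j+4].ravel()  # copy of the block's pixels
        rng.shuffle(block)
        shuffled[i:i+4, j:j+4] = block.reshape(4, 4)

print(np.array_equal(original, shuffled))                        # False: different photos
print(np.array_equal(downscale(original), downscale(shuffled)))  # True: same blurry image
```

Countless distinct sharp images map to the same blurry one, so whatever sharp image an "enhancer" hands back is a guess informed by its training data, not detail that was hiding in the pixels.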

[–] [email protected] 4 points 7 months ago