this post was submitted on 03 Apr 2024
962 points (99.4% liked)
Technology
Sure, no drugs involved, but they are running a pseudorandom number generator and using its output (along with non-random data) to generate the image.
The result: ask for the same image twice and you get two different images. Similar, but clearly not the same person (sisters or cousins, perhaps), and nowhere near usable as evidence in court.
Tell me you don't know shit about AI without telling me you don't know shit. You can easily reproduce the exact same image by fixing the starting seed and constraining the network to a deterministic sequence of operations.
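The reproducibility claim above can be sketched with a plain PRNG. This is a minimal illustration using NumPy as a stand-in for a diffusion model's noise sampler, not an actual image model: the `generate` function and its shapes are invented for the example, but the seeding behavior it demonstrates is exactly how deterministic sampling works.

```python
import numpy as np

def generate(prompt: str, seed=None):
    # Hypothetical stand-in for an image generator: the "image" is
    # derived from the prompt (the non-random input) plus PRNG noise,
    # as real samplers combine a text embedding with an initial noise
    # tensor.
    rng = np.random.default_rng(seed)  # fixed seed -> fixed noise
    noise = rng.standard_normal((8, 8))
    bias = sum(map(ord, prompt)) / 1000.0
    return noise + bias

a = generate("a face", seed=42)
b = generate("a face", seed=42)   # same seed: deterministic
c = generate("a face")            # seed omitted: fresh OS entropy

print(np.array_equal(a, b))  # same seed -> bit-identical "images"
print(np.array_equal(a, c))  # no seed -> a different "image"
```

With the seed fixed (and, in a real pipeline, deterministic kernels enforced), the output is bit-for-bit repeatable; omit the seed and every call samples new noise, which is the behavior the parent comment is describing.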
But if you don't do that, the ML engine has no introspective capability to realize it failed to recreate the image.
And if you take your eyes out of their sockets, you can no longer see. That's a meaningless statement.
The point is that the AI 'enhanced' photos have crisp, clear details that are randomly generated, and thus should not be relied on. Are you suggesting we can work around that problem by choosing a random seed manually? Do you think that solves the problem?