I hate that I have to keep saying this: no one seems to be talking about the fact that giving their AI a human-like voice with simulated emotions inherently makes it seem more trustworthy and will get more people to believe its hallucinations are true. And then there will be the people convinced it's really alive. This is fucking dangerous.
Please keep saying it.
I plan to. It really upsets me.
Doesn't sound anything like Scarlett Johansson
Well, it does have some resemblance, but other people have voices like hers. Are they not allowed to use their own voices anymore?
Edit: I guess not
It’s still just an LLM and therefore just autocomplete
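For what it's worth, the "autocomplete" framing is fairly literal. Here is a minimal sketch of autoregressive next-token prediction, assuming the Hugging Face `transformers` library and the public `gpt2` checkpoint purely as stand-ins (not whatever model OpenAI actually ships): the model just keeps picking the most likely next token and appending it.

```python
# Minimal sketch: an autoregressive LM "autocompletes" a prompt one token at a time.
# Assumes the Hugging Face transformers library and the public gpt2 checkpoint,
# used here only as illustrative stand-ins.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("gpt2")
model = AutoModelForCausalLM.from_pretrained("gpt2")

text = "Some days I'm just an"
for _ in range(5):                               # extend the prompt by five tokens
    ids = tokenizer(text, return_tensors="pt").input_ids
    with torch.no_grad():
        logits = model(ids).logits               # scores for every possible next token
    next_id = logits[0, -1].argmax().item()      # greedy: take the single most likely one
    text += tokenizer.decode(next_id)            # append it and repeat

print(text)
```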
Some days I'm just an autocomplete
This article is about their voice synthesis product, which works in tandem with their GPT LLMs, but isn't itself an LLM.
Moot point
"Yeah, let's go up against the woman who sued Disney and won What could go wrong!?"