this post was submitted on 01 Jun 2024
1611 points (98.7% liked)

Technology
[–] [email protected] 44 points 3 months ago (18 children)

IIRC, in cases where the central complaint is AI, ML, or other black-box technology, the company in question has never been held responsible, because "we don't know how it works". The AI surge we're seeing now is likely a consequence of those decisions and the crypto crash.

I'd love to see CVS try to push a lawsuit, though.

[–] [email protected] 31 points 3 months ago (4 children)

In Canada there was a company using an LLM chatbot that had to uphold a claim the bot had made to one of its customers. So there's precedent for forcing companies to take responsibility for what their LLMs say (at least if they're presenting the bot as trustworthy and representative).

[–] [email protected] 23 points 3 months ago (3 children)

This was in regard to Air Canada and its LLM chatbot, which hallucinated a refund policy. The company argued it did not have to honour the claim because it wasn't their actual policy and the bot had invented it out of nothing.

An important side note is that one of the cited reasons the Court ruled in favour of the customer is that the company did not disclose that the LLM wasn't the final say on its policy, and that a customer should confirm with a representative before acting on the information. This means the legal argument wasn't "the LLM is responsible" but rather "the customer should be informed that the information may not be accurate".

I point this out because I'm not so sure CVS would have a clear-cut case based on the Air Canada ruling; I'd be surprised if Google didn't have some legalese somewhere stating that it isn't liable for what the LLM says.

[–] [email protected] -1 points 3 months ago

Yeah, that legalese happens to be in the back pocket of Sundar Pichai. ???
