this post was submitted on 20 Dec 2024
You quite literally cannot trust them; the entropy of the information they produce is too high. I understand how much training they have on medical text; you don't understand how little that means. These models are fundamentally incapable of assessing the truth of a statement. You are using something you don't even understand to get advice it cannot reliably give, you lack the expertise to judge how accurate any given answer actually is, and the topic directly affects your physical wellbeing!
"Just try it bro, it's good, I promise." You should actually prompt an LLM about a topic you know in detail. The errors are rampant; now apply that same inaccuracy to topics you know nothing about.
My next recommendation: since you are not a healthcare professional, do not give medical advice like "use an LLM," because you clearly cannot verify how accurate an LLM is in that role.
If you want to visit a doctor for every minor thing, feel welcome. So far, LLMs have correctly identified every health issue I have had and provided better, more accurate information than the doctor visit afterwards.
This does not mean they are infallible. But you can easily check what they suggest and see whether the symptoms match other websites and the doctor's description.