this post was submitted on 15 Dec 2024
1 point (100.0% liked)

TechTakes


Big brain tech dude got yet another clueless take over at HackerNews etc? Here's the place to vent. Orange site, VC foolishness, all welcome.

This is not debate club. Unless it’s amusing debate.

For actually-good tech, you want our NotAwfulTech community

[–] [email protected] 0 points 1 week ago (2 children)

These types of errors happen even after including prompts like “Do not hallucinate.”

Genius! Why didn't I think of that!

[–] [email protected] 0 points 1 week ago (1 child)

I've tried screaming "stop overfilling my hboxes" when compiling my TeX document, but it isn't working! Am I prompting it wrong?

[–] [email protected] 0 points 1 week ago

yes. perhaps try more emotional guilt-tripping?

[–] [email protected] 0 points 1 week ago* (last edited 1 week ago) (1 child)

In every RAG guide I've seen, the suggested system prompts tend to include some more dignified variation of "Please, for the love of god, only and exclusively use the contents of the retrieved text to answer the user's question, I am literally on my knees begging you."
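For anyone who hasn't seen one of these, here's a minimal sketch of the pattern (the function name and exact wording are my own invention, assuming a generic chat-completions-style message format, not any particular library's API):

```python
def build_rag_messages(retrieved_chunks: list[str], question: str) -> list[dict]:
    """Stuff the retrieved passages into a system prompt that begs the
    model to stay inside them, then append the user's question."""
    context = "\n\n".join(retrieved_chunks)
    system_prompt = (
        "Answer ONLY from the context below. "
        "If the answer is not in the context, say you don't know. "
        "Do not use outside knowledge. Do not make anything up.\n\n"
        f"Context:\n{context}"
    )
    return [
        {"role": "system", "content": system_prompt},
        {"role": "user", "content": question},
    ]
```

Note that nothing in there actually constrains what the model generates; it's a polite request, which is rather the point.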

Also, if reddit is any indication, a lot of people actually think that's all it takes and that the hallucination stuff is just people using LLMs wrong. I mean, it would be insane to pour so much money into something so obviously fundamentally flawed, right?

[–] [email protected] 0 points 1 week ago

Yeah, that method is clearly flawed. Not enough incense and prayers to the Machine God; no wonder the Machine Spirit is displeased. All praise the Machine God of Mars! Praise the Omnissiah!