Main, home of the dope ass bear.
THE MAIN RULE: ALL TEXT POSTS MUST CONTAIN "MAIN" OR BE ENTIRELY IMAGES (INLINE OR EMOJI)
(Temporary moratorium on main rule to encourage more posting on main. We reserve the right to arbitrarily enforce it whenever we wish and the right to strike this line and enforce mainposting with zero notification to the users because it's funny)
A hexbear.net commainity. Main sure to subscribe to other communities as well. Your feed will become the Lion's Main!
Good comrades mainly sort posts by hot and comments by new!
State-by-state guide on maintaining firearm ownership
Domain guide on mutual aid and foodbank resources
Tips for looking at financials of non-profits (How to donate amainly)
Community-sourced megapost on the main media sources to radicalize libs and chuds with
Main Source for Feminism for Babies
Maintaining OpSec / Data Spring Cleaning guide
Remain up to date on what time it is in Moscow
I tried using AI to help find sources for my partner's thesis. It's a niche topic on body phenomenology and existentialism in pregnancy and birth. Instead, it cited Heidegger books that don't even exist. A colleague recommended it, but honestly, you would have to be insane to rely on this.
everyone agrees search engines suck shit now but somehow the "search engine" that just makes shit up is cool :agony:
LLMs in general cannot handle finding quotes, because they can't distinguish quoting a real source from regurgitating its ideas in a slightly different format.
I get so annoyed when people tell me to ask an AI something. It has no knowledge and no capacity for reason. The only thing it can do is produce an output that an inexpert human could potentially accept as true because the underlying statistics favour sequences of characters that, when converted to text and read by a human, appear to have a confident tone. People talk about AI hallucinating wrong answers and that's giving it too much credit; either everything it outputs is a hallucination that's accepted more often than not, or nothing it outputs is a hallucination because it's not conscious and can't hallucinate, it's just printing sequential characters.
It's advanced autocorrect. Calling it AI is an insult to Skynet.
it's not all that far from the "post what your autocorrect completes this sentence as" thing
the llms are considerably more sophisticated sure, but what they do is fundamentally the same
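to make the autocorrect comparison concrete, here's a toy next-word predictor built from bigram counts (purely a hypothetical sketch, not how any particular LLM is implemented, and the tiny corpus is made up). it just counts which word most often follows each word, the same basic idea as a phone keyboard:

```python
from collections import Counter, defaultdict

# Made-up toy corpus; real models train on billions of words.
corpus = "the cat sat on the mat the cat ate the fish".split()

# Count which word follows each word (a bigram table).
follows = defaultdict(Counter)
for a, b in zip(corpus, corpus[1:]):
    follows[a][b] += 1

def predict(word):
    # Return the most frequent follower.
    # Note: pure frequency, no notion of truth or meaning.
    return follows[word].most_common(1)[0][0]

print(predict("the"))  # "cat" follows "the" most often in this corpus
```

the llms replace the count table with a huge neural network and condition on the whole preceding context instead of one word, but the output is still "the statistically likely next token", which is the point being made above.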
The more specific the information, the more it lies.