vivendi

joined 1 month ago
[–] [email protected] 6 points 1 week ago (3 children)

What does suckless have to do with that?

[–] [email protected] 5 points 1 week ago* (last edited 1 week ago)

They need </ToS breaking thoughts/>

[–] [email protected] 25 points 1 week ago

Oi bruv don't shoot strays at us

[–] [email protected] 1 point 1 week ago

Animal fat in general is bad for clogged veins; that's also why fast food and the like moved to plant-based oils

[–] [email protected] 3 points 1 week ago* (last edited 1 week ago)

Computer Science

Looks inside

Probabilities

Cat.png

Turns out our universe is comically probabilistic

(Also I have markovian math this semester. I think medieval torture is a more merciful fate than this shit)

[–] [email protected] 1 point 1 week ago* (last edited 1 week ago)

This is an Avali, some fictional smol space raptor/avian species

[–] [email protected] 2 points 1 week ago

I don't understand why Gemini is such a disaster. DeepMind Gemma works better and that's a 27B model. It's like there are two separate companies inside Google fucking off and doing their own thing (which is probably true)

[–] [email protected] 0 points 1 week ago* (last edited 1 week ago) (1 children)

Not making these famous logical errors

For example, how many Rs are in Strawberry? Or shit like that

(Although that one is a bad example, because token-based models will fundamentally make such mistakes. There is a new technique that lets LLMs process byte-level information, however, which fixes it)
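To illustrate the point above: a tokenizer hands the model subword token IDs, not letters, so spelling questions have to be inferred rather than counted. A byte- or character-level view makes it trivial. A minimal sketch (the subword split below is illustrative, not taken from a real tokenizer):

```python
word = "strawberry"

# Hypothetical BPE-style subword split -- what a token-based model "sees".
# Neither token exposes individual letters to the model.
tokens = ["straw", "berry"]

# At the byte/character level, the count is exact and trivial:
char_count = word.count("r")
print(char_count)  # 3
```

This is why "how many Rs are in strawberry" trips up subword-token models: the answer isn't present in their input representation, only in whatever spelling knowledge they absorbed during training.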

[–] [email protected] 0 points 1 week ago (3 children)

The most recent Qwen model actually works really well for cases like that, but I haven't tested this one myself; I'm going off what some dude on Reddit tested

[–] [email protected] 0 points 1 week ago (5 children)

This is the most "insufferable redditor" stereotype shit possible, and to think we're not even on Reddit
