technohacker

joined 1 year ago
[–] [email protected] 0 points 1 month ago* (last edited 1 month ago) (4 children)

But that's more "Don't underestimate my power" than "You underestimate my power" tho

For added pedanticity, it translates near-literally to "Don't think less of my power"

[–] [email protected] 4 points 1 month ago

THE PLOT NO LONGER CLUMPENS!

[–] [email protected] 29 points 1 month ago (2 children)

I hate that I recognise that video playing on the TV, damn you Gianni

[–] [email protected] 6 points 1 month ago

The Bet, by Anton Chekhov. That story gave me my existentialism

[–] [email protected] 0 points 1 month ago (1 children)

HELL YEAH, BROTHER!

[–] [email protected] 8 points 2 months ago

Aaaaabsolutely.

That being said, the only thing that's getting close to my Sidebery tree tabs is LogSeq's graph, and it's a close competition. Might end up using the two simultaneously

[–] [email protected] 8 points 2 months ago

Oh man, is there a community for Wiki-style battle summaries like this but for non-battles? This one's doing a number on me

[–] [email protected] 2 points 3 months ago

There's a whole bunch in Cities at least. I've seen several in Bangalore

[–] [email protected] 192 points 3 months ago (24 children)

Technology Connections is a nice breath of fresh air in the YouTube space if you want something tech related

[–] [email protected] 3 points 3 months ago

I remember this old website the YouTube team made that visualised how much video time was getting uploaded to YouTube per day over the years of its existence, and it was on the order of several years of footage per day or something. Gotta find that site again

[–] [email protected] 1 points 3 months ago

Add HMD Nokia to the "blocking unlocks completely" camp

[–] [email protected] 7 points 3 months ago* (last edited 3 months ago)

They stand for 16-bit, 8-bit, and 4-bit floating point respectively. Normal floating-point numbers are generally 32 or 64 bits in size, so if you're willing to sacrifice some precision and range, you can save a lot of the space taken up by the model. Oh, and it's about the model rather than the GPU
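To give a rough sense of the savings, here's a quick back-of-the-envelope sketch (the 7-billion parameter count is a hypothetical example, not something from this thread):

```python
# Approximate memory footprint of a model's weights at different precisions.
# Assumes every weight is stored at the given bit width, ignoring any
# per-group scaling metadata that real quantization formats add.
params = 7_000_000_000  # hypothetical 7B-parameter model

for name, bits in [("FP32", 32), ("FP16", 16), ("FP8", 8), ("FP4", 4)]:
    gib = params * bits / 8 / 2**30  # bits -> bytes -> GiB
    print(f"{name}: {gib:.1f} GiB")
```

So halving the bit width halves the weight memory, which is why FP4 models fit on GPUs that could never hold the FP32 original.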
