Ironically 'a'++ works in C/C++ because 'a' is a char, whereas in JS 'a' is a string.
simontherockjohnson
Yeah you're actually right, it's an int in C since K&R C didn't have bool, however it's a bool in C++. I forget my standards sometimes, because like I said this doesn't really matter. It's just nerd trivia.
https://en.cppreference.com/w/cpp/types/type_info/operator_cmp.html
There are plenty of sha1 implementations that are more readable and sensible, and plenty that are less so. This portion is simply a manually unrolled loop (lmao, these gcc nerds haven't even heard of Gentoo) of the hash chunk computation rounds. Hash functions aren't "impenetrable", they're just math. You can write math programmatically in a way that explains the math.
The point of this post is actually things like x[(I-3)&0x0f]. It's entirely the same concept as coercion to manipulate index values this way. What's funny is that void pointer math, function pointer math, and void pointers and function pointers in general are typically seen as "beyond the pale" for whatever reason.
Beyond that, if you know C you know why this is written this way with the parens: C has a fucked up order of operations. For example, a + b == 7 is literally "does adding a and b equal 7", but if you write a & b == 7 you would think it means "does a AND b equal 7", and you'd be wrong. Because == binds tighter than &, it actually parses as a & (b == 7): "does b equal 7, ANDed with a".
Furthermore, a & (b == 7) makes no intuitive sense because b == 7 is a boolean value. Bitwise ANDing a boolean against an int shouldn't work, because the boolean is conceptually 1 bit wide and the int is much wider; there are void bits between the two types. However the standard promotes booleans in these cases to the full integer width, coercing the void bits to 0's to make bitwise ANDing make sense.
Beyond that, asking what the memory size of a variable in C is, is a fool's errand, because the real answer is "it depends" and "it also depends whether someone decided to ignore what it typically depends on (compiler and platform) with some preprocessor fun". Remember how I said void pointers are beyond the pale? Yeah, the typical "why" of that is that what they point to doesn't have a known size, but remember, the size of anything in C is "it depends". 🤷
Almost every language has idiosyncratic stuff like this, but some let you make up your own shit on top of that. These kinda low hanging fruit jokes are just people virtue signaling their nerddom (JS bad am rite guis, use a real language like C), when in reality this stuff is everywhere in imperative languages and typically doesn't matter too much in practice. This isn't even getting into idiosyncrasies based on how computers understand numbers, which is what subtracting from 0x5F3759DF (fast inverse square root) references.
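For reference, the 0x5F3759DF trick is the famous Quake III fast inverse square root. A sketch of it, using memcpy for the type pun instead of the original's pointer cast (which is UB under strict aliasing):

```c
#include <stdint.h>
#include <string.h>

/* Approximate 1.0f / sqrtf(x) via the 0x5F3759DF bit hack plus one
   Newton-Raphson iteration. The magic constant exploits the IEEE 754
   float bit layout: shifting the bits right halves the exponent, and
   subtracting from the constant corrects the result. */
float fast_rsqrt(float x) {
    float half = 0.5f * x;
    uint32_t i;
    memcpy(&i, &x, sizeof i);       /* reinterpret float bits as an int */
    i = 0x5F3759DF - (i >> 1);      /* "what the fuck?", per the original */
    float y;
    memcpy(&y, &i, sizeof y);
    y = y * (1.5f - half * y * y);  /* one Newton step to refine the guess */
    return y;
}
```

One iteration gets you within roughly 0.2% of the true value, e.g. fast_rsqrt(4.0f) is approximately 0.5.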
I thank god every day people who make these comics are too stupid to open gcc's sha1.c because they'd see shit like:
#define M(I) ( tm = x[I&0x0f] ^ x[(I-14)&0x0f] \
^ x[(I-8)&0x0f] ^ x[(I-3)&0x0f] \
, (x[I&0x0f] = rol(tm, 1)) )
#define R(A,B,C,D,E,F,K,M) do { E += rol( A, 5 ) \
+ F( B, C, D ) \
+ K \
+ M; \
B = rol( B, 30 ); \
} while(0)
R( a, b, c, d, e, F1, K1, x[ 0] );
R( e, a, b, c, d, F1, K1, x[ 1] );
R( d, e, a, b, c, F1, K1, x[ 2] );
R( c, d, e, a, b, F1, K1, x[ 3] );
R( b, c, d, e, a, F1, K1, x[ 4] );
R( a, b, c, d, e, F1, K1, x[ 5] );
R( e, a, b, c, d, F1, K1, x[ 6] );
R( d, e, a, b, c, F1, K1, x[ 7] );
R( c, d, e, a, b, F1, K1, x[ 8] );
R( b, c, d, e, a, F1, K1, x[ 9] );
R( a, b, c, d, e, F1, K1, x[10] );
R( e, a, b, c, d, F1, K1, x[11] );
R( d, e, a, b, c, F1, K1, x[12] );
R( c, d, e, a, b, F1, K1, x[13] );
R( b, c, d, e, a, F1, K1, x[14] );
R( a, b, c, d, e, F1, K1, x[15] );
R( dee, dee, dee, baa, dee, F1, K1, x[16] );
R( bee, do, do, dee, baa, F1, K1, x[17] );
R( dee, bee, do, dee, dee, F1, K1, x[18] );
R( dee, dee, dee, ba, dee, F1, K1, x[19] );
R( d, a, y, d, o, F1, K1, x[20] );
And think, yeah, this is real programming. Remember, the difference between being smart and incredibly stupid is what language you write it in. Using seemingly nonsensical coercion and operator overloads is cringe; making your own nonsensical coercion and operator overloads is based.
That's why you should never subtract things from 0x5F3759DF in any language other than C.
This is only really useful in low expressiveness languages where there is not a huge set of language enhancements possible through libraries. Think Java exception handling for example.
In essence it works if your "best practices" are things like "don't use switch statements". It doesn't work if your best practices are things like "use Result<T, E> from this functional result library".
Essentially LLMs don't really work "at scale" if you need anything more complicated than the average internet tutorial code in your language.
Same with perf.
Also, this only works maybe 60% of the time, if that, so the more requirements you pile on, the less likely it will hit all of them properly.
Hardware can't really have "tech debt" in the same way as software. Hardware is a physical entity, each computer is a different computer, they're the same model, the same design, but they're different computers. Each installation of software is a direct copy. If we're on the same architecture and the same version, we're running the same Firefox unless something is wrong with Mozilla.
I think hardware that's outdated is bound to happen. As a hobbyist I have my own share of "outdated hardware". In reality that shit still works. I can pull an old laptop and put Fedora Silverblue on it today and it will work just fine for surfing the web, writing on forums, doing a good amount of hobbyist software stuff, etc.
And therein lies the problem: much of the lifecycle of hardware is directly tied to software support, and typically very strongly to bad commercial software. We can give people reasons to not upgrade, and we'll write better software for it. Some of the best software is effectively eternal; for example, I have used vim my entire professional career, even when I was writing Java.
I think the biggest problem is that there's too much hardware, and too much proprietary hardware, being made nowadays, and not enough hobbyists to give it basic support. For example, unless the landscape changes in 6 years, I will likely have no way to revive full functionality for my M1 Apple silicon.
But that's PCs; the more egregious things are smaller form factor devices. Android has been the biggest disappointment for me, to be honest. What was sold as a "Linux Phone" gave you none of the technical benefits of Linux. So much small form factor stuff essentially becomes ewaste. The platforms that gain hobbyist support are extremely rare and limited. This is exacerbated by tight integration between physical devices and server-side software-as-a-service platforms.
If the libre movement was not a hollowed out husk of its former self, and the economic conditions were able to create a new set of leaders for it, we would have:
- GPLv4, which requires you to license as GPLv4 if you use any remote procedure call, regardless of medium, that executes GPLv4 code
- GPLv4.1, which requires that any device where GPLv4.1 code comes factory installed must have a fully documented and unlocked bootloader and/or user-serviceable firmware flash functionality
- GPLv5, which requires you to license as GPLv5 if you have any use of GPLv5 code in the tool/supply chain of a software; for example, if FoxConn is using gnutls and you use a MacBook, you're licensing as GPLv5, and if you are a GPLv5 compiler, you're licensing as GPLv5
- GPLv6, which makes it legal to execute your landlord if they charge you rent and any GPLv6 code is used by them, directly or indirectly
That would really fix some things regarding ewaste and frankly housing. TBH I think we're gonna see general computing calm the fuck down in the next 10-20 years compared to the onslaught of release cycles in the late 2000's and 2010's. The only real possible driver is going to be if games really glom on to ray tracing bullshit beyond the AAA contractually obligated messes.
I was literally explaining the context of a project I'm working on to a mid-level exec, and I was explaining the incentive structure of our team within the business (with a graph that had directionality and weight). The structure was typical and showed that the strongest incentives (e.g. the things that have the strongest ability to decide roadmaps, implementation, and prioritization) exist, surprise surprise, outside the team. It's incredibly bad because there are like 7 strong outflows in our structure to various stakeholders (lovely bold lines leaving a big box with our org label on it, containing our teams as nodes, tells a great simple story), and only middling and weak inter-team flows. I literally used the term "we have responsibility without authority" in the exec summary. You're 100% preaching to the choir here.
I'm stealing this one, because it's a very apt description:
This is in reference to an ancient linux meme cw: slur