You're hiring junior programmers for $145k a year? Americans have too much money, I swear. In the rest of the world, juniors make less than a third of that, and that's if they're in Europe.
Software engineers in the US can get their total annual compensation packages in the millions at the very very highest levels, or in the 300k range for normal senior engineers who don't dedicate their entire lives to total comp.
We really get hosed here in Europe when it comes to software engineering salaries. It's not the tax rates either, there's just less money in the game.
AI isn't ready to replace just about anybody's job, and it probably never will be technically, economically, or legally viable as a replacement.
That said, the C-suite class are certainly going to try. Not only do they dream of optimizing all human workers out of every workforce, they also desperately need to recoup as much as possible of the sunk cost they've collectively dumped into the technology.
Take OpenAI for example, they lost something like $5,000,000,000 last year and are probably going to lose even more this year. Their entire business plan relies on at least selling people on the idea that AI will be able to replace human workers. The minute people realize that OpenAI isn't going to conquer the world, and instead end up as just one of many players in the slop space, the entire bottom will fall out of the company and the AI bubble will burst.
I had a dude screaming pretty much the same thing at me yesterday on here (on a different account), despite the fact that I'm senior-level, near the top of my field and that all the objective data as well as anecdotal reports from tons of other people says otherwise. Like, okay buddy, sure. People seem to just like fighting things online to feel better about themselves, even if the thing they're fighting doesn't really exist.
I'm a senior BA working on a project to replace some outdated software with a new booking management and payment system. One of our minor stakeholders is an overly eager tech bro who insists on bringing up AI in every meeting; he's gone as far as writing up and sending proposals to me and the project leads.
We all just roll our eyes when a new email arrives. Especially when there's almost no significant detail in these proposals; it's all conjecture based on what he's read online...on tech bro websites.
Oh, and the best part: this guy has no experience in system development or design or anything AI related. He doesn't even work in IT. But he researches AI in his spare time and uses it as a side hustle....
AI is a tool, Ashish is 100% correct in that it may do some things for developers but ultimately still needs to be reviewed by people who know what they're doing. This is closer to the change from punch cards to writing code directly on a computer than making software developers obsolete.
AI isn't ready to replace programmers, engineers, or IT admins yet. But let's be honest: if some project manager or CTO somewhere hasn't already done it, they're at least planning it.
Then, eventually, to save themselves or out of sheer ignorance, they'll blame the resulting chaos on the few remaining people who know what they're doing, because they won't be able to admit or even understand that their bold decision to "embrace" AI and boost the company's bottom line, the one everyone else in their management bubble believes in, has completely mangled whatever system their company builds or uses. More useful people will get fired and more actual work will get shifted to AI. But because that'll still make the number go up, the management types will look even better and the spread of AI will carry on. Eventually every system will become an unwieldy mess nobody can even hope to repair.
This is just IT, I'm pretty sure most other industries will eventually suffer the same fate. Global supply chains will collapse and we'll all get sent back to the dark ages.
TL;DR: The real problem with AI isn't that it'll become too powerful and choose to kill us, but that corporate morons will overestimate how powerful it already is, and that will cause our eventual downfall.
People who think AI will replace X job either don't understand X job or don't understand AI.
Yeah, particularly with CEOs. People don't understand that in an established company (not a young startup), the primary role of the CEO is to take blame for unpopular decisions and resign or be fired so it would seem like the company is changing course.
This was so frustrating to read!
I once asked ChatGPT to write a simple RK2 algorithm in Python. The function could've been about 3 lines followed by a return statement. It gave me some convoluted code that was 3 functions and about 20 lines. AI still has some time to go before it can handle writing code on its own. I've asked Copilot/ChatGPT several times to write code (just for fun) and it always does this.
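For context on the RK2 point: here's a minimal sketch of what a single midpoint-method (RK2) step can look like in Python. The function name rk2_step and the f(t, y) signature are illustrative assumptions, not anything from the thread, but they show how a handful of lines plus a return statement is all it really takes.

```python
def rk2_step(f, t, y, h):
    """One midpoint-method (RK2) step of size h for dy/dt = f(t, y)."""
    k1 = f(t, y)                       # slope at the start of the step
    k2 = f(t + h / 2, y + h / 2 * k1)  # slope estimated at the midpoint
    return y + h * k2                  # advance the solution using the midpoint slope
```

Three computational lines and a return, versus the twenty-odd lines of generated boilerplate described above.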
Definitely bait
I've always said as a software developer that our long-term job is to program ourselves out of a job. In fact, in the long term EVERYBODY is "cooked" as automation becomes more and more capable. The eventual outcome will be that nobody will have to work. AI in its present state isn't ready at all to replace programmers, but it can be a very helpful assistant.
but it can be a very helpful assistant.
It can, but when stuff gets even slightly more complex, being a fast typist is usually more efficient and results in better code.
I guess it really depends on your aspirations for code quality and complexity (yes, it's good at generating boilerplate). For a one-off script I don't care about and that can be quickly written from a prompt, I'll use it.
Working on a big codebase, it doesn't even occur to me to ask an AI; you just can't feed it enough context for it to really generate meaningful code...
I actually don't write code professionally anymore, I'm going on what my friend says - according to him he uses chatGPT every day to write code and it's a big help. Once he told it to refactor some code and it used a really novel approach he wouldn't have thought of. He showed it to another dev who said the same thing. It was like, huh, that's a weird way to do it, but it worked. But in general you really can't just tell an AI "Create an accounting system" or whatever and expect coherent working code without thoroughly vetting it.
Management can't blame AI when shit hits the fan, though. We'll be fine. Either that or everything just collapses back into dust, which doesn't sound so bad in the current times.
That's the beauty of AI tho - AI shit rolls uphill, until it hits the manager who imposed the decision to use it (or their manager, or even their manager's manager).
Lmfao I love these threads. "I haven't built anything myself with the thing I'm claiming makes you obsolete, but trust me, it makes you obsolete"
Pinky is on form!