This assumes it is about output. 20 years of experience tell me it's not about output, but about profits, and those can be increased without touching output at all. 🤷‍♂️
*specifically short-term profits. Executives only care about the next quarter and their own incentives/bonuses. Sure the company is eventually hollowed out and left as a wreck, but by then, the C Suite has moved on to their next host org. Rinse and repeat.
Often they only want the illusion of output, just enough to keep the profits eternally rising.
Even if AI is an actual tool that improves the software development speed of human developers (rather than something whose time savings from automatically writing code get eaten up by reviewing, correcting and debugging the AI-generated output), it's been my experience in almost 30 years of my career as a Software Engineer that every single tooling improvement that makes us capable of doing more in the same amount of time is eaten up by increasing demands on the capabilities of the software we make.
Thirty years ago user interfaces were either CLI or pretty simple with no animations. A software system was just a single application - it ran on one machine with inputs and outputs on that machine - not a multi-tiered octopus involving a bunch of back end data stores, control and data retrieval middle tiers, another tier doing UI generation using intermediate page definition languages, and a frontend rendering those pages to a user and getting user input, probably with some local code thrown into the mix. Ditto for how cars are now mostly multiple programs running on various microcontrollers, with one or more microprocessors in the mix, all talking over a dedicated protocol. Ditto for how your frigging "smart" washing machine talking to its dedicated smartphone app probably involves a third machine in the form of some server from the manufacturer, and the whole thing runs over TCP/IP and the Internet (hence depending on a lot more machines with their own dedicated software, such as routers and DNS servers) rather than some point-to-point direct protocol (such as Serial) like in the old days.
Anyways, the point being that even if AI actually delivers more upsides than downsides as a tool to improve programmer output, that gain is going to be eaten up by increasing demands on the complexity of the software we make - same as the benefits of better programming languages were, and of better IDEs, of the widespread availability of pre-made libraries for just about everything, of templating, of how easy it is to find solutions to the problem one is facing from other people on the Internet, of better software development processes, of source control, of collaborative development tools, and so on.
Funnily enough, for all those things there were always people claiming they would make the life of programmers easier, when in fact all they did was make the expectations on the software being implemented go up, often just in terms of bullshit that's not really useful (the "smart" washing machine using networking to talk to a smartphone app, so that the manufacturer can save a few dollars by not putting as many physical controls in it, is probably a good example).
And it's intentional. Lay off the workers. Implement AI Slop. Slop does sloppy work. Hire back workers as Temps or Contractors. No benefits. Lower pay.
Like all of Capitalism. It's a fucking scam. A conjob. A new innovation in fucking over workers. (Ironically the only "innovation" ever directly produced by Capitalism)
I remember when everyone was saying that companies would need programmers and that every kid should learn programming. Now I realize that companies were promoting that idea so there'd be a surplus of programmers competing with each other, letting companies underpay and swap out workers quickly.
What do you expect? Half of these decision makers are complete idiots who are just good at making money and think that means they're smarter than anyone who makes less than them. They see some new hyped-up tech, they chat with ChatGPT, and they're dumb enough to be floored by its "intelligence" - now they think it can replace workers, and since it's still early, they assume it will quickly surpass the workers. So in their mind, firing ten programmers and saving like two million a year, while only spending maybe a few tens of thousands a year on AI, will be a crazy success that will show how smart they are. And as time goes on and the AI gets better, they'll save even more money. So why spend more money to help the programmers improve, when you can just fire them and spend a fraction of it on AI?
Genuinely a bit shocked to see the number of robolovers in these comments. Very weird, very disheartening. No wonder so much shit online doesn't work properly lol
AI-assisted coding [β¦] means more ambitious, higher-quality products
I'm skeptical, based on my own (limited) experience, my use-cases and projects, and the risk of using code that may include hallucinations.
there are roughly 29 million software developers worldwide serving over 5.4 billion internet users. That's one developer for every 186 users,
That's an interesting way to look at it, and that would be a far better relation than I would have expected. Not every software developer serves internet users though.
I donβt honestly believe that AI can save me time as a developer. Iβve tried several AI agents and every single one cost me time. I had to hold its hand while it fumbled around the code base, then fix whatever it eventually broke.
Iβd imagine companies using AI will need to hire more developers to undo all the damage the AI does to their code base.
I mostly use AI as advanced autocomplete. But even just using it for documentation, it's wrong so often that I don't use it for anything more complex than tutorial-level stuff.
I got pretty far with cursor.com when doing basic stuff where I spend more time looking up documentation than writing code, but I wouldn't trust it with complex use cases at this point.
I check back every 6 months or so, to keep track of the progress. Maybe I can spend my days as a software developer drinking cocktails by the pool yelling prompts into the machine soon, but so far I'm not concerned I'll be replaced anytime soon.
I've found it can just about be useful for "Here's my data - make a schema of it" or "Here's my function - make an argparse interface". Stuff I could do myself but find very tedious. Then I check it, fix its various dumb assumptions, and go from there.
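To make the "tedious but checkable" kind of task concrete, here's a minimal sketch of the sort of argparse wrapper described above. The function name and flags are purely illustrative (not from any real project) - the point is that this is boilerplate you'd still want to review, since an assistant will happily guess wrong defaults:

```python
import argparse

# Hypothetical worker function; returns a description instead of doing real work.
def resize_image(path: str, width: int, height: int, overwrite: bool = False) -> str:
    return f"resize {path} to {width}x{height} (overwrite={overwrite})"

def build_parser() -> argparse.ArgumentParser:
    # The kind of stub an assistant can generate quickly - and the kind of
    # place where its assumptions (required flags, defaults) need checking.
    parser = argparse.ArgumentParser(description="Resize an image.")
    parser.add_argument("path", help="input image file")
    parser.add_argument("--width", type=int, required=True, help="target width in px")
    parser.add_argument("--height", type=int, required=True, help="target height in px")
    parser.add_argument("--overwrite", action="store_true", help="replace the original file")
    return parser

if __name__ == "__main__":
    args = build_parser().parse_args()
    print(resize_image(args.path, args.width, args.height, args.overwrite))
```

Trivial to write by hand, but tedious - which is exactly why it's worth delegating and then double-checking.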
Mostly though it's like working with an over-presumptuous junior. "Oh no, don't do that, it's a bad idea because security! What if (scenario that doesn't apply)" (when doing something in a sandbox because the secured production bits aren't yet online and I need to get some work done while IT fanny about fixing things for people that aren't me).
Something I've found it useful for is as a natural language interface for queries that I don't have the terminology for. As in "I've heard of this thing - give me an overview of what the library does?" or "I have this problem - what are popular solutions to it?". Things where I only know one way to do it and it feels like there's probably lots of other ways to accomplish it. I might well reject those, but it's good to know what else exists.
In an ideal world that information would be more readily available elsewhere but search engines are such a bin fire these days.
I was in the same boat about...3mos ago. But recent tooling is kind of making me rethink things. And to be honest I'm kind of surprised. I'm fairly anti-AI.
Is it perfect? Fuck no. But with the right prompts and gates, I'm genuinely surprised. Yes, I still have to tweak, but we're talking entire features being 80% stubbed in sub 1 minute. More if I want it to test and iterate.
My major concern is the people doing this and not reviewing the code and shipping it. Because it definitely needs massaging...ESPECIALLY for security reasons.
Which tools are you finding success with?
The funny thing is that if AI coding were that good, we would already see widespread adoption in open source projects. But we haven't, because it sucks. Of course commercial software development companies are free to lie about how much they use AI, or get creative with their metrics so they can get their KPI bonuses. So we can't really believe anything they say. But we can believe in transparency.
As always, there are so many people selling snake oil by saying the word AI without actually telling you what they mean. Quite obviously there are a great many tools that one could call AI that can be and are and have been used to help do a ton of things, with many of those technologies going back decades. That's different from using ChatGPT to write your project. Whenever you hear someone write about AI and not give clear definitions, there's a good chance they're full of s***.
You can fucking swear on the internet
How do you know it's not being used to develop open source code?
I have used AI assistance in many things, most of them open sourced, since by default I open source everything I make in my free time. The output code is indistinguishable - same as you wouldn't know whether I'd asked my questions on Reddit, Stack Overflow (RIP) or some other forum. You see the source, not the process I followed to produce it. For all we know, Linux kernel devs might well be asking ChatGPT questions; we wouldn't know.
As for explicitly open source AI-related tools, there are hundreds. So I don't really know what you mean here by "open source projects" not having adopted AI. Do you mean "vibe coding"?
I'm 90% sure it's something to do with the stock market, buybacks, and companies having to do cryptic shit to prop up a fake value for their shares.
Money supply growth is far below average - that's tight monetary policy, so we're going to see a slowing job market.
My theory is that C-suites are actually using "AI efficiency gain" as an excuse for laying off workers without scaring the shareholders.
"I didn't lay off 10% of the workforce because the company is failing. It's because... uhmmmm... AI! I have replaced them with AI! Please give us more money."
It's the next RTO.