this post was submitted on 05 Feb 2024
2 points (100.0% liked)


OK, let's give a little bit of context. I turn 40 in a couple of months, and I've been a C++ software developer for more than 18 years. I enjoy coding, and I enjoy writing "good" code, readable and so on.

However, for the last few months I've become really afraid for the future of the job I love, given the progress of artificial intelligence. Very often I can't sleep at night because of this.

I fear that my job, while not completely disappearing, will turn into a very boring one that consists of debugging automatically generated code, or that it will disappear altogether.

For now, I'm not using AI. A few of my colleagues do, but I don't want to, because one, it removes a part of the coding I like, and two, I have the feeling that using it is sawing off the branch I'm sitting on, if you see what I mean. I fear that in the near future, people who don't use it will be fired because management sees them as less productive...

Am I the only one feeling this way? I get the impression that all tech people are enthusiastic about AI.

top 50 comments
[–] [email protected] 1 points 1 year ago

I've been messing around with running my own LLMs at home using LM Studio, and I've got to say, it really helps me write code. I'm using Code Llama 13B, and it works pretty well as a programming assistant. What I like about using a chatbot is that I go from writing code to reviewing it, and for some reason this keeps me incredibly mentally engaged. This tech has been wonderful for undoing some of my professional burnout.
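
For anyone curious about the setup: LM Studio can expose the loaded model through an OpenAI-compatible server on localhost (port 1234 by default), so a quick review loop is only a few lines of Python. A minimal sketch; the model name and snippet here are placeholders for whatever you have loaded:

```python
# Minimal sketch: ask a local LM Studio model to review a snippet.
# Assumes LM Studio's local server is running on its default address
# (http://localhost:1234/v1) and that the `openai` package is installed.
from openai import OpenAI

client = OpenAI(base_url="http://localhost:1234/v1", api_key="lm-studio")

snippet = """
def mean(xs):
    return sum(xs) / len(xs)  # crashes on an empty list
"""

response = client.chat.completions.create(
    model="codellama-13b-instruct",  # placeholder: whatever model you loaded
    messages=[
        {"role": "system", "content": "You are a careful code reviewer."},
        {"role": "user", "content": f"Review this function:\n{snippet}"},
    ],
)
print(response.choices[0].message.content)
```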

If what keeps you mentally engaged does not include a bot, then I don't think you need any other reason not to use one. As much as I really like the tech, anyone who uses it is still going to need to know the language and enough about the libraries to fix the inevitable issues that come up. I can definitely see this tech getting better to the point of being unavoidable, though. Did you hear that Microsoft is planning to add an AI button to its upcoming keyboards? Like that kind of unavoidable.

[–] [email protected] 0 points 1 year ago (1 children)

This might cheer you up: https://visualstudiomagazine.com/articles/2024/01/25/copilot-research.aspx

I don't think we have anything to worry about just yet. LLMs are nothing but well-trained parrots. They can't analyse problems or have intuitions about what will work for your particular situation. They'll either give you something general copied and pasted from elsewhere or spin you a yarn that sounds plausible but doesn't stand up to scrutiny.

Getting an AI to produce functional large-scale software requires someone to explain precisely the problem domain: each requirement, business rule, edge case, etc. At which point that person is basically a developer, because I've never met a project manager who thinks that granularly.

They could be good for generating boilerplate, inserting well-known algorithms, generating models from metadata, that sort of grunt work. I certainly wouldn't trust them with business logic.
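
To make the "generating models from metadata" bit concrete, here is a hand-rolled sketch of the kind of grunt work I mean; the field list and class name are invented for illustration:

```python
# Sketch: the kind of mechanical model generation an LLM could take off our
# hands. The metadata and class name are made up for illustration.
FIELDS = [("name", "str"), ("email", "str"), ("age", "int")]

def generate_model(class_name: str, fields: list[tuple[str, str]]) -> str:
    """Emit a dataclass definition from a (field name, type name) list."""
    lines = ["from dataclasses import dataclass", "", "@dataclass",
             f"class {class_name}:"]
    lines += [f"    {name}: {type_}" for name, type_ in fields]
    return "\n".join(lines)

print(generate_model("Customer", FIELDS))
```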

[–] [email protected] 1 points 1 year ago

I think you raise a very good point about explaining the problem... Even we "smart humans" often have great difficulty seeing the point when reading PM specs...

[–] [email protected] -1 points 1 year ago
[–] [email protected] 0 points 1 year ago* (last edited 1 year ago)

I'm gonna sum up my feelings on this with a (probably bad) analogy.

AI taking software developers' jobs is the same kind of thinking as microwaves taking chefs' jobs.

They're both just tools to help you achieve the same goal more easily and faster. And sometimes the experts will decide to forgo the tool and do it by hand, for better quality control or for highly complex work the tool can't do a good job at.

[–] [email protected] 0 points 1 year ago (1 children)

As a welder, I've been hearing for 20 years that "robots are going to replace you" and "automation is going to put you out of a job," yadda yadda. None of you code monkeys gave a fuck about me and my job, but now it's a problem because it affects you and your paycheck? Fuck you lmao, good riddance to bad garbage.

[–] [email protected] 0 points 1 year ago

Weirdly hostile, but OK. It's like any other tool that can be used to accelerate a process. Hopefully at some point it's useful enough to streamline the minutiae of boring tasks that a competent intern could do. Not sure who is specifically targeting welders??

If it frees up your time to focus on more challenging stuff or stuff you enjoy, isn't that a good thing? Folks are dynamic and will adjust, as we always have.

Don't think there's a good excuse to come at someone with animosity over this topic.

[–] [email protected] 0 points 1 year ago

I'm a 50+ year old IT guy who started out as a C/C++ programmer in the '90s, and I'm not that worried.

The thing is, all this talk about AI isn't very accurate. There is a huge difference between the LLM stuff that ChatGPT etc. are built on and true AI. These LLMs are only as good as the data fed into them. The adage "garbage in, garbage out" comes to mind. Anybody who blindly relies on them is a fool. Just ask the lawyer who used ChatGPT to write a legal brief. The "AI" made up references to non-existent cases that looked and sounded legitimate, and the lawyer didn't bother to check for accuracy. He filed the brief, and it was the judge who discovered it was a work of fiction.

Now I know there's a huge difference between programming and the law, but there are still a lot of similarities here. An AI-generated program is only going to be as good as the samples provided to it, and you're probably going to want a human to review that code to ensure it's truly doing what you want, at the very least.

I'm also concerned that programming LLMs could be targeted by scammers and the like: train the LLM to harvest sensitive information and obfuscate the code that does it, so that it's difficult for a human to spot the malicious code without a highly detailed analysis of the generated output. That's another reason to want to know exactly what an LLM is trained on.

[–] [email protected] 0 points 1 year ago (1 children)
[–] [email protected] 1 points 1 year ago

I probably should have used an LLM to help me write a clearer question :D

[–] [email protected] 0 points 1 year ago (2 children)

I use GitHub Copilot through work. I generally use Python. It doesn't take anything away, at least for me. It's big thing is tab completion; it saves me from finishing some lines and adding else clauses. Like, I'll start writing a docstring and it'll finish it.

Once in a while I can't think of exactly what I want so I write a comment describing it and Copilot tries to figure out what I'm asking for. It's literally a Copilot.

Now if I go and describe a big system or interfacing with existing code, it quickly gets confused and tends to get in the weeds. But man if I need someone to describe a regex, it's awesome.
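
To give a feel for that comment-driven flow, here's roughly what it looks like; the completion below is hypothetical, not Copilot's literal output, and the regex task is made up:

```python
import re

# You type a comment like the one below; the assistant drafts the body,
# and you shift into reviewing mode. (Hypothetical completion, for
# illustration only.)

# Extract all ISO dates (YYYY-MM-DD) from a block of text.
def extract_iso_dates(text: str) -> list[str]:
    return re.findall(r"\b\d{4}-\d{2}-\d{2}\b", text)

print(extract_iso_dates("Released 2024-02-05, patched 2024-03-01."))
# ['2024-02-05', '2024-03-01']
```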

Anyway, I think there are free alternatives out there that probably work just as well. At the end of the day, it's up to you. Though I'd say don't knock it till you try it. If you don't like it, stop using it.

[–] [email protected] 0 points 1 year ago

This. I've seen SO much hype and FUD and all the while there are thousands of developers grinding out code using these tools.

Does code quality suffer? In my experience, ONLY if they have belt-wielding bean counters forcing them to ship well before it's actually ready for prime time :)

The tools aren't perfect, and they most DEFINITELY aren't a panacea. The industry is in a huge contraction phase right now, so I think we have a while before we have to worry about AI-induced layoffs, and if those happen, the folks doing the laying off are being incredibly short-sighted and are likely to have a high-impact date with a wall in the near future anyway.

[–] [email protected] 0 points 1 year ago

Its* big thing

[–] [email protected] 0 points 1 year ago

It doesn't matter what you think about AI. It's very clear that this technology is here to stay and will only improve. From this point on, AI will become deeply integrated into human culture and technology; after all, we've been fetishizing it for almost 100 years now. Your only logical option as a developer is to learn how to use it and abuse it. Choosing not to is career suicide, possibly even societal suicide, depending on how quickly adoption happens.

You're probably right: in the near future, people who can't use it will be fired. And to that point, they should be fired. Why the fuck would I let my accountants do their financial work on paper when Excel exists?

Welcome to the future.

[–] [email protected] 0 points 1 year ago (1 children)

I'm still in uni, so I can't really comment on how the job market is reacting or will react to generative AI. What I can tell you is that it has never been easier to half-ass a degree. Any code, report, or essay handed in has almost certainly come from an LLM, and it barely works or makes no sense. The only people not using AI are the ones without access to it.

I feel like it was always like this and everyone slacked as much as they could, but I just can't believe it; it's shocking. The lack of fundamental, basic knowledge has made working with anyone on anything such a pain in the ass. Group assignments are dead. Almost everyone else's work comes from a ChatGPT prompt that didn't describe their part of the assignment correctly. As a result, not only is it buggy as hell, but when you actually decide to debug it, you realize it doesn't even do what it's supposed to do, and now you have to spend two full days implementing every single part of the assignment yourself because "we've done our part."

Everyone's excuse is "oh well, university doesn't teach anything useful, why should I bother when I'm learning?", and then you look at their project and it's just another boilerplate React calculator app in which, you guessed it, most of the code is generated by AI. I'm not saying everything in college is useful and you're a sinner for using somebody else's code. Be my guest, dodge classes and copy-paste stuff when you don't feel like doing it, but at least give a damn about the degree you're putting your time into, and don't dump your work on somebody else.

I hope no one carries this kind of attitude toward their work into the job market. If most members of a team are using AI as their primary tool to generate code, I don't know how anyone can trust anyone else on that team, which means more and longer code reviews and meetings, and thus slower production. With that, bootcamps getting scammier, and most companies giving up on junior devs, I really don't think the software industry is headed in a good direction.

[–] [email protected] 0 points 1 year ago

I think I will ask people if they use AI to write code when I am interviewing them for a job and reject anyone who does.

[–] [email protected] 0 points 1 year ago

AI allows us to do more with less, just like any other tool. It's no different from an electric drill or a powered saw. Perhaps in the future we will see more immersive-environment games, because much of the immersive environment can be made with AI doing the grunt work.

[–] [email protected] 0 points 1 year ago (1 children)

I'm a composer. My Facebook is filled with ads like "Never pay for music again!" It's fucking depressing.

[–] [email protected] 0 points 1 year ago* (last edited 1 year ago)

Good thing there's no Spotify for sheet music yet... I probably shouldn't give them ideas.

[–] [email protected] 0 points 1 year ago (1 children)

As someone with deep knowledge of the field, quite frankly, you should know that AI isn't going to replace programmers. Whoever says that is either selling a snake-oil product or their expertise as a "futurologist."

[–] [email protected] 0 points 1 year ago (3 children)

Could you elaborate? I don't have deep knowledge of the field; I only write rudimentary scripts to make some parts of my job easier. But from the few videos on the subject I've seen, and the few times I've asked an AI to write a piece of code for me, I'd say I share the OP's worry. What would you say humans add to programming that can't (and can never) be replaced by AI?

[–] [email protected] 1 points 1 year ago (1 children)

Generative neural networks are the latest tech bubble, and they'll only be decreasing in quality from this point on as the human-generated text used to train them becomes more difficult to access.

One cannot trust the output of an LLM, so any programming task of note is still going to require a developer for proofreading and bug-fixing. And if you have to pay a developer anyway, why bother paying for ChatGPT?

It's the same logic as Tesla's "self-driving" cars, if you need a human in the loop then it isn't really automation, just sparkling cruise control that isn't worth the price tag.

I'm really looking forward to the bubble popping this year.

[–] [email protected] 2 points 1 year ago

This year? Bold prediction.

[–] [email protected] 0 points 1 year ago (1 children)

It can't reason. It can't write novel, high-quality, high-complexity code. It can only parrot what others have said.

[–] [email protected] 0 points 1 year ago (1 children)

90% of code is something already solved elsewhere though.

[–] [email protected] 0 points 1 year ago (1 children)

AI doesn't know whether the code it copies is correct. It will straight-up hallucinate non-existent libraries just because they look good at first glance.
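
One cheap guard against that failure mode: before running generated code, check that every import it names actually resolves. A small sketch using only the standard library:

```python
# Sketch: reject generated code whose imports don't exist locally.
import ast
import importlib.util

def missing_imports(source: str) -> list[str]:
    """Return top-level module names in `source` that can't be found."""
    tree = ast.parse(source)
    modules: set[str] = set()
    for node in ast.walk(tree):
        if isinstance(node, ast.Import):
            modules.update(alias.name.split(".")[0] for alias in node.names)
        elif isinstance(node, ast.ImportFrom) and node.module:
            modules.add(node.module.split(".")[0])
    return [m for m in sorted(modules) if importlib.util.find_spec(m) is None]

generated = "import os\nimport totally_made_up_lib\n"  # made-up example
print(missing_imports(generated))  # ['totally_made_up_lib']
```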

[–] [email protected] 0 points 1 year ago

Depends on how you set it up. A RAG LLM checks its output against a set of sources, so that would be very unlikely with the state of the art.
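
For anyone unfamiliar, the RAG idea is just: retrieve relevant passages from a trusted corpus first, then force the model to answer from those passages. A toy sketch, with word-overlap retrieval standing in for a real embedding index and a made-up three-document corpus:

```python
# Toy RAG sketch: word overlap stands in for a real vector index.
import re

DOCS = [
    "std::vector::reserve preallocates capacity without changing the size.",
    "std::sort requires random-access iterators.",
    "RAII ties resource lifetime to object lifetime.",
]

def tokens(text: str) -> set[str]:
    return set(re.findall(r"\w+", text.lower()))

def retrieve(query: str, docs: list[str], k: int = 1) -> list[str]:
    q = tokens(query)
    # Rank documents by how many query words they share.
    return sorted(docs, key=lambda d: -len(q & tokens(d)))[:k]

def build_prompt(query: str) -> str:
    context = "\n".join(retrieve(query, DOCS))
    return f"Answer ONLY from the context below.\nContext:\n{context}\nQuestion: {query}"

print(build_prompt("how do I reserve vector capacity?"))
```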

[–] [email protected] 0 points 1 year ago (1 children)

I think the need for programmers will always be there, but there might be a transition towards higher abstraction levels. This has actually always been happening: we started with a heavy focus on assembly languages where we put in machine code, but nowadays a much smaller portion of programmers are involved in those; most do stuff in Python, Java, or whatever. It is not essential to know about garbage collection when you are writing an application, because the runtime already does that for you.

Programmers are there to tell a computer what to do. That includes telling a computer how to construct its own commands accordingly. So, giving instructions to an AI is also programming.

[–] [email protected] 0 points 1 year ago

Yeah, that's what I was just thinking. Once we somehow synthesize these LLMs into a new type of programming language, it gets interesting. Maybe a more natural language that gets the gist of what you are trying to do, then a unit test to see if it works, and then you verify. Not sure if that can work.
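
You can sketch that generate-then-test loop today: the natural-language spec is the prompt, and a unit test gates the result. `ask_llm` below is a hypothetical stand-in that returns a canned answer so the example actually runs:

```python
# Sketch of a spec -> generate -> test loop. `ask_llm` is a hypothetical
# stand-in; here it returns a canned answer so the example is runnable.
def ask_llm(spec: str) -> str:
    return "def slugify(s):\n    return '-'.join(s.lower().split())"

SPEC = "Write slugify(s): lowercase s and join its words with hyphens."

source = ask_llm(SPEC)
namespace: dict = {}
exec(source, namespace)  # trusted here only because the code is canned

# The unit test is the contract; a human still reviews before shipping.
assert namespace["slugify"]("Hello World") == "hello-world"
print("spec satisfied")
```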

TBH, I'm a bit shocked that programmers are already using AI to generate code; I only program as a hobby anymore. But it sounds interesting. If I could get more of my ideas done with less work, I'd love it.

I think that, fundamentally and philosophically, there are limits. Ultimately you need language to describe what you want to do. You need to understand the problem the "customer" has, formulate a solution, and then break it down into solvable steps. AI could help with that, but fundamentally it's a question of description and the limits of language.

Or maybe we'll see brain interfaces that can capture some of the subtleties of intent from the programmer.

So maybe we'll see the productivity of programmers rise by 500% or something. But something tells me (Jevons paradox) the economy would just use that increased productivity for more apps or more features. But maybe the required qualifications for programmers will be reduced.

Or maybe we'll see AI generating programming libraries and development suites that are more generalized. Or existing crusty libraries rewritten to be more versatile and easier to use by AI-powered programmers. Maybe AI could help us create a vast library of more abstract, standard problems and solutions.

[–] [email protected] 0 points 1 year ago

Your fear is justified insofar as some employers will definitely aim to reduce their workforce by introducing AI workflows.

When you have worked for the same employer all this time, perhaps you don't know, but a lot of employers do not give two shits about code quality. They want cheap and fast labour, and having fewer people churn out more is a good thing in their eyes, regardless of (long-term) quality. It may sound cynical, but that is my experience.

My prediction is that the income gap will increase dramatically, because good pay will be reserved for the truly exceptional few, while the rest are confronted with yet another tool capitalists will use to increase profits.

Maybe very far down the line there is a blissful utopia where no one has to work anymore. But between now and then, AI would have to get a lot better. Until then, it will mainly be used by corporations to justify hiring fewer people.

[–] [email protected] 0 points 1 year ago

I love LLMs! I'm using them to answer all sorts of bullshit to become a manager... like, here's a bunch of notes, make me a manager's review of Brian. LOL.

I think Google is struggling to control the flood of bullshit from the Internet, and so AI is about to eat their lunch. I've already decided that all AIs are just bullshit, and the only really useful ones are those that can actually search the Internet live. Perplexity AI was doing this for a while, but someone chopped off its balls. I've been looking for a replacement ever since.

I also use it for help with Python, with Linux, with Docker, with SolidWorks, and with stuff around the house like taxes, kombucha, identifying plants, and stupid stuff like that.

But I can definitely see a future where the police are replaced with robo-dogs with laser heads that can run at 120 mph and shoot holes through cars. The only benefit being that the hole doesn't get infected and there's no pool of blood. That future is coming. I'm going to start wearing reflective aluminum shield armor.

[–] [email protected] 0 points 1 year ago (1 children)

I wish your fear were justified! I'll praise anything that can kill work.

Alas, we're not there yet. Current AI is a glorified search engine. The problem it will have is that most code today is unmaintainable garbage, so for now AI can only produce the same: unmaintainable garbage.

First the software industry needs to properly industrialise itself. Then there will be code to copy and reuse.

[–] [email protected] 0 points 1 year ago (1 children)

I'll praise anything that can kill work, under UBI. Without reform, I worry the rich will get richer, the poor will get even poorer, and it ends in guillotines in the square.

[–] [email protected] 0 points 1 year ago (1 children)

Under capitalism the rich get richer and the poor get poorer. That's the whole point of it. Guillotines are one way to get to UBI.

[–] [email protected] 0 points 1 year ago (1 children)

Your last sentence is where I fear we will end up. The very wealthy would be wise to realise it and work on reform themselves.

I disagree that capitalism, at least as I understand it, always leads to the rich getting richer and the poor getting poorer. Many European countries have a happy medium that rewards risk-taking while looking after everyone. While most still slowly get worse on the Gini coefficient, that's driven by pretty much the 0.1% pulling further and further away, while the rest of their societies actually stay roughly the same. So really they only have the top of the top of the top to deal with, whereas a country like the US has a much larger, all-encompassing inequality.

[–] [email protected] 0 points 1 year ago (1 children)

European countries are going fascist one after the other. Why, if there is no problem?

Europe kept capitalism on a leash because communism was there to threaten it. Since the '90s, capitalism has been unleashed and inequalities are rising. The USA didn't have communism to tame its capitalism, because communism was basically forbidden there during the Cold War.

Capitalism is entirely focused on companies making a profit. If you don't have strong states to tame it and redistribute the money, inequalities increase. It's mathematical.

[–] [email protected] 0 points 1 year ago

The rise of fascism has more to do with people's impressions of immigration than it does with capitalism.

Inequality in Europe isn’t rising if you disregard the top 0.1%. It’s the very very top that needs adjusting in Europe.

I agree with your last paragraph. Of course you need rules and redistribution. That doesn’t mean that capitalism, if well regulated, isn’t the most productive or the most effective at increasing wealth for everyone.

[–] [email protected] 0 points 1 year ago

🙄 no I'm sure you're the only one

[–] [email protected] 0 points 1 year ago

Programming is the most automated career in history. Functions / subroutines allow one to just reference the function instead of repeating it. Grace Hopper wrote the first compiler in 1951; compilers, assemblers, and linkers automate creating machine code. Macros, higher level languages, garbage collectors, type checkers, linters, editors, IDEs, debuggers, code generators, build systems, CI systems, test suite runners, deployment and orchestration tools etc... all automate programming and programming-adjacent tasks, and this has been going on for at least 70 years.

Programming today would be very different if we still had to wire up ROM or something like that, and even if the entire world population worked as programmers without any automation, we still wouldn't achieve as much as we do with the current programmer population + automation. So it is fair to say automation is widely used in software engineering, and greatly decreases the market for programmers relative to what it would take to achieve the same thing without automation. Programming is also far easier than if there was no automation.

However, there are more programmers than ever. That's because programming keeps getting easier, and automation decreases the cost of doing things and makes new things feasible. The world's demand for software functionality constantly grows.

Now, LLMs are driving the next wave of automation in the world's most automated profession. However, progress is still slow: without building massive, very energy-expensive models, outputs often need a lot of manual human-in-the-loop work. They are great as a typing assist to predict the next few tokens, and sometimes to spit out a common function you might otherwise have pulled from a library. They can often answer questions about code, quickly find things, and help you recall the name of a function you know exists but can't remember exactly. And they can do simple tasks that involve translating well-specified natural language into code. But in practice, trying to use them for big, complicated tasks is currently often slower than doing it without LLM assistance.

LLMs might improve, but probably not so fast that it's a step change; it will be a continuation of the same trends that have been going on for 70+ years. Programming will get easier, there will be more programmers (even if they aren't called that) using tools including LLMs, and software will continue to get more advanced as demand for more advanced features increases.
