this post was submitted on 05 Feb 2024
2 points (100.0% liked)

Asklemmy


Ok, let's give a little bit of context. I'll turn 40 in a couple of months, and I've been a C++ software developer for more than 18 years. I enjoy coding, and I enjoy writing "good" code: readable and so on.

However, for the past few months I've become really afraid for the future of the job I love, given the progress of artificial intelligence. Very often I can't sleep at night because of this.

I fear that my job, while not disappearing completely, will become a very boring one consisting of debugging automatically generated code, or that it will disappear after all.

For now, I'm not using AI. A few of my colleagues do, but I don't want to, because first, it removes a part of the coding I like, and second, I have the feeling that using it is sawing off the branch I'm sitting on, if you see what I mean. I fear that in the near future, people who don't use it will be fired because management sees them as less productive...

Am I the only one feeling this way? I get the impression that all tech people are enthusiastic about AI.

(page 2) 50 comments
[–] [email protected] 0 points 1 year ago (2 children)

I disagree with the other posts here that you're overreacting. I think that AI will replace most jobs (maybe as many as 85% at some point). Consider becoming a plumber or an electrician. Until robots become commonplace, maybe 20 years from now, you will have a job that AI won't be able to touch much. And people won't run out of asses or gaming. So they'll be stable professions for quite a while. You can still code in your free time, as a hobby. And don't cry over the lost income of being a programmer, because that will happen to everyone affected by AI. You'll just have another job while the others won't. That's the upside.

I understand that this comment isn't what people want to hear given their wishful thinking, so they'll downvote it. But I have to say it how I see it: AI is the biggest revolution since the industrial revolution.

[–] [email protected] 0 points 1 year ago

If you are truly feeling super anxious, feel free to DM me. I have released gen-AI tech, though admittedly I've only been in that space for about a year and a half, and... you're good. Happy to get in depth about it, but genuinely, you are good for so many reasons that I'd be happy to expand upon.

The main point for programmers, though, is that it's expensive as fuck to get any sort of process going that will produce complex systems of code. And frankly, I'm being a bit idealistic there; that's without even considering the amount of time. I love AI, but the hype massively misrepresents the reality of the tech.

[–] [email protected] 0 points 1 year ago

Imagine it's like having an intern under you helping you with everything; the quality of the code will still be on you regardless.

[–] [email protected] 0 points 1 year ago* (last edited 1 year ago) (1 children)

I’m less worried and disturbed by the current thing people are calling AI than I am by the fact that every company seems to be jumping on the bandwagon with zero idea of how it can and should be applied to their business.

Companies are going to waste money on it, drive up costs, and make the consumer pay for it, causing even more unnecessary inflation.

As for your points on job security β€” your trepidation is valid, but premature, by numerous decades, in my opinion. The moment companies start relying on these LLMs to do their programming for them is the moment they will inevitably end up with countless bugs and no one smart enough to fix them, including the so-called AI. LLMs seem interesting and useful on the surface, and a person can show many examples of this, but at the end of the day, it’s regurgitating fed content based on rules and measures with knob-tuning β€” I do not yet see objective strong evidence that it can effectively replace a senior developer.

[–] [email protected] 0 points 1 year ago

Give Copilot or similar a try. AI or similar is pretty garbage at the more complex aspects of programming, but it's great at simple boilerplate code. At least for me, that doesn't seem like much of a loss.

[–] [email protected] 0 points 1 year ago (1 children)

If you are afraid of the capabilities of AI, you should use it. Take one week to use ChatGPT heavily in your daily tasks. Take one week to use Copilot heavily.

Then you can make an informed judgement instead of being irrationally scared of some vague concept.

[–] [email protected] 0 points 1 year ago (1 children)

Yeah, not using it isn't going to help you when the bottom line is all people care about.

It might take junior dev roles, and turn senior dev into QA, but that skillset will be key moving forward if that happens. You're only shooting yourself in the foot by refusing to integrate it into your work flows, even if it's just as an assistant/troubleshooting aid.

[–] [email protected] 0 points 1 year ago* (last edited 1 year ago)

It's not going to take junior dev roles; it's going to transform the whole workflow and make the dev job more like QA than an actual dev job, since the difference between junior, middle, and senior is often only the scope of their responsibility (I've seen companies that make juniors do full-stack senior work while on paper they were still juniors, with a paycheck somewhere between junior and middle dev, and those companies are the majority in rural areas).

[–] [email protected] 0 points 1 year ago (1 children)

I'd like to thank you all for your interesting comments and opinions.

I see a general trend of not being too worried, because of how the technology works.

The worrisome part is what capitalism and management might think, but that's just an update of the old joke: "A product manager is a guy who thinks 9 women can make a baby in 1 month." And anyway, if it weren't this it would be something else; that's how our society is.

Now I feel better, and I understand that my initial reaction of fear about this technology, and rejection of it, was perhaps a very bad idea. I really need to start using it a bit in order to get to know the technology. I've already found some useful use cases that can help me (getting inspiration when naming things, generating repetitive unit test cases, figuring out well-known APIs, ...).

[–] [email protected] 0 points 1 year ago

Many have already touched on this, but you hit the nail on the head with the third paragraph. It's always smart to prepare, but any attempt to use this to reduce workers will go horribly. Saving isn't crazy in this regard, but I wouldn't plan on it being long-term until LLMs become less expensive, have better reasoning, and, most importantly, can handle longer context windows without their performance degrading. These aren't easy problems to solve; they brush up against fundamental limits of the tech.

[–] [email protected] 0 points 1 year ago (1 children)

Have you seen the shit code it confidently spews out?

I wouldn't be too worried.

[–] [email protected] 0 points 1 year ago (1 children)

Well, I have seen it; I even code-reviewed it without knowing. When I asked my colleague what had happened, he said, "I used ChatGPT. I'm not sure I understand what this does exactly, but it works." I must confess that after the code review comments, not much was left of the original stuff.

[–] [email protected] 0 points 1 year ago (3 children)

If I am going to poke small holes in the argument: the exact same thing happens every day when coders google a problem, find a solution on Stack Exchange or the like, and copy/paste it into the code without understanding what it does. Yes, it was initially written by someone who understood it, but the end result is the same: code implemented without understanding its inner workings.

[–] [email protected] 0 points 1 year ago

So far it is mainly an advanced search engine; someone still needs to know what to ask it, interpret the results, and correct them. Then there's the task of fitting it into an existing solution / landscape.

Then there's the 50% of non-coding tasks you have to perform once you're no longer a junior. I think it'll mainly be useful for getting less experienced developers productive faster, but they'll require more oversight from experienced devs.

At least for the way things are developing at the moment.

[–] [email protected] 0 points 1 year ago

Thought about this some more so thought I’d add a second take to more directly address your concerns.

As someone in the film industry, I am no stranger to technological change. Editing in particular has radically changed over the last 10 to 20 years. There are a lot of things I used to do manually that are now automated. Mostly what it’s done is lower the barrier to entry and speed up my job after a bit of pain learning new systems.

We’ve had auto-coloring tools since before I began and colorists are still some of the highest paid folks around. That being said, expectations have also risen. Good and bad on that one.

Point is, a lot of times these things tend to simplify/streamline lower level technical/tedious tasks and enable you to do more interesting things.

[–] [email protected] 1 points 1 year ago

I'm in IT and I don't believe this will happen for quite a while, if at all. That said, I wouldn't let this keep you up at night; it's out of your control, and worrying about it does you no favours. If AI really can replace people, then we are all in this together and we will figure it out.

[–] [email protected] -1 points 1 year ago

I'm not really losing any sleep over this myself. The current approach to machine learning is really no different from a Markov chain. The model doesn't have any understanding in a meaningful sense. It just knows that certain tokens tend to follow certain other tokens, and when you have a really big token space, that produces impressive-looking results.
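The "tokens tend to follow certain other tokens" point can be made concrete with a toy first-order Markov chain. This is a minimal sketch for illustration only, nothing like how a production LLM is built:

```python
import random
from collections import defaultdict

def train(tokens):
    """Record which token follows which: a first-order Markov chain."""
    table = defaultdict(list)
    for cur, nxt in zip(tokens, tokens[1:]):
        table[cur].append(nxt)
    return table

def generate(table, start, max_len, seed=0):
    """Continue from `start` by repeatedly sampling an observed next token."""
    rng = random.Random(seed)
    out = [start]
    while len(out) < max_len:
        followers = table.get(out[-1])
        if not followers:
            break
        out.append(rng.choice(followers))
    return out

table = train("the cat sat on the mat".split())
print(" ".join(generate(table, "the", 5)))
```

With a vocabulary of six words the output is gibberish; the "impressive looking results" only appear once the token space and training data get enormous.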

However, a big part of the job is understanding what the actual business requirements are, translating those into logical steps, and then into code. This part of the job can't be replaced until we figure out AGI, and we're nowhere close to doing that right now.

I do think that the nature of work will change, I kind of look at it as sort of doing a pair programming session. You can focus on what the logic is doing, and the model can focus on writing the boilerplate for you.

As this tech matures, I do expect that it will result in fewer workers being needed to do the same amount of work, and the nature of the job will likely shift towards being closer to a business analyst, where the human focuses more on the semantics than on implementation details.

We might also see new types of languages emerge that leverage the models. For example, I can see a language that allows you to declaratively write a specification for the code, and to encode constraints such as memory usage and runtime complexity. Then the model can bang its head against the spec until it produces code that passes it. If it can run through thousands of solutions in a few minutes, it's still going to be faster than a human coming up with one.
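That spec-driven loop could be sketched roughly like this in Python, where `propose_candidate` is a hypothetical stand-in for the code-generating model and the "spec" is just a set of executable checks:

```python
def meets_spec(fn):
    """Executable 'spec': the function must sort any list of ints."""
    cases = [[3, 1, 2], [], [5, 5, 1], [-1, 0]]
    return all(fn(list(case)) == sorted(case) for case in cases)

def propose_candidate(attempt):
    """Hypothetical stand-in for a code-generating model: each call
    proposes another candidate implementation."""
    candidates = [
        lambda xs: xs,                  # wrong: returns input unchanged
        lambda xs: list(reversed(xs)),  # wrong: reverses instead of sorting
        lambda xs: sorted(xs),          # satisfies the spec
    ]
    return candidates[attempt % len(candidates)]

def search(max_attempts=1000):
    """Bang candidate solutions against the spec until one passes."""
    for attempt in range(max_attempts):
        candidate = propose_candidate(attempt)
        if meets_spec(candidate):
            return candidate
    return None
```

A real version would also need the spec to encode the memory and runtime constraints mentioned above, which is much harder than functional checks; this only shows the shape of the loop.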

[–] [email protected] 1 points 1 year ago (1 children)

So, I asked ChatGPT to write a quick PowerShell script to find the number of months between two dates. The first answer it gave me took the number of days between them and divided by 30. I told it that it needed to be more accurate than that, so it wrote a while loop to add 1 month to the first date until it was larger than the second date. Not only is that obviously the most inefficient way to do it, but it had no check to ensure the first date was actually the smaller one, so you could just end up with zero. The results I got from Copilot were not much better.

From my experience, unless there is existing code to do exactly what you want, these AIs are not at the level of an experienced dev. Not by a long shot. As they improve they'll obviously get better, but like with anything, you have to keep up and adapt in this industry or you'll get left behind.
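For what it's worth, the calculation in question needs no loop at all. A minimal Python sketch, assuming "months between" means the difference in whole calendar months with the day-of-month ignored:

```python
from datetime import date

def months_between(start, end):
    """Whole calendar months from `start` to `end` (day-of-month ignored)."""
    if end < start:
        raise ValueError("end must not be earlier than start")
    return (end.year - start.year) * 12 + (end.month - start.month)

print(months_between(date(2023, 11, 15), date(2024, 2, 1)))  # 3
```

Note the explicit check for reversed arguments, which is exactly the guard the generated loop was missing.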

[–] [email protected] 0 points 1 year ago* (last edited 1 year ago)

The thing is that you need several AIs: one to write the question, so the one that codes gets the question you actually want answered, and a third one to write checks and follow up on the code written.

When run in a feedback loop like this, the quality you get out will be much higher than just asking ChatGPT to make something.
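A rough Python sketch of such a feedback loop, with `ask_model` as a hypothetical stand-in for the real LLM calls (stubbed with canned replies here so the sketch runs):

```python
def ask_model(role, prompt):
    """Hypothetical LLM call, stubbed with canned replies. In practice
    each role would be a separate model or system prompt."""
    if role == "refiner":
        return f"Write a Python function that can {prompt}."
    if role == "coder":
        return "def add(a, b):\n    return a + b"
    if role == "reviewer":
        return "ok"
    raise ValueError(f"unknown role: {role}")

def pipeline(task, max_rounds=3):
    """Refiner rewrites the question, coder answers it, reviewer checks
    the result; loop until the reviewer is satisfied or we give up."""
    question = ask_model("refiner", task)
    for _ in range(max_rounds):
        code = ask_model("coder", question)
        verdict = ask_model("reviewer", code)
        if verdict == "ok":
            return code
        question += f"\nReviewer feedback: {verdict}"
    return None
```

Each round costs another set of model calls, which ties back to the earlier point in this thread about such pipelines being expensive to run.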

[–] [email protected] 0 points 1 year ago (1 children)

I use AI heavily at work now. But I don't use it to generate code.

I mainly use it instead of googling and skimming articles to get information quickly and allow follow up questions.

I do use it for boring refactoring stuff though.

In its current state it will never replace developers. But it will likely mean you need fewer developers.

The speed at which our latest juniors can pick up a new language or framework by leaning on LLMs is quite astounding. It's definitely going to be a big shift in the industry.

At the end of the day our job is to automate things so tasks require less staff. We're just getting a taste of our own medicine.

[–] [email protected] 0 points 1 year ago (1 children)

I mainly use it instead of googling and skimming articles to get information quickly and allow follow up questions.

I do use it for boring refactoring stuff though.

Those are also the main use cases I use it for.

Really good for getting a quick overview over a new topic and also really good at proposing different solutions/algorithms for issues when you describe the issue.

Doesn't always respond correctly but at least gives you the terminology you need to follow up with a web search.

Also very good for generating boilerplate code. Like here's a sample JSON, generate the corresponding C# classes for use with System.Text.Json.JsonSerializer.

Hopefully the hardware requirements will come down as the technology gets more mature or hardware gets faster so you can run your own "coding assistant" on your development machine.

[–] [email protected] -1 points 1 year ago

That's been my experience as well; it's faster to write a query for a model than to google and go through a bunch of blogs or Stack Overflow discussions. It's not always right, but that's also true for stuff you find online. The big advantage is that you get a response tailored to what you're actually trying to do, and like you said, if it's incorrect, at least now you know what to look for.

And you can run pretrained models locally already if you have a relatively beefy machine. FauxPilot is an example. I imagine in a few years running local models is going to become a lot more accessible.

[–] [email protected] 1 points 1 year ago (4 children)

I'm not worried about AI replacing employees.

I'm worried about managers and bean counters being convinced that AI can replace employees.

[–] [email protected] 1 points 1 year ago* (last edited 1 year ago)

This is their only retaliation for the fact that managers have already been replaced by git tools and CI.

[–] [email protected] 0 points 1 year ago

That will happen. And if they're wrong, they'll crash and burn. That's how tech bubbles burst.

[–] [email protected] 0 points 1 year ago

Copilot is just so much faster than me at generating code that looks fancy and also manages to maximize the number of warnings and errors.

[–] [email protected] 0 points 1 year ago (2 children)

It'll be like outsourcing all over again. How many companies outsourced, then walked it back several years later and only hire in the US now? It could be really painful short-term if that happens (if you consider several years to a decade short-term).

[–] [email protected] 0 points 1 year ago (2 children)

Given the degree to which first-level customer service is required to stick to a script, I could see over half of call centers being replaced by LLMs over the next 10 years. The second level service might still need to be human, but I expect they could be an order of magnitude smaller than the first tier.

[–] [email protected] 0 points 1 year ago

I was a supervisor at a call center up until recently, and yeah, this is definitely coming. It was already to the point where they were arguing with me about hiring enough people because soon we'd have an AI solution to take a lot of the calls. You can already see it in the chat bots coming out.

[–] [email protected] 0 points 1 year ago (5 children)

They're supposed to be on script, but customers veer off the script constantly. They would be extremely annoyed to be talking to an AI. Not that that would stop some companies, but it would be terrible customer service.

[–] [email protected] 0 points 1 year ago* (last edited 1 year ago)

Do you know any examples of companies that have done this? I'm not asking to be facetious, just genuinely curious.
