"Self driving cars will make the roads safer. They won't be drunk or tired or make a mistake."
Self driving cars start killing people.
"Yeah but how do they compare to the average human driver?"
Goal post moving.
The issue is that the powers that be aren't comparing ChatGPT to a reference manual, but rather to another human being.
Why would we compare it against an average coder?
ChatGPT wants to be a coding aid/reference material. A better baseline would be the top-rated answer for the question on Stack Overflow, or whether the answer exists in the first 3 Google search results.
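Something like this, schematically (all of the helpers below are hypothetical stand-ins I made up to show the shape of the comparison, not real APIs):

```python
from typing import Callable

def score(questions: list[str],
          answer: Callable[[str], str],
          is_correct: Callable[[str, str], bool]) -> float:
    """Fraction of questions an answer source gets right."""
    return sum(is_correct(q, answer(q)) for q in questions) / len(questions)

# Hypothetical stand-ins: a real study would call the model, fetch the
# top-voted Stack Overflow answer, and grade with tests or human review.
def ask_chatgpt(question: str) -> str:
    return "..."

def top_stackoverflow_answer(question: str) -> str:
    return "..."

def is_correct(question: str, answer: str) -> bool:
    return False

questions = ["How do I reverse a list in Python?"]  # placeholder benchmark
print("ChatGPT:", score(questions, ask_chatgpt, is_correct))
print("Top SO answer:", score(questions, top_stackoverflow_answer, is_correct))
```

The point is just that the baseline column exists at all; "wrong 52% of the time" means nothing without the same number for the thing you'd otherwise reach for.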
Cause people aren't looking at ChatGPT as a simple, accurate code generator, but rather as a junior dev.
Or a textbook's explanation
We need a comparison against an average coder. Some fucking baseline ffs.
Sure, but by randomly guessing code you'd get 0%. Getting 48% right is actually very impressive for an LLM compared to just a few years ago.
Exactly, I also find that it tends to do a pretty good job of pointing you in the right direction. It's way faster than googling or going through sites like Stack Overflow because the answers are contextual. You can ask about a specific thing you want to do and get an answer that gives you a general idea of what to do. For example, I've found it to be great for crafting complex SQL queries. I don't really care if the answer is perfect, as long as it gives me an idea of what I need to do.
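A rough sketch of what I mean (toy schema I made up for illustration; the real queries are obviously messier):

```python
import sqlite3

# Toy example: "largest order per customer" -- the kind of window-function
# query ChatGPT gets roughly right, which you then adapt to your real schema.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE orders (id INTEGER PRIMARY KEY, customer TEXT, total REAL);
    INSERT INTO orders (customer, total) VALUES
        ('alice', 40.0), ('alice', 95.5), ('bob', 12.0), ('bob', 60.0);
""")

query = """
    SELECT customer, total
    FROM (
        SELECT customer, total,
               ROW_NUMBER() OVER (PARTITION BY customer ORDER BY total DESC) AS rn
        FROM orders
    ) AS t
    WHERE rn = 1;
"""
for row in conn.execute(query):
    print(row)  # e.g. ('alice', 95.5), ('bob', 60.0)
```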
You can also play with it to try and get closer to correct. I had problems getting an Excel macro working and getting unattended-upgrades working on my Pi-hole. GPT was wrong at first, but it got me partly there, and I could massage the question and Google and get closer to the right answer. Without it, I wouldn't have been able to get any of it working, especially the macro.
Just useful enough to become incredibly dangerous to anyone who doesn't know what they're doing. Isn't it great?
Now non-coders can finally wield the foot-gun once reserved only for coders! /s
Truth be told, computer engineering should really be something that one needs a licence to do commercially, just like regular engineering. In this modern era, where software can be as ruinous to someone's life as shoddy engineering, why is it not like this already?
Look, nothing will blow up if I mess up my proxy setup on my machine. I just won't have internet until I revert my change. Why would that be different if I were getting paid for it?
Nothing happens if you fuck up your proxy, but if you develop an app that gets very popular and don't care about security, hackers may be able to take control of your whole server and do a lot of damage. If you develop software for critical infrastructure, it can actually cost human lives if you fuck up your security systems.
Yes, but people with master's degrees also fuck this up, so it's not like some accreditation system will solve the issue of people making mistakes
Yeah, but it's probably more likely that the untaught might fuck up some stuff.
Is it, though? A lot of self-taught programmers do great work. I'm not sure this is true
Setting up a proxy is not engineering.
I have to actually modify the code to properly package it for my distro, so it's engineering because I have to make decisions about how things work.
I don't see how this supports your point then. If "setting up a proxy" means "packaging it to run on thousands of user machines", then isn't there obvious and huge potential for a disastrous fuckup?
No, because it either runs the program successfully, or it fails to launch. I don't mess with the protocol. It runs as root because it needs to set the iptables rules when turned on to act as a "global" proxy.
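Schematically, the "global proxy" part is just this (a simplified sketch; the port and exact rules are illustrative, not copied from the actual package):

```python
import subprocess

# Simplified sketch: push outgoing TCP traffic through a local proxy port via
# the nat table, and remove the rule when the proxy is switched off.
# A real setup also has to exempt the proxy's own traffic (e.g. with an
# owner/uid match) so it doesn't loop back into itself.
PROXY_PORT = "1080"  # illustrative local port, not the package's actual one

RULE = ["-p", "tcp", "!", "-d", "127.0.0.1",         # leave loopback alone
        "-j", "REDIRECT", "--to-ports", PROXY_PORT]  # hand traffic to the proxy

def proxy_on() -> None:
    subprocess.run(["iptables", "-t", "nat", "-A", "OUTPUT", *RULE], check=True)

def proxy_off() -> None:
    subprocess.run(["iptables", "-t", "nat", "-D", "OUTPUT", *RULE], check=True)
```

Which is also why it has to run as root: the nat table isn't writable otherwise.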
It’s pretty fun, interesting times ahead. I wonder what kind of bullshit will take place and can’t wait to see it lol. Between the climate, AI, and warmongering, the future won’t be boring, guys, that is certain. Unpack your popcorn.
It's programming spell check
In the short term it really helps productivity, but in the end the reward for working faster is more work. Just doing the hard parts all day is going to burn developers out.
I program for a living and I think of it more as doing the interesting tasks all day, rather than the mundane and repetitive. ChatGPT and GitHub Copilot are great for getting something roughly right that you can tweak to work the way you want.
I think we must change the way we see AI. A lot of people see it as the holy grail that can do everything we can do, even though it can't. AI is a tool for humans to become more efficient in their work. It can do easy tasks for you and sometimes assist you with harder stuff. It is the same as with mathematicians and calculators. A good mathematician is able to calculate everything he needs without a calculator, but the calculator makes him much more efficient at calculating stuff. The calculator didn't replace mathematicians, because you still have to know how to do the stuff you're doing.
You forgot the "at least" before the 52%.
I find that thumbnail with the "fail" funny. I'm actually surprised that it got 48% right.
Probably more than 52% of what programmers type is wrong too
We mostly suck in emails.
The one time it was helpful at work was when I used it to thank a person who left a company we work with and wish them well. I couldn't come up with a good response and ChatGPT just spat real good stuff out in seconds. This is what it's really good for.
"ChatGPT just spat real good stuff out in seconds"
There's an entire episode of South Park centered around this premise.
Yeah things that follow a kind of lexical "script" that you don't want to get creative with would be pretty easy to generate. Farewells, greetings, dear Johns, may he rest in peaces, etc etc
ChatGPT: "I'm happy for you though, or sorry that happened"