The chatbot that's wrong 50% of the time? That's hard to believe.
I can't wait to be starving
There's no way capitalism and those who buy into it can responsibly use AI
Maybe people should learn to fight for a better tomorrow instead of just trying to get through today
And how many major corporations that invested in NFTs are still doing so?
I'm not doing that at all; this is my personal experience.
Clearly the most reasonable response. Did you work for Apple back in the day?
It doesn't need to be right to make money, often more money than companies get by paying people to do a job properly.
... Yes, it does in the tech sector. If you're wrong, it doesn't work.
I've tried the tools out. You go from writing code for an hour and debugging for half an hour to writing code for 15 minutes and debugging for three hours.
Half the time you've ripped out literally every bit of code the AI wrote by the time you're done making it work.
Imagine training what's going to replace you lol
Nah, I have standards
Oh, I know how to prompt AI. Getting it to spit out workable code doesn't mean you can skip reviewing it or making sure it's integrated correctly.
You also have to make sure it's not generating blatantly braindead code, which makes the review and debugging cycle take longer.
I remain unconvinced that it's suitable for domains where there is a right and wrong answer, like engineering or law.
I've found more value in the systems that do a good job understanding the problem description and then returning references to documentation and prior art on techniques, as opposed to the actual code.
I don't need a virtual junior dev I have to hand-hold; I actually have those, and mine get better. I want a virtual "person who worked on something like this once and knows the links to the good articles".
By what metric did you improve 600%?
I mean, I can imagine it. It's the Industrial Revolution all over again, but cyberpunk style.