LLMs should augment your skills, not substitute for them. Anything else is just laziness or incompetence.
Or, worst-case scenario, it means your job is replaceable.
Why do people not review their LLM's output?
I had a professor using worksheets watermarked by another professor at a college in another state. Do y'all think anything came of it? He also gave us all the answers to the tests in the form of self-graded quizzes and let us take them into tests.
HS diplomas became a joke, degrees are becoming a joke...
As someone who was a TA for a bit, I think that's 99% because if schools tried to hold students accountable to the standards of even ten years ago, they would have to fail two-thirds of their students.
High school becoming a joke means none of the kids have strong enough core skills to tackle real college work by the time they get there, but schools can't afford to enforce actual quality standards. The grading model has completely fallen apart at this point, given how steep the curve is. The quality of work that gets an A today would have been a B or high C 10-15 years ago. Of course there is real A-grade work being done too, but what counts as an A has ballooned to a ridiculous degree, such that most of it is not really A-grade work.
The problem isn't new; it was already bad 10 years ago, to be honest. I had a professor in community college about 10 years ago who had previously taught at ASU, and she had quit teaching there specifically because the university wouldn't allow anyone to be graded below a C, regardless of whether they did any work or not.
Most large public universities are degree mills at this point, or at least bordering on it.
I'm currently doing an online Master's with Northeastern. Honestly, I'm not surprised this happened; the quality of the classes is WILD.
I'm taking 2 classes per term, and each term so far one class has been very well designed but also insanely easy, while the other has been so poorly implemented that the course learning materials don't actually help you do the coursework.
Probably the most astonishing thing so far, though, is that a course I'm taking now just served me the exact same assignment I did for a course I just finished. Granted, both classes are electives, so not everyone will take both, but come on... and they grill me about plagiarism with every submission I make...
For fuck's sake, people, it's not hard. AI can be useful for generating drafts or giving suggestions, but ultimately everything has to be tweaked or written by an actual human expert. AI is a tool, not a product. If something isn't edited enough that no trace of the AI signature is left, you're being lazy and putting out garbage.
it's "hard" because every peddler of AI is pushing it exactly in the way you say, and I agree, is wrong
Time after time, I see people who should know better fail at basic things like this.
Even I don't get called out for AI-written responses, even though a large number of my messages here are technically written by AI. The key difference is that I actually take the time to write a first draft of what I want to say, then run it through ChatGPT to help clean up my word salad - and finally, I go over the output again to make it sound like me. The thinking is mine. AI just helps me communicate more clearly.
I’d never ask it to write an entire response from scratch without providing structure or points I want to make. All I want is for the person reading my message to understand what I’m actually trying to say - so they can respond to that, not to a misinterpretation of what I was trying to say.
I'll just leave that first draft here to illustrate my point:
Time after time I see people that should know better to fail at basic things like this.
Even I don't get called out for AI responses even though a huge number of my messages posted here are technically written by AI. However, the difference here is that I actually took time to first write the first draft of what I want to say only then to give it for chatGPT to make sense of my word salad only for me to then go over it's output to make it sound like me again. The thinking is done by me - AI only helps me to communicate more clearly. I'd never ask it to write the entire response from ground up without providing any structure and points about what I want to say. All I want is the person reading this message to get as clear of an understanding as possible of what I'm trying to say so that they can respond to that rather than to misintrepretation of what I was trying to say.
This is it exactly. I use ChatGPT to double-check things when I'm second-guessing myself, and I use it to make assignments.
Almost every time I need to tweak things, but it turns 40 minutes of work into 5-10 minutes.
If I see a representative or senator using ChatGPT, could I demand that he resign from his position?