Microblog Memes
A place to share screenshots of Microblog posts, whether from Mastodon, tumblr, ~~Twitter~~ X, KBin, Threads or elsewhere.
Created as an evolution of White People Twitter and other tweet-capture subreddits.
Rules:
- Please put at least one word relevant to the post in the post title.
- Be nice.
- No advertising, brand promotion or guerrilla marketing.
- Posters are encouraged to link to the toot or tweet etc in the description of posts.
In terms of grade school, essays and projects were of marginal or nil educational value and they won't be missed.
Until the last 20 years, 100% of the grade for medicine was by exams.
This is fair if you're just copy-pasting answers, but what if you use the AI to teach yourself concepts and learn things? There are plenty of ways to avoid hallucinations and data-poisoning, and to obtain scientifically accurate information from LLMs. Should that be off the table as well?
We weren't verifying things with our own eyes before AI came along either, we were reading Wikipedia, text books, journals, attending lectures, etc, and accepting what we were told as facts (through the lens of critical thinking and applying what we're told as best we can against other hopefully true facts, etc etc).
I'm a Relaxed Empiricist, I suppose :P Bill Bailey knew what he was talking about.
Well that disqualifies 95% of the doctors I've had the pleasure of being the patient of in Finland.
It's not just LLMs they're addicted to, it's bureaucracy.
but elected president..... you SOB, I'm in!
My hot take on students graduating college using AI is this: if a subject can be passed using ChatGPT, then it's a trash subject. If a whole course can be passed using ChatGPT, then it's a trash course.
It's not that difficult to put together a course that cannot be completed using AI. All you need is to give a sh!t about the subject you're teaching. What if the teacher, instead of assignments, had everyone sit down at the end of the semester in a room, and had them put together the essay on the spot, based on what they've learned so far? No phones, no internet, just the paper, pencil, and you. Those using ChatGPT will never pass that course.
As damaging as AI can be, I think it also exposes a lot of systemic issues with education. Students feeling the need to complete assignments using AI could do so for a number of reasons:
- students feel like the task is pointless busywork, in which case a) they are correct, or b) the teacher did not properly explain the task's benefit to them.
- students just aren't interested in learning, either because a) the subject is pointless filler (I've been there before), or b) the course is badly designed, to the point where even a rote algorithm can complete it, or c) said students shouldn't be in college in the first place.
Higher education should be a place of learning for those who want to further their knowledge, profession, and so on. However, right now college is treated as this mandatory rite of passage to the world of work for most people. It doesn't matter how meaningless the course, or how little you've actually learned, for many people having a degree is absolutely necessary to find a job. I think that's bullcrap.
If you don't want students graduating with ChatGPT, then design your courses properly, cut the filler from the curriculum, and make sure only those are enrolled who are actually interested in what is being taught.
Your 'design courses properly' loses all steam when you realize there has to be an intro level course to everything. Show me math that a computer can't do but a human can. Show me a famous poem that doesn't have pages of literary critique written about it. "Oh, if your course involves Shakespeare it's obviously trash."
The "AI" is trained on human writing, of course it can find a C average answer to a question about a degree. A fucking degree doesn't need to be based on cutting edge research - you need a standard to grade something on anyway. You don't know things until you learn them and not everyone learns the same things at the same time. Of course an AI trained on all written works within... the Internet is going to be able to pass an intro level course. Or do we just start students with a capstone in theoretical physics?
AI is not going to change these courses at all. These intro courses have always had all the answers all over the internet already far before AI showed up, at least at my university they did. If students want to cheat themselves out of those classes, they could before AI and will continue to do so after. There will always be students who are willing to use those easier intro courses to better themselves.
These intro courses have always had all the answers all over the internet already far before AI showed up, at least at my university they did.
I took a political science class in 2018 that had questions the professor wrote in 2010.
And he often asked the questions to be answered before we got them in the class. So sometimes I'd go "what the fuck is he referencing? This wasn't covered. It's not in my notes."
And then I'd just check the question and someone already had the answers up from 2014.
The problem is that professors and teachers are being forced to dumb down material. The university gets money from students attending, and you can’t fail them all. It goes with that college being mandatory aspect.
Even worse at the high school level. They put students who weren't capable of doing freshman algebra in my advanced physics class. I had to reorient the entire class into "conceptual/project based learning" because it was clearly my fault when they failed my tests. (And they couldn't be bothered to turn in the projects either.)
To fail a student, I had to have the parents sign a contract and agree to let them fail.
Yes if people aren't interested in the class or the schooling system fails the teacher or student, they're going to fail the class.
That's not the fault of new "AI" things, that's the fault of (in America) decades of underfunding the education system and saying it's good to be ignorant.
I'm sorry you've had a hard time as a teacher. I'm sure you're passionate and interested in your subject. A good math teacher really explores the concepts beyond "this is using exponents with fractions" and dives into the topic.
I do say this as someone who had awful math teachers, as a dyscalculic person. They made a subject I already had a hard time understanding boring and uninteresting.
Who's gonna grade that essay? The professor has vacation planned.
I'm unsure if this is a joke or not, I apologize.
A good use I've seen for AI (or particularly ChatGPT) is employee reviews and awards (military). A lot of my coworkers (and subordinates) have used it, and it's generally a good way to fluff up the wording for people who don't write fluffy things for a living (we work on helicopters, our writing is very technical, specific, and generally with a pre-established template).
I prefer reading the specifics and can fill out the fluff myself, but higher-ups tend to want "how it benefitted the service" and fitting in the terminology from the rubric.
I don't use it because I'm good at writing that stuff. Not because it's my job, but because I've always been into writing. I don't expect every mechanic to do the same, though, so having things like ChatGPT can make an otherwise onerous (albeit necessary) task more palatable.
Okay but I use AI with great concern for truth, evidence, and verification. In fact, I think it has sharpened my ability to double-check things.
My philosophy: use AI in situations where a high error-rate is tolerable, or if it's easier to validate an answer than to posit one.
There is a much better reason not to use AI -- it weakens one's ability to posit an answer to a query in the first place. It's hard to think critically if you're not thinking at all to begin with.
It’s funny how everyone is against using AI for students to get summaries of texts, pdfs etc which I totally get.
But during my time through medschool, I never got my exam paper back (ever!). The exam was a test where I needed to prove that I have enough knowledge, but the exam should also be allowed to show me where my weaknesses are so I could work on them. But no, we never got our papers back. And this extends beyond medschool: exams like the USMLE are long and tiring, and at the end of the day we just want a pass, another hurdle to jump over.
We criticize students a lot (rightfully so) but we don't criticize the system where students only study because there is an exam, not because they are particularly interested in the topic at hand.
A lot of topics that I found interesting in medicine were dropped because I had to sit for other examinations.
because doing that enables pulling together 100% correct answers and leads to cheating? Having an exam review where you get to see the answers but not keep the paper might be one way to do this?
galileosballs is the last screw holding the house together i swear