this post was submitted on 07 Sep 2024
TechTakes
Big brain tech dude got yet another clueless take over at HackerNews etc? Here's the place to vent. Orange site, VC foolishness, all welcome.
This is not debate club. Unless it’s amusing debate.
For actually-good tech, you want our NotAwfulTech community
The learning facilitators they mention are the key to understanding all of this. The school still needs humans in the room to maintain discipline and make sure the kids actually engage with the AI. But roles that were once teachers have been redefined as "learning facilitators", and apparently former teachers have rejoined the school in these new roles.
Like a lot of automation, the main selling point is deskilling roles, reducing pay, and making people more easily replaceable (you don't need a teaching qualification to be a "learning facilitator" to the AI), while producing a worse service that is just good enough as long as it's wrapped in difficult-to-verify claims and assumptions about what education actually is. Of course it also means you get a new middleman parasite siphoning off funds that used to flow to staff.
There is a lot of benefit to be had, though. It will likely suck at first, and I think the tendency to outsource this kind of thing is idiotic. The government needs to be both the AI administrator AND the company, because AI is extremely privacy-invasive and should never be commercialized in any capacity with kids. I don't support even the school having full access to a child's prompting. I say this because I have intimate knowledge of what kind of information can be accessed this way and how invasive it is; I only run my own open-source models on my own offline hardware. The only person within a school with full access to a child's prompting should be someone bound to confidentiality and something like a Hippocratic oath, such as a licensed psychiatrist with no obligations or bias toward the school's petty interests.
The education system as it stands is largely antiquated. I'm all for supporting my community with living-wage jobs. Our reductionist culture is a big part of why we are falling apart: when we are presented with efficiency improvements, we are too stupid to adapt and too stupid to use them as a resource. We flush away that newly created value instead of immediately investing it in ourselves.
The world has moved on from the era when the traditional teacher was relevant. Audiovisual information is our primary form of communication. With video readily available, it is criminal to keep presenting static information as live lectures. A live presentation has no chance of matching the quality of a polished, edited video, and any given lecturer is very unlikely to be the best person at presenting that material. That framing also glosses over the enormous range of personalities and ways of thinking among students; it is extremely unlikely that any given teacher connects well with each individual student. We have had readily available video communication for over a decade, and some university professors already use the medium, offering class time as more of a workshop or lab environment. Most primary schools lack that kind of adoption of technology, complexity, and efficiency needed to keep up with a changing world. In truth, we don't even require teachers to be lifelong learners.
I expect much the same Luddism with AI. Teaching kids pushes AI to the point where it needs serious supervision to be effective, and maintaining a child's autonomy and right to privacy is absolutely critical for the future of society as a whole. However, the ability of AI to adapt to any style of thinking and help with individualized problem-solving is something no teacher can do with more than one student at a time.
Most of us had to persist through our frustration in order to learn. AI can directly and individually address that frustration and find a solution. It is not always correct, but it is in the same realm of accuracy as an above-average teacher. Maybe you too were aware of just how many teachers did not even know the subjects they were tasked with teaching in primary school; I certainly was.
christ
it doesn’t do this
I’m sorry your teachers sucked badly enough that you could replace them with a prerecorded video and a statistical language model that’s notorious for generating confident, dangerous lies. I don’t think most kids should have that kind of experience in school, though, and if they currently do, maybe we should do what it takes (funding, regulation, strikes) to not go in that direction.
The thing is, technology could absolutely play a huge role in advancing education, letting students approach material at their own pace and (algorithmically, not black-box bullshit) adjusting problem sets so they get the most out of the learning.
But the point is to free the actual teacher to spend their time one-on-one, helping students in the areas where they need extra attention, not to replace the teacher with some unreliable bullshit machine.
(It should also probably be only part of the schedule; group settings have a lot of value in plenty of contexts, both for the material and for the social stuff.) But you could absolutely enhance learning.
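To make the "algorithmically, not black-box" part concrete, here's a minimal sketch. Everything in it is hypothetical and invented for illustration (the StudentModel class, the thresholds, the 1–5 level range), not any real ed-tech product's API; the point is only that the whole adaptation rule fits in a few lines a teacher, parent, or student could read and audit.

```python
from collections import deque
from dataclasses import dataclass, field

MASTERY_THRESHOLD = 0.8    # recent accuracy at or above this -> raise difficulty
STRUGGLE_THRESHOLD = 0.5   # recent accuracy below this -> lower difficulty
WINDOW = 5                 # number of recent attempts considered per skill
MIN_LEVEL, MAX_LEVEL = 1, 5


@dataclass
class StudentModel:
    """Fully inspectable state: recent correctness and a difficulty level per skill."""
    history: dict = field(default_factory=dict)  # skill -> deque of bools
    level: dict = field(default_factory=dict)    # skill -> current difficulty

    def record(self, skill: str, correct: bool) -> None:
        """Log one attempt on a skill."""
        self.history.setdefault(skill, deque(maxlen=WINDOW)).append(correct)

    def next_level(self, skill: str) -> int:
        """Choose the next problem's difficulty with a plain threshold rule."""
        current = self.level.get(skill, MIN_LEVEL)
        attempts = self.history.get(skill, deque())
        if len(attempts) < WINDOW:
            return current  # not enough evidence yet; stay at the current level
        accuracy = sum(attempts) / len(attempts)
        if accuracy >= MASTERY_THRESHOLD:
            current = min(current + 1, MAX_LEVEL)
        elif accuracy < STRUGGLE_THRESHOLD:
            current = max(current - 1, MIN_LEVEL)
        self.level[skill] = current
        return current


if __name__ == "__main__":
    student = StudentModel()
    for correct in [True, True, False, True, True]:
        student.record("fractions", correct)
    # 4 of the last 5 attempts correct (0.8 >= MASTERY_THRESHOLD), so step up to level 2
    print(student.next_level("fractions"))
```

Real adaptive-learning approaches (spaced repetition and the like) are fancier than this, but the appeal is the same: the rule is explicit, deterministic, and explainable, which an LLM is not.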
No, it can't.
Quod grātīs asseritur, grātīs negātur: what is asserted without evidence may be denied without evidence.
We've been using "video communication" to teach for half a century at least; Open University enrolled students in 1970. All the advantages of editing together the best performances from a top-notch professor, moving beyond the blackboard to animation, etc., etc., were obvious in the 1980s when Caltech did exactly that and made a whole TV series to teach physics students and, even more importantly, their teachers. Adding a new technology that spouts bullshit without regard to factual accuracy is necessarily, inevitably, a backward step.