this post was submitted on 24 Oct 2024
Technology
Media literacy was never the problem. It wasn't actual confusion about what was real that drew people into the extreme alt-right sphere; it was confirmation bias that let people choose not to critically assess the content for veracity.
But I don't think you can solve this through "media ecology" either. Curating this volume of content is impossible, and there are legitimate dangers in giving the government too much power to shut down free speech (see Germany condemning any form of pro-Palestinian rhetoric as antisemitic) in order to guard "truth".
I think this is similar to the issue of biased GenAI: you can't fix bias on the output side. You have to build a society that doesn't want to engage with bigotry and that explores and questions its own assumptions (and that's never a fixed state; it's an ongoing process).
I'd say that consuming only content that confirms your biases is itself a failure of media literacy, because you're failing to see the biases inherent in the content you're consuming. The flaw is thinking "it's this way," seeking out content that agrees, and eating it up without noticing that it's poorly sourced (if sourced at all).
Not to mention the role content-suggestion algorithms play in reinforcing that behavior, serving you ever more content aligned with the biases you're already confirming.
It's a complicated can of worms for sure.
You are underestimating people, I think. People choose their echo chambers because they understand that their positions are being challenged elsewhere. It's not an inability to see the bias in what they consume; it's a dislike of the alternative.
Every Trumper I talk to knows very well that Trump is unpopular, that Christian Nationalism is unpopular, that abortion rights are popular, etc, but they don't care, and they don't want to constantly be (rightfully) told and shown how dumb they are, so they wall themselves off in their gardens. "I'm just tired of hearing how bad Trump is all the time."
Agreed. We have already ceded more than enough control to the government in other areas of our lives. We now have alternative social platforms that give us a chance at real, direct control over our media landscape, something that hasn't been true in a long time.
I think this is what they were trying to get across when they mentioned media ecology: the structure of where media is shared, and what its sources are, can matter more for quashing disinformation than the content itself.
So when something is shared through YouTube, there are certain pressures that, over time, mold the source of information into a specific format.
I'd say the same is true of the Fediverse. That's why it's important we get the structure here right, because it will determine what kind of platform this place turns into.
Edit: grammar
~Anti~ ~Commercial-AI~ ~license~ ~(CC~ ~BY-NC-SA~ ~4.0)~