On Section 230 and what it means for freedom of speech - can it be solved with decentralized protocols?

Hey, everyone!

I’m back with something interesting I recently came across. I was reading a post on the Freedom of Speech subreddit about Section 230 and whether it should be repealed in the U.S. It’s a fascinating topic because it touches on many issues affecting online media today (many already covered on this forum). But first, some context.

What exactly is Section 230? It’s part of the Communications Decency Act (1996), and it gives online platforms immunity from being held liable for user-generated content. Essentially, it allows platforms to moderate content without being treated as the “publisher.” In simpler terms, platforms like Facebook are viewed as carriers of information (similar to a phone company), so they aren’t held responsible for the content they host.

That’s a huge deal because it enables platforms to host a wide range of user-generated content without the fear of endless lawsuits. The traditional press doesn’t get that luxury. Without Section 230, platforms might end up censoring or blocking controversial content to avoid legal risks.

Now, here’s where the debate comes in. Some people argue for keeping Section 230, saying it protects free speech. They believe that without it, smaller platforms couldn’t handle the legal risks, and big companies would tighten censorship, shrinking the diversity of online discourse. This, they argue, would harm both free speech and innovation. Of course, there’s also the political side of things, with the MAGA crowd claiming that repealing Section 230 is a Democratic scheme to control the media, like in North Korea or something. I don’t buy into that extreme rhetoric, but I do agree that repealing Section 230 could make it harder for non-mainstream views to get out there. And I’m not talking about illegal stuff like child pornography, but about alternative opinions on topics like climate change, gender, and other hot-button issues. Whether you agree with these views or not, they deserve a chance to be aired and publicly debated.

On the flip side, there are people pushing to repeal Section 230. From what I’ve seen, there are two main arguments for this:

  1. Unfair protection for certain companies: Platforms get liability protections like common carriers but still act like publishers with editorial control. This gives them an edge over traditional publishers who don’t have the same legal shield. That creates an uneven playing field, which wasn’t the original intention when Section 230 was passed.
  2. Amplification of misinformation: Critics like Roger McNamee argue that Section 230 allows platforms to spread fake news and low-quality content as long as it drives engagement and boosts ad revenue. They believe repealing or reforming Section 230 could help curb the spread of misinformation.

It’s a complex issue, no doubt. Personally, I think both sides have valid points—leaving the political mudslinging aside. We need to hold platforms accountable for the content they profit from, especially since most social media companies make billions in ad revenue and are not just neutral carriers. They actively curate content through algorithms. But at the same time, we need to ensure that controlling content doesn’t lead to a more restricted internet, where unpopular or controversial opinions are stifled.

In my opinion, the real issue arises when platforms are both carriers and editors—which is what’s happening now. Meta, Google, TikTok, and others not only own the infrastructure to share content (the feed) but also manipulate it through algorithms. That’s a problem we probably didn’t anticipate. It’s like a phone company being able to decide what conversations you can have. Not ideal.

However, I believe there’s some hope. Decentralized technologies could help us separate these two roles. For example, we could create an open, decentralized protocol that functions as a neutral carrier, open to anyone for any purpose. Whatever is built on top of that protocol would then be the editor, responsible for the content it shares. If someone posts illegal content, like Nazi speech in Germany or child pornography in the U.S., the editorial service hosting it would be held accountable, but the neutral protocol wouldn’t be. That way, nobody could hide behind Section 230, while free access to the Internet’s infrastructure is preserved for everybody.
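To make that separation concrete, here’s a toy sketch in Python (all names are illustrative, not any real protocol’s API): the relay carries every post without judgment, and moderation lives entirely in the client layer, where each editor applies its own policy.

```python
# Toy model: a neutral "carrier" protocol plus editorialized clients.
# Purely illustrative; not based on any real protocol's API.

class Relay:
    """Neutral carrier: stores and forwards every post, no judgment."""
    def __init__(self):
        self.posts = []

    def publish(self, author, text):
        self.posts.append({"author": author, "text": text})

    def feed(self):
        return list(self.posts)


class Client:
    """Editor: applies its own moderation policy to the shared feed."""
    def __init__(self, relay, policy):
        self.relay = relay
        self.policy = policy  # callable: post -> bool (True = show it)

    def timeline(self):
        return [p for p in self.relay.feed() if self.policy(p)]


relay = Relay()
relay.publish("alice", "a perfectly normal post")
relay.publish("bob", "BANNED-WORD spam spam spam")

# Two editors, two policies, one shared carrier.
strict = Client(relay, lambda p: "BANNED-WORD" not in p["text"])
open_client = Client(relay, lambda p: True)

print(len(strict.timeline()))       # strict policy hides bob's post
print(len(open_client.timeline()))  # the relay itself filtered nothing
```

The point of the sketch is where liability would sit: the `Relay` never decides anything, so it stays a neutral carrier, while each `Client` owns (and answers for) its editorial choices.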

That seems like a promising solution. So far, big tech giants have controlled everything, and it’s not working out too well. Decentralized networks could provide a better way forward by placing responsibility where it belongs, without creating the kind of chaos we have now.

By the way, this is exactly what Bluesky is doing. It’s a social network built as a client on top of the open, federated AT Protocol (atproto), and it’s not the only one; other clients are in the works too. I believe Olas is moving in this direction as well, and I think that’s a fantastic idea! I’m really excited to see how things develop over the next few months.

3 Likes

Hey B, nice to see you posting again!

I totally get where you’re coming from with the idea that the platforms built on top of a decentralized protocol should be responsible for the content shared. But honestly, I think that would still lead to censorship, like what we’re seeing now.

You mentioned Section 230(c)(2) allowing platforms to remove content they find “objectionable,” and that’s exactly where the gray area comes in. It’s tricky because what one platform sees as objectionable might just be a controversial opinion someone else wants to debate. I mean, of course, stuff like child porn or harmful speech should never be allowed, but “objectionable” can be way too subjective.

I really think moderation is important, but it needs to be super clear and based on established guidelines, not left up to interpretation. Otherwise, we’re just going to keep seeing the same issues with people being censored for no clear reason.

Maybe it’s naive, but I’m all for an incentive system that rewards truthful, verifiable facts, with the community doing some of the fact-checking. If you can build a system where everyone benefits from being truthful, I think that could work way better in the long run.
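For what it’s worth, the kind of incentive system you describe could look something like this toy sketch (entirely illustrative: a simple majority vote stands in for real fact-checking, and a real system would need sybil resistance, stakes, appeal paths, and so on). Reviewers who match the community consensus gain reputation; dissenters lose some.

```python
# Toy reputation system: reviewers earn or lose reputation depending on
# whether their verdict on a claim matches the eventual community consensus.
# Illustrative only; majority consensus is a crude stand-in for verification.

from collections import Counter

def settle_claim(votes, reputation, reward=1.0, penalty=1.0):
    """votes: dict of reviewer -> 'true'/'false' verdicts.
    The majority verdict wins; agreeing reviewers gain reputation,
    dissenters lose some. Returns the consensus verdict."""
    tally = Counter(votes.values())
    consensus, _ = tally.most_common(1)[0]
    for reviewer, verdict in votes.items():
        delta = reward if verdict == consensus else -penalty
        reputation[reviewer] = reputation.get(reviewer, 0.0) + delta
    return consensus

rep = {}
consensus = settle_claim({"ana": "true", "ben": "true", "cho": "false"}, rep)
print(consensus)  # majority verdict
print(rep)        # ana and ben gain, cho loses
```

Over many claims, reviewers who are consistently right accumulate reputation, which is the “everyone benefits from being truthful” property you’re after, at least in this simplified form.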

2 Likes

Jumping in here—interesting points you both have. But honestly, why even bother with all these restrictions? The whole idea of moderation and gatekeeping just leads to more control, no matter how you spin it. If we’re serious about free speech, we need a free-for-all. Let people post whatever they want, and let the public decide what’s worth engaging with. Yes, that means some bad stuff will slip through, but trying to control it only concentrates power and silences voices that don’t fit the mainstream narrative.

Moderation systems always end up biased. The best way to counter misinformation? More speech, not less. Let the community handle it. If something’s wrong, people will call it out. Free speech should mean free speech, not “free speech within boundaries.”

1 Like

Great breakdown on Section 230! It’s such a complex topic, and you hit on some key issues—especially how social media platforms act as both “carriers” and “editors.” Totally agree that it’s a messy situation that centralized platforms haven’t handled well.

Love the idea of decentralized protocols as a solution! They could separate content responsibility from infrastructure ownership, making things a lot clearer. Platforms like Bluesky and maybe even Olas could pave the way for a more balanced, transparent approach. Let’s see where it goes!

1 Like

I hear you, but I think a free-for-all is way too utopian, man. In that kind of scenario, the ones with the most influence or resources will just end up controlling the narrative, and they won’t be held accountable for it. We’d probably see herd mentality take over, with the crowd following whatever gets the most attention, regardless of whether it’s true.

What we need is a proper reward and incentive system where moderation and accountability come from within the community. That way, bad actors would naturally get filtered out as people call them out or ignore them. It’s not about letting chaos reign; it’s about making sure there’s a balance between free expression and accountability. The system has to evolve, but it can’t be a total free-for-all.

A free-for-all approach is a bad idea because it would lead to the same issues we’re currently facing: coordinated efforts to turn open discussion spaces into political platforms for spreading fake news and disinformation. To be honest, freedom of speech won’t be compromised by editorialized platforms built on top of open protocols. If you disagree with a service’s editorial stance, simply create your own and let people decide what they prefer.

Additionally, editorial oversight is necessary from a customer perspective. Browsing and curating all the information available online is impractical for most of us—it’s practically a full-time job. That’s why people follow and trust editors: they curate content that aligns with their interests. Naturally, editors will have an editorial line, but that’s part of the package. Most readers understand and accept that.

Ultimately, the problem arises when “carriers” also act as “editors,” creating a market distortion fostered by Web2 technologies. Open, decentralized protocols have the potential to change this dynamic for the better.

Bluesky, in my opinion, is a successful experiment in this regard, and many others will follow. I also want to share an interview with its CEO, in which she lays out some intriguing ideas about the future of social media as federations rather than centralized platforms. It’s definitely worth reading: Federation is the future of social media, says Bluesky CEO Jay Graber - The Verge

I hope you enjoy it. Maybe I will create a new topic out of it. But only if you like it :smiley:

I agree with this and think Bluesky is a great project! Reminds me of the XMPP protocol (also known as Jabber).

2 Likes