Hi Olasians,
As many of you probably know, I’m an avid Reddit user. I believe it’s still one of the few places on the internet where you can engage in civil discussions and learn from others, unlike the dumpster fire that X has become (I don’t even consider Facebook a social media platform anymore). Recently, there’s been some buzz about Reddit shutting down certain subreddits, and people have been flocking to the FreeSpeech subreddit to vent their frustrations.
First off, I support Reddit’s decision. One of the subreddits they closed was literally called “menshouldrape.” If you disagree with that move, please tell me why such vile content should be allowed in a public space. It’s absolutely despicable. Now, in some cases, the reasons for shutting down subreddits may not be as clear, which has led to a lot of complaints and outrage. What people often forget is that Reddit is a private company, and like any private platform, they can and should curate content—especially if it involves preventing the spread of harmful material like rape apologia.
A Reddit user summed this up well:
“Social media accounts are all about ‘your profile’ or ‘your photos’—but if you read the terms and conditions, you’ll quickly discover that anything uploaded becomes their content, not yours. These T&Cs are essential; without them, companies would have no control over what happens on their site.
A good analogy is a bar: the public can come in, socialize, buy drinks, and listen to music. You might write something on a beer mat or pick songs on the jukebox, but none of that gives you ownership of the bar, the beer mat, or the jukebox. And if you break the bar’s rules, they can ban you. If you’re banned, there are always other bars to go to. But if you’re banned from every bar in town—that’s a violation of your rights.
Social media works the same way. Getting banned from one site (or part of it) isn’t a violation of free speech, just like getting banned from one bar isn’t. Every social media platform has its own rules, and users agree to follow them when they sign up.”
I have no objections to this—it is what it is. This is the reality when data transmitters and content editors are one and the same, as we’ve discussed in other threads. However, it raises interesting questions about how things would work on a decentralized platform like Olas, where data transmitters and content editors are distinctly separate.
What if someone decided to build a platform promoting harmful content, like rape apologia, on top of Olas? Would the Olas governing body step in and disconnect it? Who would even be part of this governing body? And what if a judge orders a particular page on Olas to be shut down? Could that even be enforced? Right now, in many countries, internet providers comply with legal orders to cut off access to certain sites. I’m curious to hear if any of you have thought about how Olas would handle such situations and what approach you plan to take.
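For what it’s worth, here’s a rough sketch of how I picture that separation in my head. It’s purely hypothetical (the names Relay, FrontEnd, and Message are mine, not anything from the actual Olas design): the transmitter relays everything it receives, while each front-end applies its own moderation policy before showing anything to its users. A judge or a governing body could then target a specific front-end without the underlying transport ever making editorial decisions.

```typescript
// Hypothetical sketch only: not Olas's actual architecture.
// It models the distinction between a data transmitter, which relays
// everything, and content editors (front-ends), which each decide
// what their users get to see.

type Message = {
  id: string;
  author: string;
  body: string;
};

// Data transmitter: stores and forwards messages with no editorial judgment.
class Relay {
  private log: Message[] = [];

  publish(msg: Message): void {
    this.log.push(msg); // no filtering at the transport layer
  }

  fetchAll(): Message[] {
    return [...this.log];
  }
}

// Content editor: a front-end that curates the relay's feed for its users.
// The moderation policy lives here, not in the transport.
class FrontEnd {
  constructor(
    private relay: Relay,
    private blockedAuthors: Set<string>,
    private bannedTerms: string[],
  ) {}

  timeline(): Message[] {
    return this.relay
      .fetchAll()
      .filter(
        (m) =>
          !this.blockedAuthors.has(m.author) &&
          !this.bannedTerms.some((t) => m.body.toLowerCase().includes(t)),
      );
  }
}

// Two front-ends over the same relay can enforce different rules.
const relay = new Relay();
relay.publish({ id: "1", author: "alice", body: "hello world" });
relay.publish({ id: "2", author: "troll", body: "something vile" });

const strict = new FrontEnd(relay, new Set(["troll"]), ["vile"]);
const permissive = new FrontEnd(relay, new Set(), []);

console.log(strict.timeline().map((m) => m.id));     // ["1"]: the curated view
console.log(permissive.timeline().map((m) => m.id)); // ["1", "2"]: the raw feed
```

Whether that resembles anything the Olas team actually has in mind, I’d genuinely love to hear.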