Hacker News

No, not exactly. Unlike in the print age, we all have the ability to set up a home server or distributed servers; we can reach anyone in the world. But we don't have a natural right to extend our readership through someone else's platform. I can sit here and host whatever I'd like. In the BBS era, you were responsible for what people posted on your board, illegal stuff and bullshit propaganda alike, and it should still be that way. Zuckerberg should be treated like any asshole sysop who isn't tending his domain.

Hacker News does, to a large degree, moderate what people write here, much more so than any social media platform. That's why it's still a functional platform. You can't just come on here and libel people or espouse defamatory conspiracy theories. You won't get far.

I'm not saying the gov't should regulate it! Not at all. I'm saying the content distribution networks shouldn't be shielded from civil litigation. HN and Facebook and Twitter aren't common carriers. If they simply delivered messages from point A to point B, they might not be liable; but the more a service like the FB news feed chooses to re-distribute something, the more it should be held liable for that content. Simply putting a post on top and letting it be downvoted, like HN does, is far less nefarious than re-targeting it and repeating it relentlessly to the most vulnerable 15% of dumbasses who'll believe it. One person posting something and having it sink isn't a big deal. But re-targeting that post to others makes you a re-poster. These companies aren't the postal service. They aren't neutral. Re-posting lies has monetary, reputational and social costs. Facebook profits directly from massively re-posting lies, but bears none of those costs. All I'm saying is that no one in a position to decide what gets re-posted should be allowed to profit from spreading disinformation with one hand while externalizing the costs onto society with the other, without a neutral arbiter willing to assess the harm done to the individuals affected.

[edit] The current system can't support free speech, because shielding the re-printers of false speech makes it impossible to disentangle truth from propaganda; and on an asymmetrical field, propaganda always wins. Letting the courts sort it out, and putting the social media networks on notice that they're responsible for veracity, would settle this silly debate over whether "free speech" is being quashed on private networks, and would also encourage better forms of debate that conform to certain standards of logic. And if you don't want to conform to any kind of logic, you can always set up your own server.



The established social media sites have lawyers on retainer and will appeal every single case up to the Supreme Court if necessary, as slowly as they like, just to wear down anyone bringing a civil suit.

Opening up user-submitted content sites to civil damages would ensure that the largest social networks are the only ones that can afford to fight these cases in court. Smaller sites would self-censor, even if they weren't likely to be targets of a civil suit. This would further entrench the largest social media sites, since they would go to bat for their users at least some of the time. They have to be seen to support their users, or else those users would just move to an alternative host that isn't subject to the jurisdiction of the civil suits.

This whole idea seems like a nonstarter: impossible to implement, carrying a laundry list of unintended consequences, and counterproductive to your stated goal of reducing propaganda. Instead of living primarily on centralized social media, fake news would be relegated to smaller fringe sites where it can't be monitored as effectively as on the larger sites, further contributing to the echo chambers you argue against. Your intentions also seem antithetical to free speech between willing participants freely associating.


I think you've jumbled together a lot of counter-arguments here, some of which are interesting and others purely speculative. But let's start where you wound up: the claim that it's "dangerous" to push fake news onto smaller fringe sites hasn't been borne out. I've seen a lot of damage arise from the mainstreaming of conspiracy theories from fringe sites onto major sites. I don't see any evidence that if the crazy grifters peddling those theories get shunted back to smaller sites, a significant portion of their audience will follow. The history of Facebook has been a pattern of people who have no idea how to find information being fed bad information. Most of those people won't find the fringe sites. They weren't there before. Their attention span is short. They're only dangerous as a herd.

The original argument against pushing extremists onto fringe sites was that fringe sites are dark to law enforcement. I don't buy that argument either. It's a weak anti-encryption argument, and I don't believe those sites are anywhere near as dark as LEOs claim. Some chatter may exist in the clear, but no one currently plotting terrorist attacks out in the open on Facebook will suddenly switch to Telegram just because Facebook becomes party to civil suits.

Working back to your previous argument: the idea that the big social networks' legal juggernauts will protect them while smaller sites self-censor directly contradicts what you say about fake news migrating to fringe sites. There's certainly less financial burden on a small, low-traffic site to regulate what's posted, and as it stands, small sites already have to regulate what's posted, so it wouldn't be much harder for them. The only people who currently have broad exemptions from liability for what's posted are the big social media sites. So let them spend that good money on their lawyers.


It's not clear that smaller sites would face fewer lawsuits if these civil suits were allowed to proceed: they would be flooded with new users, and with new lawsuits, as soon as the large sites became hostile to that content. There are already web3 social media sites that require login with a crypto wallet instead of a traditional user account, so it wouldn't be too difficult to build a site that's immune to such lawsuits even at small scale. I'm not sure why you think more lawyer paydays and violations of the rights to freedom of speech and freedom of assembly are the answer. If a site doesn't want to censor First Amendment-protected speech on its platform, but the government demands it, that's a First Amendment violation.



