Hate Online Censorship? It's Way Worse Than You Think.
A new red army is here: Widespread secret suppression scales thought police to levels not seen since the days of the Nazis and the Communists, and it is time to speak up about it.
Social media services frequently suppress your content without telling you. Even experts on shadowbanning do not have the full story. As Elon Musk recently put it, “there’s a massive amount of secret suppression going on at Facebook, Google, Instagram… It’s just nonstop.” He’s right.
For example, when a YouTube channel removes a comment, the comment remains visible to its author as if nobody intervened.
Social media services employ thousands of moderators and enable legions more to secretly suppress your content. Some people volunteer to “moderate” for over 10 hours per day. Even users themselves unknowingly participate when they click “report,” since reports often automatically suppress content without notifying authors. Volunteering to moderate is not in itself harmful, but secretly removing “disinformation” is not doing your civic duty.
The most pernicious censorship is the kind we don’t discover. Services that refer to themselves as “a true marketplace of ideas,” and places to express “beliefs without barriers,” have really been growing a new red army aimed directly at free speech principles. But your voice can stop them: Trustworthy services do not secretly suppress their users.
How it works
Savvy internet users already know of the shadowban, where a service hides a user’s content from everyone except that person. However, services can also shadow remove individual comments. You might receive replies to some content while other commentary appears to fall flat. Such lonely, childless comments may truly have been uninteresting, or they may have been secretly suppressed. Moreover, many moderators suppress significantly more content than they admit.
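To make the mechanics concrete, here is a minimal sketch of shadow-removal visibility logic in Python. It is illustrative only; the class and function names are hypothetical and do not come from any platform’s actual code.

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class Comment:
    author: str
    body: str
    removed: bool = False  # set by a moderator; the author is never notified

def visible_to(comment: Comment, viewer: Optional[str]) -> bool:
    """A shadow-removed comment still renders for its author,
    but is hidden from every other viewer, including logged-out ones."""
    if not comment.removed:
        return True
    return viewer == comment.author

c = Comment(author="alice", body="my two cents", removed=True)
print(visible_to(c, "alice"))  # True: the author sees nothing amiss
print(visible_to(c, "bob"))    # False: everyone else sees nothing at all
```

The author’s view and the public view silently diverge, which is why such comments “fall flat”: nobody else ever saw them.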
Reddit had an estimated 74,260 moderators in 2017, according to the developer of Reddit’s most popular moderation tool, the Moderator Toolbox. Crucially, all moderator-removed comments on Reddit work the same way as YouTube’s.
As a result, removed content appears in the recent history of over 50% of Reddit users, and most users are still in the dark. Reveddit, a free site I built to show people their secretly removed Reddit content, sees over 300,000 monthly active users. Yet that is still only a fraction of Reddit’s 430 million monthly users.
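In principle, a tool like Reveddit can reveal such removals by comparing two views of the same history: what the author sees while logged in versus what an anonymous request returns. The sketch below simulates that comparison with an in-memory store; the function names and data layout are hypothetical, not Reveddit’s actual implementation.

```python
def fetch_visible(comment_ids, viewer, store):
    """Return the subset of comment_ids this viewer can see.
    `store` maps comment id -> (author, removed_flag)."""
    visible = set()
    for cid in comment_ids:
        author, removed = store[cid]
        if not removed or viewer == author:  # the shadow-removal rule
            visible.add(cid)
    return visible

def find_shadow_removed(author, comment_ids, store):
    """Comments the author can see that an anonymous viewer cannot
    were removed without notice."""
    authors_view = fetch_visible(comment_ids, author, store)
    public_view = fetch_visible(comment_ids, None, store)
    return authors_view - public_view

store = {"c1": ("alice", False), "c2": ("alice", True)}
print(find_shadow_removed("alice", store.keys(), store))  # {'c2'}
```

The design choice mirrors the real-world detection method: no privileged access is needed, only the discrepancy between the two views.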
On Facebook, users can “hide comment.” TikTok can make content “visible to self.” Truth Social does it, and there is even an MIT Press textbook from 2011 that promotes “several ways to disguise a gag or ban.” Twitter also continues to action content without disclosure. Such actions are unlikely to be discovered by low-profile accounts since they lack the followers to alert them.
Social media services attract volunteer moderators with ease. After all, the weapons of secret suppression are “free”; one need only commit time. Then moderators can influence discussions of guns, abortion, gender, or even preferred political candidates. Plus, a former Reddit CEO says everyone does it. So why shouldn’t we all?
In contrast to what the Internet’s red army says, trolls and bots benefit when services secretly suppress users. Trolls and bots are persistent actors who are far more likely to discover and exploit secretive tooling. Trolls justify their own mischief by pointing to mischief elsewhere, a sort of eye-for-an-eye mentality. And bots are just automated trolls or advertisers, none of whom are fooled by secrecy. Yet secretive services overwhelmingly dupe good-faith users.
If anyone understood the harms of secret suppression, it was Aleksandr Solzhenitsyn. He survived the Soviet Union’s forced labor camps, where millions died. In his epic The Gulag Archipelago, he argued that the Soviet security apparatus based all of its arrests on Article 58, which forbade “anti-Soviet agitation and propaganda.” Police relied on this anti-free-speech code to imprison whomever they wished on false charges. Article 58 thus removed truth and transparency from the justice system.
Similarly, many of today’s moderators favor advanced enforcement tools that rely upon secrecy. These tools enable a forum’s leaders to conjure up an agreeable consensus for their audience. That “consensus” makes the group appear strong and popular. The popularity then attracts more subscribers, both supporters and critics alike. Moderators must work overtime to maintain a copacetic appearance; otherwise, they risk losing what they built. Finally, subscribers who challenge the secrecy are suppressed by group leaders, and group leaders discourage members from promoting ideas shared by outsiders.
This may all be normal behavior as people strive to influence each other. However, new communications tech always presents a unique challenge: Early adopters wield significantly more influence than late entrants. Then, power players like evangelists and “tyrants” can enslave others in ways that were thought to exist only in history books.
One could thank these early power players. The harms were going to appear sooner or later anyway, and their efforts help us discover the harms sooner.
But we have a role to play too. Left unchallenged, a censorial Internet enslaves users and moderators alike, turning them into ideological warriors. Ironically, we have promoted communications systems that do not foster good communication. But good communication is essential to well-functioning communities, so we must do something. The question is what to do or say. Unfortunately, the easy route of framing the problem as “us versus them” would merely strengthen existing power players.
Even Solzhenitsyn could not draw a clear line between his oppressors and the oppressed. Instead he wrote, “the line dividing good and evil cuts through the heart of every human being.” Put another way, we are each equally vulnerable, and we must guard against the temptation to control things we cannot. True progress comes when we acknowledge each of us is capable of wrongdoing.
These days, services de-amplify you for being disagreeable. We have not yet gotten to the heart of why we silo ourselves online. Nor do we understand why protesters increasingly resort to heckler’s-veto violence, or why we must replicate services to keep people friendly.
Social media sites claim to be our new public square. However, secret suppression represents indecision, not leadership. It is an attempt to appease all parties that reduces everything to chaos.
Fortunately, chaos is inherently weak against truth. Article 58 required complete secrecy about conditions in the Gulags as well as an army of subdued enforcers. Solzhenitsyn left a warning:
We have to condemn publicly the very idea that some people have the right to repress others. In keeping silent about evil, in burying it so deep within us that no sign of it appears on the surface, we are implanting it, and it will rise up a thousandfold in the future.
We must publicly condemn secret suppression and repel defenses of it. A moderator of r/AgainstHateSubreddits recently argued that my censorship-revealing website “enables interference with moderation of the site.” In other words, he does not want you to know when he censors you. I pushed back and he ran out of things to say. I seek out these conversations all the time. Those who cannot coherently argue may try to otherwise discredit you. But the truth is on your side.
We must also push back against the popular viewpoint that some cases require secret suppression. That is the status quo. Without exception, trustworthy communication services do not secretly suppress their users.
A well-known disinformation researcher, Renée DiResta, has argued more than once that you can both “inform users” and still secretly suppress them, and I’ve seen moderators declare the same. But hiding and disclosing information are inherently contradictory. Still, such misrepresentations can become popular “wisdom” if we fail to challenge them.
Secret suppression divides us. The average person does not imagine it exists. Similar to how my young daughter consumes bread, the practice eats away at the middle. It leaves behind disconnected fringes, both within ourselves and in wider society. In this vacuous environment, propaganda thrives.
While the Soviet Union got “all worked up,” as Solzhenitsyn put it, about what took place in Nazi Germany, it refused to address its own abuses. That would be “digging up the past.” Sadly, one can say the same of the West today. We regularly criticize censorship in Russia and China while ignoring our own faults. And it proves unsatisfying: we still fight amongst ourselves!
Solzhenitsyn’s book demolished the moral credibility of Communism in the West, according to Dr. Jordan B. Peterson. We can do the same to secretive services simply by telling the truth about secret suppression. When people violate the rules, teach them. With transparent moderation, users become part of the solution and have more respect for moderators.
We need not take up arms against each other. Instead, we can redress our own issues by talking about them. “Truth has value just because it’s true,” says Johnny Sanders, a licensed professional counselor. The truth is reliable, so let’s use it.