Rightwing Twitter alternative Parler has become an important test case for how service denials by private-sector intermediaries affect free expression. Parler’s status as a refuge for ultra-rightwing voices should not get in the way of an objective, long-term appraisal of how social media content moderation and de-platforming affect all forms of political speech. Parler could just as well be an antifa or radical BLM advocacy platform; the point is that it gave expression to extreme points of view, not all of which were illegal.
Initially, civil liberties advocates (including us) were shocked at the speed and coordination with which Parler was taken down and Trump de-platformed. Now that the dust has settled, however, the charge of “private sector censorship” looks overstated. Indeed, the overall functioning of the legal and policy regime governing US platforms ends up looking pretty good in this case – especially when compared to the alternatives.
Parler will find its way back online
Parler was removed from both Google and Apple app stores in the immediate aftermath of the Capitol riots, and then was completely shut down when Amazon (AWS) withdrew its cloud hosting services. The Parler takedown did have elements of unfairness. Parler cannot be assigned primary responsibility for planning and coordinating the riots; as Parler’s website asserts, “competing platforms [note: Facebook] clearly were used for that purpose.” (If you want to know why Parler faced severe repercussions and Facebook and Twitter did not, see our previous blog post on the relationship between political power and the subject/object of content suppression.)
Parler did, however, harbor some very objectionable racist and violent content and had been warned by Amazon about it well before the riots. With legitimate fears of organized insurrection in the air, and the other major platforms cracking down on the type of people Parler appeals to, it was not unreasonable for a private actor to deny Parler use of its facilities lest it become a haven for organizing insurrection.
Parler sued Amazon to stop the shutdown but, on January 21, lost its request for a preliminary injunction. Parler’s lawsuit can be described as an attempt to force Amazon to host it. This would have violated both Amazon’s contractual/property rights and its political rights (freedom to dis-associate from what it considers to be objectionable speech or speakers).
Nevertheless, Parler is vowing to struggle back online. Its CEO says he intends to “reestablish an alternative social-media platform that not only avoids the unnecessary censorship that has become increasingly popular with its competitors, but is also more protective of public safety.” That is fine by us. It’s another voice, or set of voices, and its operators now recognize a need to moderate calls for violence. In the meantime, the Parler.com domain was transferred from DreamHost to the registrar Epik, whose CEO Robert Monster has a history of supporting the online presence of the far right. Apple CEO Tim Cook said Apple could restore Parler to the App Store if it gets its content moderation act together. And Parler has secured a Russian company to protect it from denial-of-service attacks.
Three key conclusions can be drawn from this episode.
1. The major platforms are not “the Internet”
Being kicked off Amazon’s cloud and shunned by all of Silicon Valley does not mean being kicked off the internet. Silicon Valley is a powerful gatekeeper but it does not have total control over infrastructural access. Ownership diversity and market competition mean that despite the powerful impact of shunning Parler, there are still suppliers for it to turn to. Yes, de-platforming does raise major obstacles to access and operations, but that kind of freedom to dis-associate is also part of freedom of speech and association. It is not equivalent to state action. In a liberal democracy, no one has a right to compel others to provide them with an audience.
2. Section 230 works
Social media are being simultaneously blamed for having too much control over speech and for letting speech be completely out of control. Both sides of this divide fail to appreciate the way Section 230’s immunities and distributed private-actor responsibilities walk a fine line between control and freedom.
The Section 230-based legal regime is a way of reconciling free political expression with the need to limit or remove certain kinds of speech from the public sphere. A potential threat to public safety was addressed, but not in a rigid, permanent and coercive way through state action; rather, it was handled through contractual arrangements and private ordering. The response is distributed, flexible and ongoing, just as social media content is. And if the response was too harsh, as it inevitably will be in some cases, there were still spaces where the suppressed entity could regroup and try to grow again. A state actor-driven response is going to be a lot more binary and a lot less correctible in cases of excess. Passing a law would impose a single, uniform standard, which would inevitably be applied in a way that would serve the interests of those in power and marginalize challengers.
3. International anarchy has its upside
The use of a Russian supplier shows once again that geopolitical rivalry among nation-states, which in most cases impedes and fragments the internet, can in some cases actually help to support diversity in viewpoints, as long as the internet remains globally connected. The authority of the state in the international arena is limited. In this one sense, the anarchy among nation-states creates a space for dissident ideas (as it did in the case of Edward Snowden). National security types, predictably, view this as a problem or a threat, but that is a feature, not a bug, as it limits their power. If one is truly concerned about monopoly power over speech, surely the monopoly of a state is the one to be most concerned about.
What are the alternatives?
Those unhappy with the Section 230-based response to the Capitol riots need to realistically weigh the alternatives.
Direct state regulation of content, fairness doctrines forcing “both sides” to be treated “equally,” common carrier regulations – all of these alternatives have downsides that far exceed the limitations and problems associated with the Section 230 regime. Many would be unconstitutional in the U.S., and all of them would be gamed by political actors to handicap less powerful opponents, and thoroughly abused by governments that are authoritarian or unconstrained by a First Amendment. Yochai Benkler’s call for an alternative, government-run internet modeled on the post office shows that critics of the Section 230 regime are pretty much at wit’s end.
Looked at carefully, the Parler incident illustrates the resilience and flexibility of the Section 230 regime. It is not a cause for changing it or throwing it out.
Hmm. You’re being very generous about the options available to Parler. Most analysts would conclude that Parler is screwed; and to the extent it isn’t, that’s entirely dependent on the judgement of a small number of private actors. And the regulatory alternatives to s230 aren’t limited to those you mention. Why no mention of the EU Digital Services Act or UK Online Harms law – both intended to deliver more accountability and due process without removing liability limitations?
“International anarchy has its upside?” — “Success story of the Section 230 regime?”
This article is disgusting. Clearly you, your family and your business have never faced censorship and persecution at the levels you try to articulate.
How are the Presidents of Iran and Russia allowed on Twitter while the former President of the US is banned on every platform? A sad, pathetic double standard is ruining freedoms in America and across the Internet. You don’t even mention a word about that.
This is the time we need unbiased Internet governance most, and people like you try to justify how the discrimination occurring with Section 230 is somehow good. There are murderers and rapists with Twitter accounts but not the former President or hundreds of thousands of other conservative Americans who have been “cancelled.”
If you can’t comment on or even acknowledge the political persecution and censorship occurring to conservative and non-conforming voices on the Internet, you do not need to be writing on this topic for an organization like this.
Riots in liberal cities have gone on for years in America and the only thing the media can remember is the Capitol Riots. Not even a word about Portland, Seattle, SF or any of the other cities where hundreds of millions of dollars in damage were done. BLM riots are okay?
As a citizen of Georgia, supporting member of Georgia Tech and the mission of this organization, I am disturbed this was published.
I agree 100%. This is outrageous! I have been spat at for saying the words “I don’t agree with Biden”. They claim they support free speech! This is true only when they get to choose who gets to talk.
So you really think a point of view can be illegal?
Hardly objective. “Ultra-rightwing” seems to include anyone more conservative than Joe Biden or Mitt Romney (that’s about half the country). Actually, the new “Ministry of Truth”, including the MSM, major “social platforms” and parts of the governments (fed, state and local) acting in concert, is quite overwhelming for many who just seek either to express themselves or find broader sources of information than the MSM makes available. Why are violent groups such as BLM, Antifa or CPUSA given space but not a small business selling American flag-themed products? Why is an established left-wing extremist group like the discredited SPLC given veto power over defining “hate speech”?
IMHO there is a better solution, which is that each such “provider organization” be required to make a one-time, irrevocable decision to be either (i) a platform with no control over or liability for content not provided by its employees; or (ii) an editor with legal liability for the content it provides (or fails to provide). That alternative creates a balance between the rights of “platforms” and those of “users”: legal indemnity for those actually providing platforms for free speech, and liability for those providing or controlling the content (whether as authors or as editors).
The power to force Facebook, Twitter, YouTube, etc. to allow posts by the nationalist right (civic nationalist or racial/ethnic nationalist) would also be the power to force Stormfront to host interracial porn. Mandating access at the top level of the stack is problematic, but perhaps some sort of anti-discrimination law could be used to prevent a private consensus from deplatforming sites at the hosting/DNS/CDN level for politically offensive yet still legal speech.