In November 2020, amidst increased conflict, violence and human rights abuses, Indonesia’s Ministry of Communication and Information Technology issued Ministerial Regulation 5 (MR5). With MR5, the Ministry granted itself the authority to compel any individual, business entity or community that operates “electronic systems” (ESOs) to restrict or remove any content deemed to be in violation of Indonesia’s laws within 24 hours.
In February 2021, following the military coup that ousted elected leader Aung San Suu Kyi, Myanmar drafted a new “cybersecurity law” that will allow the government to order internet shutdowns, disrupt or block online services, ban service providers, intercept user accounts, access personal data of users and force the removal of any content on demand. The new law comes amidst intensifying pro-democracy protests and a violent crackdown on anti-coup protests that has resulted in deaths, mass arrests, and access to key social media networks or media outlets being restricted or blocked.
More recently, against the backdrop of a six-month-long farmers’ protest that has garnered both national and international support, the Indian government introduced the ‘Information Technology (Intermediary Guidelines and Digital Media Ethics Code) Rules, 2021’ (IT Rules 2021). The rules grant the government sweeping new powers over online content and users’ data, and dramatically expand the obligations that intermediaries must meet in order to claim immunity.
The legislative efforts in Indonesia, Myanmar, and India mark the latest attempts by countries seeking to re-align control of digital communications and platforms with the jurisdictional boundaries of nation states. They are also manifestations of the disturbing global trend of countries forcing global platforms to accept local jurisdiction over their content and policies as a way to quell dissent and criticism. In this post, I delve into the details of the three new laws: tracing their origins, exploring their similarities and expanding on their unique characteristics. The question of global platforms and local laws is complicated, and studying various national approaches helps broaden our understanding of the challenges.
NetzDG: A template for online censorship
The antecedents of all three laws lie in Germany’s “Network Enforcement Act”, or NetzDG. Adopted in 2017, the NetzDG forces social media platforms to take a more active role in removing content that is prohibited under local laws. The law requires social media platforms to create a complaint tool for users to report content and to take down “obviously unlawful” content within 24 hours of it being flagged. For more complex content removal decisions, platforms are granted a seven-day extension, after which a final decision must be reached and unlawful content blocked or deleted. Non-compliance can result in penalties of up to 50 million euros. The law also includes a reporting obligation: all platforms that receive more than 100 complaints per calendar year about unlawful content must publish bi-annual reports detailing the actions they have taken against it.
In the four years since its adoption, the German law appears to have become both the justification and the inspiration for states seeking to expand their control over online speech and digital platforms. Many of the countries introducing or discussing laws inspired by the NetzDG do not share Germany’s rule-of-law tradition, or already place strict restrictions on online information. Different governments have adopted different approaches: some have imposed obligations around specific types of content or online harms, while others have sought to address a broader range of unlawful content.
Types of Content Being Restricted
The MR5 defines unlawful information as anything that is prohibited under Indonesia’s laws and regulations. The Indian rules require intermediaries to remove content that is “threatening the security or sovereignty” of India, “friendly relations with foreign states, or public order”, “causing incitement to an offence or prevents investigation of any offence”, “defamatory”, “pornographic”, “paedophilic”, or “racially or ethnically objectionable”, among other categories. Like the German law, many of the categories of speech covered by the MR5 and the rules are already punishable under existing Indonesian and Indian laws, and both laws impose only an enforcement obligation on intermediaries.
While this is true to some extent, the types of unlawful content covered by the Indian and Indonesian laws are so broad and open-ended that enforcement can and will result in censorship. For example, the MR5 requires intermediaries to restrict content deemed to be creating “community anxiety” or “disturbance in public order”. Similarly, the Indian rules require platforms to restrict content that is “obscene”, “harmful to minors”, “invasive of another’s privacy including bodily privacy”, and “insulting or harassing on the basis of gender”.
In addition to forcing platforms to remove content prohibited under local laws, Myanmar’s cybersecurity law requires service providers to remove statements “against any existing law” and any content that is “inappropriate to Myanmar’s culture” or can “cause hatred, destroy unity and tranquillity”. Myanmar also requires intermediaries to take down lawful but problematic content like “untruthful news or rumours”, “misinformation”, and “disinformation”.
Obligations for Content Removal, Data Access and Storage
The MR5 compels everyone whose digital content is used or accessed within Indonesia to appoint a local point of contact to respond to content removal or data access orders.
ESOs are expected to respond to requests for removal of content within 24 hours and within 4 hours for “urgent” requests that relate to terrorism, child pornography, or content causing “unsettling situations for the public and disturbing public order.” If an ESO does not comply with removal requests within the time limit, it may face warnings and fines that accumulate every 24 or 4 hours (maximum of 3 times). If no action is taken against flagged content for more than 72 hours or 12 hours for urgent cases, ESOs could have their services blocked in Indonesia. The MR5 also includes obligations for ESOs to assist law enforcement agencies in their monitoring and enforcement process. Non-compliance could result in ESOs facing warnings, temporary or permanent blocking of services or having their registration revoked.
In India, the rules compel intermediaries to set up a grievance redressal mechanism for users to flag unlawful content and to appoint grievance officers to resolve user complaints within a period of one month. Intermediaries must remove or restrict access to unlawful content within 36 hours, and take action against non-consensually transmitted content that is sexual in nature (e.g., revenge porn) within 24 hours. Intermediaries are also mandated to provide information or “any assistance” to “government agencies authorised for investigative, protective or cyber security activities” within 72 hours of receiving a legal order. Additionally, they must retain removed content and associated records for 180 days for investigation purposes.
Like the NetzDG, the Indian rules distinguish between intermediaries based on their size: social media intermediaries with more than 5 million registered users are classified as ‘significant’ and face additional obligations. In addition to grievance officers, significant social media intermediaries are required to appoint India-based nodal contact persons “for 24×7 coordination with law enforcement agencies”. They must also appoint compliance officers who will be held liable if due diligence requirements are not met, and publish compliance reports detailing action taken against unlawful content.
Myanmar’s cybersecurity bill sets forth data localisation requirements for internet and telecommunications service providers (ISPs and TSPs), who are required to retain all user data, including IP address, physical address, and ID number, for 3 years and store it “at a place designated by” the government. In addition to penalties for posting unlawful content, “illegal” or “unauthorized” access to online material, “extracting, copying, downloading or destroying any data”, or failure to provide authorities with access to data when requested “under any existing law” can earn officials at non-complying companies up to 3 years in prison.
Monitoring and Filtering Obligations
The rules in India go beyond content removal obligations provided under the NetzDG and require significant social media intermediaries to proactively monitor all content on their platforms to ensure it complies with local laws. The requirement for large social media platforms to deploy automated filtering tools or other mechanisms to proactively identify and remove unlawful content will likely incentivize the over-removal of hosted content.
The monitoring obligation also creates a catch-22 for significant social media intermediaries. The conditional immunity regime that applies to intermediaries in India is built on the understanding that intermediaries do not actively moderate content and therefore lack knowledge or awareness of unlawful content. Until now, intermediaries were required to take action against unlawful content only after being notified by a court or government order. The new rules alter that arrangement: significant social media intermediaries must undertake proactive filtering or be held liable for unlawful content on their platforms. But if they do undertake proactive filtering, they will no longer qualify for the safe harbour protections granted to intermediaries that do not actively moderate content. These monitoring obligations are therefore not only impossible to comply with but also open platforms up to prosecution simply for being unable to do the impossible.
The MR5 mandates that all ESOs (except cloud providers) ensure their platforms, services and apps do not contain or facilitate the dissemination of information prohibited under Indonesian laws. To meet this obligation, ESOs must proactively monitor and filter all content on their platforms and services or have their services blocked. As noted by the Electronic Frontier Foundation (EFF), since the Ministry gets to determine what information is “prohibited”, ESOs would be “hard-pressed to proactively ensure their system does not contain unlawful content or facilitates its dissemination even before a specific takedown”. In addition to proactive monitoring, ESOs that operate platforms or services hosting user-generated content must comply with any monitoring and data access obligations specified by the Indonesian government in order to qualify for safe harbour protections.
The rules in India include provisions that introduce new powers for courts and the government to order large firms offering encrypted messaging services to trace the “first originator” of the information on their platform. In order to comply with this obligation, some messaging platforms like Signal that encrypt metadata will have to alter the way they currently operate as they do not collect enough personal information to make identification possible. Allowing law enforcement agencies to demand traceability under Section 69 of the IT Act without a judicial order massively expands the surveillance powers and reach of an executive that recently argued that, “the veil of privacy can be lifted for legitimate state interest.”
Traceability orders are limited to serious offences and cases where less intrusive means are not available. Despite these limitations, the rules include such broad and open-ended grounds for restricting speech that the traceability provision is likely to give rise to many demands. Similarly, while intermediaries are not required to disclose the contents of the message or any other information related to the first originator or their users, the traceability requirement, coupled with other data retention and access laws, can grant the government unprecedented access to data about users. Problematically, the rules note that where the first originator of any information on the computer resource of an intermediary is located outside the territory of India, anyone “within the territory of India” who retransmits that information shall be deemed to be the first originator of the information. Under the new guidelines, therefore, merely sharing a tweet or video created by someone else can land users in trouble if that content is deemed illegal.
The MR5 requires ESOs dealing with user-generated content to enable access to their “system” for monitoring by law enforcement agencies and disclose details of users who uploaded unlawful content. It is not clear how this obligation will translate for encrypted messaging platforms operating in Indonesia.
Expanding Government Control of Platforms
Myanmar’s cybersecurity law allows government authorities to conduct unspecified “interventions” for a broad range of reasons, including public order, investigating crime, and “safeguarding public life, property and public welfare.” Under the law, the cybersecurity ministry, with approval from the military junta, can temporarily prohibit any online service, shut down communication networks, seize and control devices, and ban any online service provider in Myanmar. Representatives of the junta can also “visit and check and oversee” the premises of online service providers at any time. If passed, the cybersecurity bill will consolidate the junta’s ability to conduct pervasive surveillance, curtail online expression, and cut off access to essential services.
Forcing intermediaries to appoint local officers to handle content removal or data access orders, coupled with threats of criminal prosecution, enables the Indian government to apply pressure on moderation decisions and extend its reach over user data. Similarly, the Indonesian law has created the role of the “Minister for Access Blocking” to coordinate and process requests for blocking unlawful content from various government entities, law enforcement agencies, or individual “concerned members of the public”. The ministerial post allows the Indonesian government to insert itself into the content moderation process.
The expansion of government control over digital platforms under these laws is especially worrying when viewed against the regional trend of countries introducing stringent information controls. On 16 February, the Cambodian government issued a sub-decree establishing the National Internet Gateway (NIG), a single gateway through which all domestic and international internet traffic must be routed. The gateway will enable the authoritarian regime to monitor or restrict access to the internet at will, and comes amidst a harsh crackdown on online speech that has seen citizens and political opponents threatened, harassed, arrested, and jailed for their use of the internet.
Although there is no specified time-frame for its launch, the NIG will be managed by government-appointed operators. NIG operators are mandated to support “relevant authorities” with “measures to prevent and disconnect all network connections that affect national revenue, security, social order, morality, culture, traditions and customs.” They are also required to store all connection and traffic metadata for 12 months and submit regular reports to authorities. All infrastructure, network, telecommunications or internet operators must re-route their networks through the gateway within 12 months of its launch or face heavy penalties like the suspension of their operating licenses or the freezing of their bank accounts. The law requires service providers to register users so that they can be accurately identified, making it easier for the government to monitor what its citizens post and share online.
Where do we go from here?
As I have shown above, countries are enacting NetzDG-inspired legislation to force online platforms to prioritize local laws in their content moderation and enforcement policies. As countries like Poland, Hungary and Canada consider regulation of online speech, the legislative efforts in Indonesia, Myanmar, and India highlight the detrimental effects that forcing platforms to comply with local laws can have. The broad scope of these laws and the restrictive content removal obligations they include enable governments to tighten their grip on online content. The monitoring obligations alter the conditional safe harbour regime that applies to intermediaries and will pose a major compliance challenge for them, while the traceability obligation impinges on the security and privacy of encrypted communications.
The development of these laws serves as an important reminder that the regulatory environment and media freedom in a particular country shape platform regulation. Countries that seek to quash dissent and criticism will use regulation to turn platforms into tools of censorship. These legislative efforts stem from long-standing internet governance challenges, but they also raise important new questions. Are we headed for a future where global digital platforms and services dissolve into national chapters serving the legal and contextual requirements of the countries they operate in? What happens when the quality of national legal frameworks and due process varies across countries? What happens when national laws do not comply with international human rights standards? Will there be a clash of content moderation standards?
Given the long-term and unanticipated consequences of requiring platforms to comply with legal frameworks in every country, countries like India, Myanmar and Indonesia need to think carefully about their approach to regulating online speech. Addressing harmful online content through multilateral approaches should be explored as an alternative to passing laws placing private companies in the position of setting the contours of online speech.