A Chinese law that went into effect six months ago requires online service providers to file details of the algorithms they use with China’s centralized internet regulator, the Cyberspace Administration of China (CAC). In mid-August, the CAC released a list of 30 algorithms used by companies such as Alibaba, Tencent and Douyin, the Chinese version of TikTok, along with brief descriptions of their purpose. The move reflects a similar trend toward “regulating algorithms” in legislation proposed in both the US and the EU.

The information publicly released by the CAC is not much different from what Facebook already discloses about the algorithm governing users’ Feed or how YouTube describes its recommendation algorithm. But the CAC holds more detailed data and has the authority to intervene in a top-down manner, with a great deal of discretion.

The question now is: what to do with this information?

Automated filtering and prioritization of content is an unavoidable feature of large-scale digital content-sharing platforms. Without it, the massive amount of available content, most of which is of little interest to any given user, would quickly overwhelm users. Algorithms thus provide a hyper-scale, automated version of the editorial function that traditional media performed with human gatekeepers. The algorithms also serve the platforms’ business purposes (just as traditional editorial functions do): matching ads to targeted user types, promoting engagement, and maintaining an environment that doesn’t offend or scare away users.
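To make the editorial analogy concrete, here is a deliberately toy sketch of what such a ranking function might look like. None of this reflects any actual platform’s algorithm; the fields, weights, and scoring rule are all invented for illustration.

```python
# Hypothetical feed-ranking sketch: combine topical relevance (the user's
# interest), predicted engagement (the platform's business goal), and a
# safety penalty (keeping the environment inoffensive) into one score.

def score(post, user):
    """Score one candidate post for one user. All weights are made up."""
    relevance = len(post["topics"] & user["interests"])  # topical overlap
    engagement = post["predicted_clicks"]                # business incentive
    penalty = 5.0 if post["flagged"] else 0.0            # content moderation
    return 2.0 * relevance + engagement - penalty

def rank_feed(posts, user, k=3):
    """Return the top-k posts: the automated analogue of an editor's page one."""
    return sorted(posts, key=lambda p: score(p, user), reverse=True)[:k]

user = {"interests": {"tech", "policy"}}
posts = [
    {"id": 1, "topics": {"tech"}, "predicted_clicks": 1.0, "flagged": False},
    {"id": 2, "topics": {"sports"}, "predicted_clicks": 3.0, "flagged": False},
    {"id": 3, "topics": {"tech", "policy"}, "predicted_clicks": 0.5, "flagged": False},
    {"id": 4, "topics": {"policy"}, "predicted_clicks": 9.0, "flagged": True},
]
print([p["id"] for p in rank_feed(posts, user)])  # → [4, 3, 1]
```

Even in this toy version, the politics is visible in the weights: who sets the penalty, and what gets flagged, is precisely the editorial discretion that regulators now want to supervise.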

China’s experience may provide a useful test of the simplistic idea that benevolent, impartial government regulators can be better editors than profit-motivated businesses. The CAC says it will supervise algorithms to protect workers’ and consumers’ rights and to prevent the manipulation of search results. Yet manipulating search results is a key feature of China’s public opinion management regime, and the CAC has openly identified “safeguarding ideological security” and the promotion of “positive energy” as goals of its intervention. While not all the governments seeking to regulate algorithms are authoritarian, all of them are political in nature, and the politicization of what algorithms do cannot possibly do anything but make the world a better place, we are sure.