There’s an assumption making the rounds that a privacy law is the real answer to the data governance problems posed by TikTok and other social media. The logic is pretty simple: Banning TikTok won’t keep us safe. A privacy law will. This argument was articulated by Julia Angwin in a New York Times op-ed, reiterated by Rep. Ocasio-Cortez in her first TikTok video, and repeated by a growing number of people on the left end of the spectrum.
And therein lies the rub. The call for a privacy law is at its root an argument against a digital market economy. It is based on the assumption that tech giants are “preying on Americans’ data” (Angwin), and that all data brokers and data exchanges are harmful. A “privacy law” is seen as a magic wand that can be waved to “fix” all issues related to data governance by somehow turning the clock back to a pre-digital world in which no one can gather or sell data (a world that never really existed). It ignores the fact that we live in a world in which everyone has a digital device and is connected to a ubiquitous digital infrastructure that by its very nature generates digital data about everything society does. Instead of talking about data governance – i.e., how best to manage and control these data resources – it elevates an undefined notion of “privacy” to the forefront.
Our current understanding of a “privacy law,” in other words, is based on a complete misunderstanding of the political economy of data. A privacy law will not eliminate markets for digitized data, nor will it end the building of infrastructures and applications that generate data. If a law actually tried to do that, we would quickly discover that no one really wants it; it would have massive negative effects on most users of online services, and on society generally.
Navigating the vast amount of information resources on the Internet would be a hopeless mess without search engines. But how do search engines work? They exploit massive aggregations of user behavior data to match search queries to information resources. The control of spam in messaging and telephone calls relies heavily on processing lots of data about the origin and characteristics of messages. Those are only two very simple examples of the thousands of ways in which aggregating data is useful for a variety of social purposes. There is nothing inherently wrong with a trading market for data resources. While there are certainly abuses and manipulative activity, as in any market, and while user rights need to be better defined, properly institutionalized data markets can increase the quality and accessibility of data resources.
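To make the search-engine point concrete, here is a toy sketch of why aggregated behavior data is useful for ranking. This is a deliberately minimal illustration, not how any real search engine works: the query log, document names, and ranking rule are all hypothetical, and real systems combine many more signals.

```python
from collections import defaultdict

# Hypothetical click log aggregated from many users: (query, clicked_document)
# pairs. Every name here is illustrative, not drawn from any real system.
click_log = [
    ("privacy law", "gdpr_overview"),
    ("privacy law", "gdpr_overview"),
    ("privacy law", "pipl_summary"),
    ("data brokers", "broker_registry"),
]

def rank_results(query, log):
    """Rank documents for a query by how often past users clicked them."""
    counts = defaultdict(int)
    for q, doc in log:
        if q == query:
            counts[doc] += 1
    # Most-clicked documents come first.
    return sorted(counts, key=counts.get, reverse=True)

print(rank_results("privacy law", click_log))
# Prints ['gdpr_overview', 'pipl_summary']
```

The point of the sketch: no single user’s click is valuable, but the aggregate is what makes relevant results surface first. A law that blocked all such aggregation would degrade the service for everyone.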
Could the advocates of this tremendous “fix” tell us more about what this privacy law will do? Currently, tens of thousands of applications and services are free to the user because users make an in-kind bargain with those service providers: I get free services, you get to sell my eyeballs to advertisers. This kind of bargain has been around at least since the advent of free over-the-air broadcasting in the 1920s. True, the digital economy takes it to new lengths. But that bargain isn’t going away. The GDPR, for example, is a strong privacy law that many U.S. advocates hold up as a model. But social media in Europe is not all that different from social media in the U.S. The GDPR simply inserted stronger consent mechanisms into the transaction. This was on net good, albeit costly, but most users will continue to accede to this data-for-free-service bargain. There are still data brokers in Europe.
China passed a comprehensive privacy law in 2021, the Personal Information Protection Law (PIPL). It is modeled on the GDPR. Did this make everyone in China “safe”? China’s platform economy marches on, led by giants like Tencent, Alibaba, and JD, and China’s state can still surveil anyone it wants.
“Privacy laws” involve either more elaborate consent mechanisms or expanded restrictions on the collection and exchange of data. A privacy law of the sort promoted by progressives only restricts and walls off the flow of digital data among private actors. But such restrictions do not eliminate the generation of that data by our constant interactions with the digital infrastructure. And they do not eliminate the value of the data. So data will continue to be generated, and it will continue to be valuable – to businesses, public agencies, scientists, policy makers, law enforcement, and public health. As long as the data is generated, and as long as that data is valuable, there are going to be powerful incentives to harvest it and some kind of a market demand for it. We should not just be talking about privacy. We should be talking about data governance and the broader political economy of the digital ecosystem.
Some data should be protected, true. But some of it should be unprotected, open and free. Think of weather data or census data. The trick will be to protect individuals’ control over their digital identity – from government as well as private actors – while still allowing the infrastructure to capture, aggregate, and exploit the social value of that data. We need data about society to make it work better, and properly institutionalized market incentives have proven to be the best way to encourage efficient and innovative technologies and uses. Until Congress and American policy makers develop much richer and more forward-looking ideas about data governance in the digital economy, a “privacy law” is not going to do much good, and may do harm.