September 16, 2021
App Stores’ Market Power Challenged
Legal and regulatory actions are beginning to target the economic leverage of Apple’s and Google’s mobile app stores, but the cases make it clear that the conflict is primarily among competing suppliers rather than between the platforms and their users. The two app stores enjoy a duopoly derived from their incompatible mobile operating systems. Operating the app stores gives app developers access to billions of customers, but it also gives Apple and Alphabet the power to decide which apps can be made available, as well as the ability to require approved apps to hand over a cut of their revenues – up to 30% of the sales generated by the app. (There are differences between the platforms: Apple makes it technically difficult to “sideload” apps from sources other than its own App Store, whereas Android allows users to change their security settings to “Allow installation of apps from sources other than the Play Store.”)
Some major players in the software ecosystem, including wannabe competing platforms, see this as a bottleneck. South Korea’s two big Internet companies, Naver and Kakao, pushed the country’s National Assembly to pass a law on August 31 requiring Google and Apple to open their app stores to external payment systems, which would allow app developers to establish their own in-app payment systems. Smaller game developers, however, may lack the staff or resources to build a separate payment system to replace those offered by the app-store operators, a spokesman for the Korea Association of Game Industry told the Wall Street Journal.
In the U.S., a court decision in the Epic Games v. Apple antitrust lawsuit made an important contribution to this debate. The court rejected Epic’s claim that Apple was a monopolist, but agreed that Apple’s anti-steering provisions, which prevent developers whose apps are hosted on its store from notifying customers of alternative payment arrangements, constituted “anti-competitive conduct.” The decision contains a fascinating account of Epic Games’ “Project Liberty,” an organized campaign to attack Apple’s and Google’s software distribution and payment apparatuses, complete with a PR campaign and a stealth “hotfix” in Fortnite, designed to clear a path for Epic’s own alternative platform. In a finding of fact that undermines some of the more extreme claims about app stores’ monopoly status, the court defined the relevant market as “digital mobile gaming transactions.” Apple wanted the market defined more broadly as the entire gaming industry, which would include the console platforms of Microsoft, Sony and Nintendo; Epic wanted it defined more narrowly as Apple’s own internal app store economy, which would make Apple an obvious monopolist. Epic plans to appeal the ruling.
These decisions are the earliest results in what is likely to be a years-long contest between platform app stores and the legal and regulatory efforts targeting their market power. The conflict replays a common trade-off between security and competition in the tech industry. Platforms use their leverage as intermediaries to fund their operations and to vet apps for security, privacy and harm, whereas more open systems allow anyone to enter, generating more competition but potentially creating consumer protection problems and sustainability issues for the operators. Alphabet claims that its ability to make the Android OS available for free is financed in part by app store commissions.
Apple Postpones iPhone User Scanning
Apple’s plan to install surveillance software that would conduct on-device scanning of Messages and photos was stalled, but not stopped, by public opposition. On September 3, Apple issued a press release postponing the surveillance update. Its statement said: “Based on feedback from customers, advocacy groups, researchers, and others, we have decided to take additional time over the coming months to collect input and make improvements before releasing these critically important child safety features.” The wording of this notice indicates that Apple is still committed to making the changes but wants to “improve” them.
The plan clearly undermined Apple’s attempt to establish itself as a leader in privacy protection. Thousands of users reacted negatively, and the Global Encryption Coalition sponsored a protest letter that gathered the support of 90 civil society organizations. The strong reaction led to some soul-searching, but the company remains under intense pressure from governments and child safety groups to implement the scans.
There is one huge missing link in the discussion of child sexual abuse images, however: we have been unable to find any statistics correlating the number of images detected with either arrests of child predators or reductions in the amount of crime. Indeed, this correlation – so critical to the trade-off between privacy and crime – is not even part of the conversation yet.
WhatsApp’s e2ee Monitoring Regime and Consent
ProPublica published an exposé on WhatsApp’s content monitoring regime, highlighting two approaches the service uses to moderate content on the platform. Drawing on interviews with former employees, the story described how the platform reviews the metadata of private communications, and how it relies on the receiving party reporting content, which sends a copy of the decrypted content to WhatsApp for review and Terms of Service enforcement.
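To see why user reporting is compatible with end-to-end encryption, consider a minimal sketch of how a reporting flow might work. This is a hypothetical illustration, not WhatsApp’s actual code; every name and message format in it is an assumption. The key point is that the recipient’s device already holds the decrypted plaintext, so filing a report simply re-transmits that plaintext to the provider over a separate channel.

```python
import json
from dataclasses import dataclass

# Hypothetical illustration of user reporting in an e2ee messenger.
# Not WhatsApp's code: all names and formats here are assumptions.

@dataclass
class DecryptedMessage:
    sender_id: str
    timestamp: int
    plaintext: str  # already decrypted locally by the recipient's device

def build_abuse_report(reported: DecryptedMessage,
                       recent_context: list[DecryptedMessage],
                       reporter_id: str) -> bytes:
    """Bundle the reported message plus recent thread context into a
    payload the reporting client sends to the provider's moderation
    service over an ordinary TLS connection."""
    payload = {
        "reporter": reporter_id,
        "reported_sender": reported.sender_id,
        "messages": [
            {"from": m.sender_id, "ts": m.timestamp, "text": m.plaintext}
            for m in [reported, *recent_context]
        ],
    }
    return json.dumps(payload).encode("utf-8")

# Example: the recipient reports a message along with one earlier message.
report = build_abuse_report(
    DecryptedMessage("sender-123", 1631000000, "offending message"),
    [DecryptedMessage("sender-123", 1630999900, "earlier message")],
    reporter_id="recipient-456",
)
```

Nothing in this flow weakens the encryption of messages in transit; it relies entirely on one endpoint choosing to share what it has already decrypted.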
The story, which seemingly challenged the popular narrative of WhatsApp as a secure e2ee communications service, drew a range of negative reactions. Some pro-encryption advocates said it “is all unfair, misrepresentative and harmful to the cause of adopting end-to-end encryption”. Technical experts (correctly) insisted that WhatsApp is indeed e2ee between the sender and receiver; what happens after a communication is received, however, is another matter.
Recent research on content-dependent (i.e., automated content scanning) and content-oblivious (i.e., metadata monitoring and user reporting) moderation systems, which studied WhatsApp among other providers, found that user reporting is employed by more providers than any other single technique. Furthermore, user reporting was ranked more useful than any other option for three-quarters of abuse categories, including harassment, hate speech, self-harm, IP infringement, phishing/malware, mis/disinformation, terrorism, porn, and bots. Automated content scanning, which relies on matching image hashes provided by third parties, was nonetheless ranked by far the most useful technique for detecting child sexual abuse images.
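To make the hash-matching approach concrete, here is a minimal sketch; it is an illustration under stated assumptions, not any provider’s implementation. Production systems match perceptual hashes (such as Microsoft’s PhotoDNA), which survive resizing and re-encoding; an exact cryptographic hash is used below only to keep the example self-contained, and the function names and hash list are hypothetical.

```python
import hashlib

# Hypothetical sketch of automated content scanning via hash matching.
# Real deployments use *perceptual* hashes (e.g., PhotoDNA) that
# tolerate resizing and re-encoding; exact SHA-256 matching is used
# here only to keep the example self-contained.

# Block list supplied by a third party (e.g., a child-safety clearinghouse).
KNOWN_ABUSE_HASHES: set[str] = set()  # populated from the third-party feed

def matches_block_list(image_bytes: bytes) -> bool:
    """Return True if the image's hash appears on the third-party list."""
    return hashlib.sha256(image_bytes).hexdigest() in KNOWN_ABUSE_HASHES

def handle_upload(image_bytes: bytes) -> str:
    if matches_block_list(image_bytes):
        # Matched content is typically blocked and reported, not just dropped.
        return "blocked: matched third-party hash list"
    return "accepted"
```

The design choice matters for the e2ee debate: scanning of this kind can only run where the provider (or the device) can see the image bytes, which is why proposals like Apple’s move the matching step onto the user’s device.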
While we now know that user reporting is an important trust and safety technique used by platforms, how it is used raises questions. Should user reporting be considered a “content-oblivious” technique given that ostensibly private content is ultimately shared with the service provider for review? Do laws governing consent to record private communications apply to WhatsApp’s content reporting and moderation regime? How does the law governing consent interact with the service provider’s Terms of Service?