Richard Hill — The Curse of Concentration (Review of Cory Doctorow, How to Destroy Surveillance Capitalism)

a review of Cory Doctorow, How to Destroy Surveillance Capitalism (OneZero, 2021)

by Richard Hill


This short online (free access) book provides a highly readable, inspiring, and powerful complement to Shoshana Zuboff’s The Age of Surveillance Capitalism (which the author qualifies and to some extent criticizes) and Timothy Wu’s The Curse of Bigness. It could be sub-titled (paraphrasing Maistre) “every nation gets the economic system it deserves,” in this case a symbiosis of corporate surveillance and state surveillance, in an economy dominated by, and potentially controlled by, a handful of companies. As documented elsewhere, that symbiosis is not an accident or coincidence. As the author puts the matter: “We need to take down Big Tech, and to do that, we need to start by correctly identifying the problem.”

What follows is my analysis of the ideas of the book: it does not follow the order in which the ideas are presented in the book. In a nutshell, the author describes the source of the problem: an advertising-based revenue model that requires ever-increasing amounts of data, and thus ever-increasing concentration, coupled with weak anti-trust enforcement, and, worse, government actions that deliberately or inadvertently favor the power of dominant companies. The author describes (as have others) the negative effects this has had for privacy (which, as the author says, “is necessary for human progress”) and democracy; and proposes some solutions: strong antitrust, but also a relatively new idea – imposed interoperability. I will summarize these themes in the order given above.

However, I will first summarize four important observations that underpin the issues outlined above. The first is that the Internet (and information and communications technologies (ICT) in general) is everything. As the author puts it: “The upshot of this is that our best hope of solving the big coordination problems — climate change, inequality, etc. — is with free, fair, and open tech.”

The second is that data and information are increasingly important (see for example the Annex of this submission), and don’t fit well into existing private property regimes (see also here and here), in particular because of the way Big Tech currently treats them: “Big Tech has a funny relationship with information. When you’re generating information — anything from the location data streaming off your mobile device to the private messages you send to friends on a social network — it claims the rights to make unlimited use of that data. But when you have the audacity to turn the tables — to use a tool that blocks ads or slurps your waiting updates out of a social network and puts them in another app that lets you set your own priorities and suggestions or crawls their system to allow you to start a rival business — they claim that you’re stealing from them.”

The third is that the time has come to reject the notion that ICTs, the Internet, and the companies that dominate those industries (“Big Tech”) are somehow different from everything else and should be treated differently: “I think tech is just another industry, albeit one that grew up in the absence of real monopoly constraints. It may have been first, but it isn’t the worst nor will it be the last.”

The fourth is that network effects favor concentration: “A decentralization movement has tried to erode the dominance of Facebook and other Big Tech companies by fielding ‘indieweb’ alternatives – Mastodon as a Twitter alternative, Diaspora as a Facebook alternative, etc. – but these efforts have failed to attain any kind of liftoff. Fundamentally, each of these services is hamstrung by the same problem: every potential user for a Facebook or Twitter alternative has to convince all their friends to follow them to a decentralized web alternative in order to continue to realize the benefit of social media. For many of us, the only reason to have a Facebook account is that our friends have Facebook accounts, and the reason they have Facebook accounts is that we have Facebook accounts.”

Turning to the main ideas of the book, the first is that the current business model is based on advertising: “ad-driven Big Tech’s customers are advertisers, and what companies like Google and Facebook sell is their ability to convince you to buy stuff. Big Tech’s product is persuasion. The services — social media, search engines, maps, messaging, and more — are delivery systems for persuasion. Rather than finding ways to bypass our rational faculties, surveillance capitalists like Mark Zuckerberg mostly do one or more of three things: segment the market, attempt to deceive it, and exploit dominant positions.”

Regarding segmentation, the author states: “Facebook is tops for segmenting.” However, despite the fine targeting, its ads don’t always work: “The solution to Facebook’s ads only working one in a thousand times is for the company to try to increase how much time you spend on Facebook by a factor of a thousand. Rather than thinking of Facebook as a company that has figured out how to show you exactly the right ad in exactly the right way to get you to do what its advertisers want, think of it as a company that has figured out how to make you slog through an endless torrent of arguments even though they make you miserable, spending so much time on the site that it eventually shows you at least one ad that you respond to.”

Thus it practices a form of deception: “So Facebook has to gin up traffic by sidetracking its own forums: every time Facebook’s algorithm injects controversial materials – inflammatory political articles, conspiracy theories, outrage stories – into a group, it can hijack that group’s nominal purpose with its desultory discussions and supercharge those discussions by turning them into bitter, unproductive arguments that drag on and on. Facebook is optimized for engagement, not happiness, and it turns out that automated systems are pretty good at figuring out things that people will get angry about.”

The author describes how the current level of concentration is due not only to network effects and market forces, but also to “tactics that would have been prohibited under classical, pre-Ronald-Reagan antitrust enforcement standards.”

This is compounded by the current copyright regime: “If our concern is that markets cease to function when consumers can no longer make choices, then copyright locks should concern us at least as much as influence campaigns. An influence campaign might nudge you to buy a certain brand of phone; but the copyright locks on that phone absolutely determine where you get it serviced, which apps can run on it, and when you have to throw it away rather than fixing it. Copyright locks are a double whammy: they create bad security decisions that can’t be freely investigated or discussed.”

And it is due to inadequate government intervention: “Only the most extreme market ideologues think that markets can self-regulate without state oversight. Markets need watchdogs – regulators, lawmakers, and other elements of democratic control – to keep them honest. When these watchdogs sleep on the job, then markets cease to aggregate consumer choices because those choices are constrained by illegitimate and deceptive activities that companies are able to get away with because no one is holding them to account. Many of the harms of surveillance capitalism are the result of weak or nonexistent regulation. Those regulatory vacuums spring from the power of monopolists to resist stronger regulation and to tailor what regulation exists to permit their existing businesses.”

For example, as the author documents, the penalties for leaking data are negligible, and “even the most ambitious privacy rules, such as the EU General Data Protection Regulation, fall far short of capturing the negative externalities of the platforms’ negligent over-collection and over-retention, and what penalties they do provide are not aggressively pursued by regulators.”

Yet we know that data will leak and can be used for identity theft with major consequences: “For example, attackers can use leaked username and password combinations to hijack whole fleets of commercial vehicles that have been fitted with anti-theft GPS trackers and immobilizers or to hijack baby monitors in order to terrorize toddlers with the audio tracks from pornography. Attackers use leaked data to trick phone companies into giving them your phone number, then they intercept SMS-based two-factor authentication codes in order to take over your email, bank account, and/or cryptocurrency wallets.”

But we should know what to do: “Antitrust is a market society’s steering wheel, the control of first resort to keep would-be masters of the universe in their lanes. But Bork and his cohort ripped out our steering wheel 40 years ago. The car is still barreling along, and so we’re yanking as hard as we can on all the other controls in the car as well as desperately flapping the doors and rolling the windows up and down in the hopes that one of these other controls can be repurposed to let us choose where we’re heading before we careen off a cliff. It’s like a 1960s science-fiction plot come to life: people stuck in a ‘generation ship,’ plying its way across the stars, a ship once piloted by their ancestors; and now, after a great cataclysm, the ship’s crew have forgotten that they’re in a ship at all and no longer remember where the control room is. Adrift, the ship is racing toward its extinction, and unless we can seize the controls and execute emergency course correction, we’re all headed for a fiery death in the heart of a sun.”

We know why nobody is in the control room: “The reason the world’s governments have been slow to create meaningful penalties for privacy breaches is that Big Tech’s concentration produces huge profits that can be used to lobby against those penalties – and Big Tech’s concentration means that the companies involved are able to arrive at a unified negotiating position that supercharges the lobbying.” Regarding lobbying, see for example here and here.

But it’s worse than lack of control: not only have governments failed to enforce antitrust laws, they have actively favored mass collection of data, for their own purposes: “Any hard limits on surveillance capitalism would hamstring the state’s own surveillance capability. … At least some of the states’ unwillingness to take meaningful action to curb surveillance should be attributed to this symbiotic relationship. There is no mass state surveillance without mass commercial surveillance. … Monopolism is key to the project of mass state surveillance. … A concentrated tech sector that works with authorities is a much more powerful ally in the project of mass state surveillance than a fragmented one composed of smaller actors.” The author documents how this is the case for Amazon’s Ring.

As the author says: “This mass surveillance project has been largely useless for fighting terrorism: the NSA can only point to a single minor success story in which it used its data collection program to foil an attempt by a U.S. resident to wire a few thousand dollars to an overseas terror group. It’s ineffective for much the same reason that commercial surveillance projects are largely ineffective at targeting advertising: The people who want to commit acts of terror, like people who want to buy a refrigerator, are extremely rare. If you’re trying to detect a phenomenon whose base rate is one in a million with an instrument whose accuracy is only 99%, then every true positive will come at the cost of 9,999 false positives.”
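The base-rate arithmetic behind the quoted figure can be worked through in a few lines. The sketch below (the population size and variable names are illustrative assumptions, not the author’s) shows why a 99%-accurate instrument hunting for a one-in-a-million phenomenon drowns each true positive in roughly ten thousand false ones:

```python
# Base-rate arithmetic for a 99%-accurate test applied to a
# one-in-a-million phenomenon (illustrative numbers).
population = 1_000_000
base_rate = 1 / 1_000_000   # one real case per million people
accuracy = 0.99             # the instrument is right 99% of the time

actual_positives = population * base_rate          # 1 real case
actual_negatives = population - actual_positives   # 999,999 innocents

true_positives = actual_positives * accuracy            # ~0.99 cases caught
false_positives = actual_negatives * (1 - accuracy)     # ~10,000 wrongly flagged

# Of everyone flagged, what fraction is a genuine case?
precision = true_positives / (true_positives + false_positives)

print(f"false positives per true positive: {false_positives / true_positives:.0f}")
print(f"precision: {precision:.4%}")
```

The point generalizes: no realistic improvement in the instrument’s accuracy can compensate for so rare a base rate, which is why the same math dooms both mass-surveillance terror detection and finely targeted advertising.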

And the story gets worse and worse: “In the absence of a competitive market, lawmakers have resorted to assigning expensive, state-like duties to Big Tech firms, such as automatically filtering user contributions for copyright infringement or terrorist and extremist content or detecting and preventing harassment in real time or controlling access to sexual material. These measures put a floor under how small we can make Big Tech because only the very largest companies can afford the humans and automated filters needed to perform these duties. But that’s not the only way in which making platforms responsible for policing their users undermines competition. A platform that is expected to police its users’ conduct must prevent many vital adversarial interoperability techniques lest these subvert its policing measures.”

So we get into a vicious circle: “To the extent that we are willing to let Big Tech police itself – rather than making Big Tech small enough that users can leave bad platforms for better ones and small enough that a regulation that simply puts a platform out of business will not destroy billions of users’ access to their communities and data – we build the case that Big Tech should be able to block its competitors and make it easier for Big Tech to demand legal enforcement tools to ban and punish attempts at adversarial interoperability.”

And into a long-term conundrum: “Much of what we’re doing to tame Big Tech instead of breaking up the big companies also forecloses on the possibility of breaking them up later. Yet governments confronting all of these problems all inevitably converge on the same solution: deputize the Big Tech giants to police their users and render them liable for their users’ bad actions. The drive to force Big Tech to use automated filters to block everything from copyright infringement to sex-trafficking to violent extremism means that tech companies will have to allocate hundreds of millions to run these compliance systems.” Such rules “are not just death warrants for small, upstart competitors that might challenge Big Tech’s dominance but who lack the deep pockets of established incumbents to pay for all these automated systems. Worse still, these rules put a floor under how small we can hope to make Big Tech.”

The author documents how the curse of concentration is not restricted to ICTs and the Internet. For example: “the degradation of news products long precedes the advent of ad-supported online news. Long before newspapers were online, lax antitrust enforcement had opened the door for unprecedented waves of consolidation and roll-ups in newsrooms.” However, as others have documented in detail, the current Internet advertising model has weakened conventional media, with negative effects for democracy.

Given the author’s focus on weak antitrust enforcement as the root of the problems, it’s not surprising that he sees antitrust as a solution: “Today, we’re at a crossroads where we’re trying to figure out if we want to fix the Big Tech companies that dominate our internet or if we want to fix the internet itself by unshackling it from Big Tech’s stranglehold. We can’t do both, so we have to choose. If we’re going to break Big Tech’s death grip on our digital lives, we’re going to have to fight monopolies. I believe we are on the verge of a new “ecology” moment dedicated to combating monopolies. After all, tech isn’t the only concentrated industry nor is it even the most concentrated of industries. You can find partisans for trustbusting in every sector of the economy. … First we take Facebook, then we take AT&T/WarnerMedia.”

It may be hard to break up Big Tech, but it’s worth starting to work on it: “Getting people to care about monopolies will take technological interventions that help them to see what a world free from Big Tech might look like.”

In particular, the author stresses a relatively new idea: adversarial compatibility, that is, forced interoperability: “adversarial compatibility reverses the competitive advantage: If you were allowed to compete with Facebook by providing a tool that imported all your users’ waiting Facebook messages into an environment that competed on lines that Facebook couldn’t cross, like eliminating surveillance and ads, then Facebook would be at a huge disadvantage. It would have assembled all possible ex-Facebook users into a single, easy-to-find service; it would have educated them on how a Facebook-like service worked and what its potential benefits were; and it would have provided an easy means for disgruntled Facebook users to tell their friends where they might expect better treatment. Adversarial interoperability was once the norm and a key contributor to the dynamic, vibrant tech scene, but now it is stuck behind a thicket of laws and regulations that add legal risks to the tried-and-true tactics of adversarial interoperability. New rules and new interpretations of existing rules mean that a would-be adversarial interoperator needs to steer clear of claims under copyright, terms of service, trade secrecy, tortious interference, and patent.”

In conclusion: “Ultimately, we can try to fix Big Tech by making it responsible for bad acts by its users, or we can try to fix the internet by cutting Big Tech down to size. But we can’t do both. To replace today’s giant products with pluralistic protocols, we need to clear the legal thicket that prevents adversarial interoperability so that tomorrow’s nimble, personal, small-scale products can federate themselves with giants like Facebook, allowing the users who’ve left to continue to communicate with users who haven’t left yet, reaching tendrils over Facebook’s garden wall that Facebook’s trapped users can use to scale the walls and escape to the global, open web.”

In this context, it is important to stress the counter-productive effects of e-commerce proposals being negotiated, in secret, in trade negotiations (see also here and here). The author does not mention them, perhaps because they are sufficiently secret that he is not aware of them.


Richard Hill is President of the Association for Proper Internet Governance, and was formerly a senior official at the International Telecommunication Union (ITU). He has been involved in internet governance issues since the inception of the internet and is now an activist in that area, speaking, publishing, and contributing to discussions in various forums. Among other works he is the author of The New International Telecommunication Regulations and the Internet: A Commentary and Legislative History (Springer, 2014). He writes frequently about internet governance issues for The b2o Review Digital Studies magazine.


