Shoshana Zuboff is professor emerita at Harvard Business School and the author of The Age of Surveillance Capitalism (2019), a landmark work arguing that the dominant logic of digital platform capitalism is the extraction and monetization of human behavioral data to predict and modify behavior at scale. Her framework named and analyzed a phenomenon that had been widely felt but rarely theorized at book length, and it became one of the most influential accounts of what tech platforms actually do and why.
The surveillance capitalism framework
Zuboff's central claim is that companies like Google and Facebook do not merely collect data — they harvest behavioral surplus, data beyond what is needed to improve services, and use it to build predictive products sold to advertisers. The ultimate ambition, in her account, is not just prediction but behavioral modification: nudging human action toward outcomes that maximize advertiser and platform revenue. This makes surveillance capitalism not merely a privacy violation but an assault on human autonomy and the epistemic foundations of democratic society.
The framework was widely adopted in tech criticism, policy discussion, and journalism. It gave a name and analytical structure to concerns that had circulated diffusely, and it connected platform data practices to broader questions about freedom, agency, and democracy.
Doctorow's rebuttal: monopoly, not mind control
Doctorow's 2020 essay "How to Destroy Surveillance Capitalism" is explicitly framed as a response to Zuboff. His critique operates on two levels. First, he argues that Zuboff overestimates the behavioral-manipulation capabilities of platform advertising: the empirical evidence that targeted advertising actually changes behavior at scale is weak, and treating platforms as uniquely powerful behavior-modification engines grants them a mystique they have not earned.
Second, and more fundamentally, Doctorow argues that the real mechanism of harm is not mind control but monopoly. Platforms like Google and Facebook are dangerous not because their advertising is supernaturally persuasive but because they have used market power, suppression of adversarial interoperability, high switching costs, and regulatory capture to make themselves impossible to leave and impossible to compete with. The solution is therefore structural: antitrust enforcement, interoperability mandates, and the breakup of chokepoint power, rather than privacy regulation or advertising restrictions alone.
This is more than a tactical disagreement. For Doctorow, framing the problem as behavioral manipulation leads toward the wrong policy solutions: content moderation mandates, advertising restrictions, and privacy regimes that large platforms can comply with (at great expense, which entrenches their dominance) while smaller competitors cannot. The chokepoint capitalism framework, by contrast, points toward structural remedies that address the underlying concentration of power.
Points of agreement and coalition tensions
Doctorow does not dismiss Zuboff's concerns — he shares her view that surveillance capitalism is a genuine problem requiring serious response. The disagreement is diagnostic and therefore strategic. Both are critics of the same platforms; they differ on the mechanism that makes those platforms harmful and therefore on what would actually fix the problem.
This tension is representative of a broader divide in tech criticism between privacy-and-manipulation frameworks (associated with Zuboff and much of European regulatory thinking, including the GDPR) and structural-monopoly frameworks (associated with Lina Khan, Matt Stoller, Tim Wu, and Doctorow's own circle). The two camps are not enemies, and they often align on specific policy questions, but the disagreement about root cause matters for which battles to prioritize and which legislative victories actually count as wins.