
Waking Up to the Facebook Catastrophe

In the current issue of Bookforum, Ava Kofman reviews Zucked: Waking Up to the Facebook Catastrophe by Roger McNamee, an early investor in Facebook who has since become a vocal critic of the company. Among other things, the book details the privacy scandals that have dogged Facebook in recent years; these scandals have convinced McNamee that the platform isn’t just lax about privacy, but entirely premised on its flagrant violation. Yet this crucial insight about Facebook’s business model, which also applies to companies like Google and Amazon, fails to cure McNamee of his technophilia; as Kofman points out, he remains in thrall to the self-satisfied ideology of Silicon Valley. Here’s an excerpt from the review:

Part memoir, part indictment, Zucked chronicles Facebook’s history to demonstrate that its practices of “invasive surveillance, careless sharing of private data, and behavior modification in pursuit of unprecedented scale and influence,” far from being a series of accidental oversights, were in fact foundational to the company’s astronomical success. This historical approach allows McNamee to draw valuable connections between present-day troubles and the company’s philosophical source code, outlining, for instance, how Cambridge Analytica’s malfeasance was rooted in Facebook’s decision, almost a decade ago, to allow third-party developers to harvest information about our friends.

McNamee spends much of Zucked inveighing against Facebook’s business model, which preys on our emotions to expose us to targeted advertising. Our behavior on the platform may feel organic, but it’s actually orchestrated far upstream, by what is arguably the world’s most influential artificial intelligence. The goal of Facebook’s AI, as the company’s engineers have essentially admitted, is to keep us on the site for as long as possible. To do so, it shows us content that resembles whatever most engaged us in the past: usually, what has made us angry, fearful, outraged, or some combination of the three. The more engaged we are, the more we’re oozing valuable personal data and looking at ads. Even worse, McNamee emphasizes, is that it’s not just products that Facebook is selling us—it’s our own warped and turbocharged ideologies. Join one conspiracy group and Facebook might suggest joining another. Hate speech is more contagious than photos of puppies or babies, and the filter bubbles that envelop us encourage its spread.

Image of Mark Zuckerberg via Bookforum.