e-flux conversations

Algorithmic Governance Is Spreading


In the Boston Review, legal scholar Frank Pasquale reflects on the dangers of “algorithmic governance,” a model in which government uses surveillance, tracking, and peer rating systems to determine social benefits for its citizens. The best-known example of this is China’s Social Credit System (SCS), which is scheduled to be fully implemented by 2020, and which tracks everything from citizens’ internet activity to whether they are sufficiently deferential to their parents. As Pasquale observes, this model is spreading to other countries and other spheres, in particular education and health care. Here’s an excerpt:

Sadly, the rush to monitor and measure goes well beyond the SCS. A global educational technology industry has pushed for behavioristic testing and ranking of students, schools, and teachers. The same monitoring technologies may also dominate hospitals, nursing homes, and daycare facilities. Wherever there is “soft” care to be done, untrammeled by the Taylorist impulse to measure and manage, methods like the SCS may spread. Reputational currency is a way to rebrand repression as rational nudging.

Yet if we are worried about failures in this circuit—from children raised on YouTube videos to teachers who cheat to get an edge on high-stakes exams—the answer is not to double down on performance-based ranking systems designed to shame supposed shirkers. Rather, it lies in paying good wages to teachers and caregivers, merging the spheres of society and economy that schemes such as the SCS split. Leaders must put their money where their mouths are, rather than guilt-tripping or penalizing their subjects into a zero-sum rat race for reputational currency.