The Moral Economy of High-Tech Modernism – with Marion Fourcade

This short piece compares 21st-century machine learning to 19th- and 20th-century bureaucracy – we hope to write more.

While people in and around the tech industry debate whether algorithms are political at all, social scientists take the politics as a given, asking instead how this politics unfolds: how algorithms concretely govern. What we call “high-tech modernism” – the application of machine learning algorithms to organize our social, economic, and political life – has a dual logic. On the one hand, like traditional bureaucracy, it is an engine of classification, even if it categorizes people and things very differently. On the other, like the market, it provides a means of self-adjusting allocation, though its feedback loops work differently from the price system. Perhaps the most important consequence of high-tech modernism for the contemporary moral political economy is how it weaves hierarchy and data-gathering into the warp and woof of everyday life, replacing visible feedback loops with invisible ones, and suggesting that highly mediated outcomes are in fact the unmediated expression of people’s own true wishes.

Algorithms – especially machine learning algorithms – have become major social institutions. To paraphrase anthropologist Mary Douglas, algorithms “do the classifying.”1 They assemble and they sort – people, events, things. They distribute material opportunities and social prestige. But do they, like all artifacts, have a particular politics?2 Technologists defend themselves against the very notion, but a lively literature in philosophy, computer science, and law belies this naive view. Arcane technical debates rage around the translation of concepts such as fairness and democracy into code. For some, it is a matter of legal exposure. For others, it is about designing regulatory rules and verifying compliance. For a third group, it is about crafting hopeful political futures.

The questions from the social sciences are often different: How do algorithms concretely govern? How do they compare to other modes of governance, like bureaucracy or the market? How does their mediation shape moral intuitions, cultural representations, and political action? In other words, the social sciences worry not only about specific algorithmic outcomes, but also about the broad, society-wide consequences of the deployment of algorithmic regimes – systems of decision-making that rely heavily on computational processes running on large databases. These consequences are not easy to study or apprehend. This is not just because, like bureaucracies, algorithms are simultaneously rule-bound and secretive. Nor is it because, like markets, they are simultaneously empowering and manipulative. It is because they are a bit of both. Algorithms extend both the logic of hierarchy and the logic of competition. They are machines for making categories and applying them, much like traditional bureaucracy. And they are self-adjusting allocative machines, much like canonical markets.

Read the full article at MIT

Other Writing:

Essay

AI’s Big Rift is Like a Religious Schism

Two centuries ago Henri de Saint-Simon, a French utopian, proposed a new religion, worshipping the godlike force of progress, with Isaac Newton as its chief saint. He believed that humanity’s sole uniting interest, “the progress of the sciences”, should be directed by the “elect of humanity”, a 21-member “Council of Newton”. Friedrich Hayek, a 20th-century ...
Read Article
Essay

Count the Costs of Cutting Technological Ties with China

The result of all this is that policy discourse about the United States, China, and technology has careened from one pathology to another: The cheery globalism of a decade ago has given way to today’s diffuse paranoia. Now the national security conversation is almost exclusively focused on the impossible task of severing the ties of technological interdependence, with the only ...
Read Article