Bias, Skew and Search Engines Suffice to Explain Online Toxicity – with Cosma Shalizi

Henry Farrell and Cosma Shalizi (2024), “Bias, Skew and Search Engines Suffice to Explain Online Toxicity,” Communications of the ACM 67(4): 25-28 (preprint).

U.S. political discourse seems to have fissioned into discrete bubbles, each reflecting its own
distorted image of the world. Many blame machine-learning algorithms that purportedly maximize
“engagement” — serving up content that keeps YouTube or Facebook users watching videos or
scrolling through their feeds — for radicalizing users or strengthening their partisanship. Sociologist
Shoshana Zuboff [15] even argues that “surveillance capitalism” uses optimized algorithmic feedback
for “automated behavioral modification” at scale, writing the “music” that users then “dance” to.

There is debate over whether such algorithms in fact maximize engagement (their objective
functions also typically contain other desiderata). More recent research [3] offers an alternative
explanation, suggesting that people consume this content because they want it, independent of
the algorithm. It is impossible to tell which is right, because we cannot readily distinguish the
consequences of machine learning from users’ pre-existing proclivities. How much demand comes
from algorithms that maximize engagement or some other commercially valuable objective
function, and how much would persist if people got information some other way?
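To make the point about objective functions concrete, here is a minimal, hypothetical sketch of a multi-objective ranking score of the kind the debate concerns. The feature names and weights are illustrative assumptions, not any platform's actual objective function; the sketch only shows how an engagement term can dominate other desiderata.

```python
# Hypothetical ranking score for a feed item. Real recommender objectives are
# proprietary; the features and weights below are illustrative assumptions only.

def ranking_score(item: dict, weights: dict) -> float:
    """Combine a predicted-engagement proxy with other desiderata into one score."""
    return (
        weights["engagement"] * item["predicted_watch_time"]    # engagement proxy
        + weights["quality"] * item["predicted_quality"]        # e.g., a quality signal
        - weights["integrity"] * item["predicted_policy_risk"]  # penalty for likely policy violations
    )

# With engagement weighted heavily, a "juicy" low-quality item can still outrank
# a higher-quality one -- the pattern the engagement critique points to.
weights = {"engagement": 1.0, "quality": 0.3, "integrity": 0.5}
juicy = {"predicted_watch_time": 0.9, "predicted_quality": 0.2, "predicted_policy_risk": 0.1}
sober = {"predicted_watch_time": 0.4, "predicted_quality": 0.9, "predicted_policy_risk": 0.0}
print(ranking_score(juicy, weights), ranking_score(sober, weights))  # 0.91 vs 0.67
```

Even in this toy version, observed consumption reflects both the weights and users' underlying preferences, which is why the two are hard to disentangle from behavioral data alone.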

Even if we can’t answer this question in any definitive way, we need to do the best we can. There
are many possible interface technologies that can help organize vast distributed repositories of
knowledge and culture like the Web.

Read the full text in this preprint.
