“Custodians of the Internet: Platforms, Content Moderation, and the Hidden Decisions That Shape Social Media”
Content moderation can serve as a prism for examining what platforms are and how they subtly torque public life. Our understanding of platforms has too blithely accepted the terms in which they were sold and celebrated: open, impartial, connective, progressive, transformative. That framing has skewed our study of the social behavior that happens on them and stunted our examination of their societal impact.
Content moderation doesn’t fit this celebratory vision. As such, it has often been treated as peripheral to what platforms do: a custodial task, like sweeping up, occasional and invisible. But what if moderation is in fact central? Moderation is an enormous part of the work of running a platform, in people, time, and cost. The labor of policing all this caustic content and abuse haunts platforms and profoundly shapes how they operate.
Today, social media platforms face mounting scrutiny in the press; specific controversies, each a tiny crisis of trust, have coalesced into a more profound interrogation of their responsibilities to users and society. What are the implications of the emerging demand that platforms serve not as conduits or arbiters, but as custodians? This is uncharted territory for the platforms: a very different notion of how they should earn the trust of their users and stand accountable to civil society.