Platform Cooperativism Resource Library


Content moderation is such a complex and laborious undertaking that it is amazing it works at all, let alone as well as it does. Moderation is hard. This should be obvious, but it is easily forgotten. Policing a major platform turns out to be a resource-intensive and relentless task; it requires making difficult and often untenable distinctions between the acceptable and the unacceptable; it is wholly unclear what the standards for moderation should be, especially on a global scale; and one failure can incur enough public outrage to overshadow a million quiet successes. And we as a society are partly to blame for having put platforms in this untenable situation by asking far too much of them. We sometimes decry the intrusions of moderators and sometimes decry their absence. Users probably should not expect platforms to be hands-off and expect them to solve problems perfectly and expect them to get with the times and expect them to be impartial and automatic.

Even so, we have handed over the power to set and enforce the boundaries of appropriate public speech to private companies. This enormous cultural power is held by a few deeply invested stakeholders, and it is wielded behind closed doors, making it difficult for anyone else to inspect or challenge their decisions. Platforms frequently, and conspicuously, fail to live up to our expectations. In fact, given the enormity of the undertaking, most platforms’ own definition of success includes failing users on a regular basis.1 Social media companies have prof

Added May 6, 2020