Thursday, April 12, 2018

Civil society and platform curation

One of the many things made more public during this week's congressional hearings with Mark Zuckerberg is the way in which the platform curates content. Zuckerberg bemoaned the reality that it's his job to decide who sees what, and when.
For those who study curation, platforms, and internet law, this is not new. I'm writing this while listening to Tarleton Gillespie discuss his forthcoming (recommended) book, Custodians of the Internet. He's describing the rules, technologies, and people that make up the "moderation apparatus" - the systems that determine who sees what information, when, and from whom. Gillespie argues that this moderation is essential to what the platforms do - it is their value proposition. This runs counter to the longstanding mythos of the open web.

One element of this "moderation apparatus" that Gillespie describes catches my eye: the role of civil society organizations and nonprofits. Big companies - Facebook, but probably not only Facebook - rely on civil society to do their dirty work.

In Myanmar, civil society groups that had been working with Facebook to take down hateful and violent postings pushed back when Zuckerberg claimed the company was doing all it could to address these issues. The groups noted that the company was essentially relying on them to voluntarily moderate the site, without providing the engineering resources needed to do so. They secured a verbal commitment from Zuckerberg to improve the process.

Here's what this means:
  • Facebook was shifting its responsibilities to civil society.
  • Civil society groups aren't equipped for, or paid for, this role. 
  • Civil society groups - by design - are fragmented and contentious. Choosing some of them to do moderation is a value-laden, editorial decision.  
  • Civil society is - from Facebook's perspective in this example - just a low-cost, outsourced labor source. The arrangement also, no doubt, shifts liability from Facebook to civil society (not least for the psychological toll on the people who moderate photos and posts about harm and violence).
Here's what I want to know:
  • How widespread are these kinds of commercial/civil society moderation/curation relationships?
  • How do they work - who is contracted for what? Who is liable for what? What recourse exists when things go wrong?
  • What do civil society groups think of this? When might it be a good solution, from civil society's perspective?
  • Some civil society groups - such as Muslim Advocates and Color Of Change - are calling for a civil rights audit of Facebook. Senator Cory Booker took this idea into the hearings. This sort of advocacy, and these accountability demands on the platforms, make more sense to me as the role of civil society - not doing the work, but demanding the work be done. Your thoughts?
It seems to me this starts to raise some really interesting questions about the roles and relationships of nonprofits, companies, and government in digital space.

