Facebook’s oversight body: Huge step, baby step

Anne Collier
4 min read · Jan 30, 2019


This could almost be a sidebar to what I wrote just last week about the new middle layer of user care that’s organically developing for the new conditions of today’s media environment—a layer of care that’s independent of government and corporations and lies between “the cloud” and long-established care on the ground. But this news is way too big for a sidebar! [Disclosure: In addition to the trust & safety advisories of other social media platforms, I serve on Facebook’s Safety Advisory Board but was not briefed beforehand on this announcement. The ideas expressed here are entirely my own.]

Facebook’s graphic for its just-announced Oversight Board

What Nick Clegg, Facebook’s new VP of global affairs & communications, announced this week might give you pause. It should, actually. Because what’s new about it is not just the planning of an independent “Oversight Board” for social media content moderation decisions.

Consider the global oversight part. That’s the more profound part of this development. And it deserves some reflection—which it will get, as well as discussions “in Singapore, Delhi, Nairobi, Berlin, New York, Mexico City and many more cities” over the next six months.

Giant step + baby step

I don’t think I’ll ever forget what Radiolab producer Simon Adler told On the Media’s Brooke Gladstone he took away from reporting “Post No Evil,” his in-depth story on Facebook’s content moderation challenge:

“I know everyone wants to hate Facebook…but this is an intractable problem that they have to try to solve but that they never knew they were creating. And I walked away from this reporting feeling they will inevitably fail, but they have to try, and we should all be rooting for them.”

Maybe the failure part is true, under the existing conditions of our very social media environment. I think so. But this body is a baby step toward changing those conditions: moving at least some of the decisions about users’ content to an external body of decision makers who (hopefully) will have context on the issues involved. Maybe that will help. We don’t know yet, and neither does Facebook, which is probably why it calls the charter for this new Board a “Draft Charter” and plans to take the discussion on the road. But this is the first truly holistic contribution to the new middle layer of social media user care, a layer that has so far been discussed and built out in a fairly ad hoc way, country by country, mostly by governments and NGOs, not platforms.

Key questions

The biggest question, of course, is how to set up what amounts to a new global institution. Other key ones: How does one body represent, much less serve, the whole world? How does it interface with governments? Does it only address issues not covered by national laws? What kind of support staff does it need, and is that staff part of Facebook or independent too? If it sits inside Facebook, what part of the company does it “live” in, and what rules govern its work with this external body?

This is only the beginning of a fascinating new dimension of, and discussion about, Internet safety worldwide, not to mention Internet governance. It’s necessary, courageous, pioneering work on Facebook’s part. It has some commonalities with the ideas of Prof. Gillian Hadfield and researcher Tarleton Gillespie that I shared in my last post, and it’ll be interesting to see how those and other experts’ ideas will be folded in, and how this concept evolves as Facebook learns from the discussions it plans to hold. Just setting up meaningful discussions, with multiple perspectives and stakeholders speaking many languages in the same room in each of those cities, will be a significant challenge.

Some predictions

It certainly won’t be easy for the company to find, and then hold to, the signal amid all the noise this process will trigger. As the discussion grows, I predict three things will happen (among undoubtedly many more):

  1. Cross-industry: People will want this kind of appeals process at other apps and platforms, so it necessarily becomes a cross-industry body that, for example, Twitter, Snap, Google (YouTube), Microsoft (LinkedIn and Xbox Live), possibly Amazon (Twitch), and Tencent (with its own WeChat and the many other apps/platforms it has invested in) will join and support.
  2. Simplification: Users will likely struggle to distinguish between content moderation (getting harmful content removed) and moderation appeals (appealing removal decisions), that is, between the work of “deletion centers” like Germany’s and that of the Oversight Board. They will naturally look for “one-stop shopping” where content moderation is concerned. So…
  3. Multiplication: For reasons of capacity, diversity, representation and practicality, governments and NGOs will ask Facebook to develop regional Oversight Boards (or to morph current entities, like those “deletion centers,” into regional ones).

In any case, as the Oversight Board is described now, we’ll immediately see some overlap between its work and that of the NGOs and governmental entities that already deal with social media content moderation. So it will be fascinating to watch how, together, national governments + international corporations + both national and international NGOs simplify the structure of user care worldwide.

Anne Collier is founder and executive director of The Net Safety Collaborative, home of the U.S.’s social media helpline for schools. She has been writing about youth and digital media at NetFamilyNews.org since before there were blogs and advising tech companies since 2009 (full-length bio here).
